build based on 1dcdf66

This commit is contained in:
autodocs 2017-02-28 16:21:45 +00:00
parent a4156ffbc5
commit 77d1931aee
13 changed files with 304 additions and 15 deletions

View File

@@ -113,6 +113,11 @@ In Action
Logistic Regression
</a>
</li>
<li>
<a class="toctext" href="../examples/char-rnn.html">
Char RNN
</a>
</li>
</ul>
</li>
<li>
@@ -140,7 +145,7 @@ Backends
</a>
</li>
</ul>
<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/50d36079eb711e7ea21b0c13ed40a8132e9cccfe/docs/src/apis/backends.md">
<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/1dcdf666517a8b5e1ecf1e4bb685fa3a731cf5c0/docs/src/apis/backends.md">
<span class="fa">
</span>

View File

@@ -118,6 +118,11 @@ In Action
Logistic Regression
</a>
</li>
<li>
<a class="toctext" href="../examples/char-rnn.html">
Char RNN
</a>
</li>
</ul>
</li>
<li>
@@ -145,7 +150,7 @@ Batching
</a>
</li>
</ul>
<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/50d36079eb711e7ea21b0c13ed40a8132e9cccfe/docs/src/apis/batching.md">
<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/1dcdf666517a8b5e1ecf1e4bb685fa3a731cf5c0/docs/src/apis/batching.md">
<span class="fa">
</span>

View File

@@ -101,6 +101,11 @@ In Action
Logistic Regression
</a>
</li>
<li>
<a class="toctext" href="examples/char-rnn.html">
Char RNN
</a>
</li>
</ul>
</li>
<li class="current">
@@ -126,7 +131,7 @@ Contributing &amp; Help
</a>
</li>
</ul>
<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/50d36079eb711e7ea21b0c13ed40a8132e9cccfe/docs/src/contributing.md">
<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/1dcdf666517a8b5e1ecf1e4bb685fa3a731cf5c0/docs/src/contributing.md">
<span class="fa">
</span>
@@ -159,7 +164,15 @@ starring the repo
.
</p>
<p>
If you&#39;re interested in hacking on Flux, most of the code is pretty straightforward. Adding new layer definitions or cost functions is simple using the Flux DSL itself, and things like data utilities and training processes are all plain Julia code. The
If you&#39;re interested in hacking on Flux, most of the
<a href="https://github.com/MikeInnes/Flux.jl/tree/master/src">
code
</a>
is pretty straightforward. Adding new
<a href="https://github.com/MikeInnes/Flux.jl/tree/master/src/layers">
layer definitions
</a>
or cost functions is simple using the Flux DSL itself, and things like data utilities and training processes are all plain Julia code. The
<code>compiler</code>
directory is a bit more involved and is documented in
<a href="internals.html">
@@ -172,12 +185,12 @@ If you get stuck or need anything, let us know!
</p>
<footer>
<hr/>
<a class="previous" href="examples/logreg.html">
<a class="previous" href="examples/char-rnn.html">
<span class="direction">
Previous
</span>
<span class="title">
Logistic Regression
Char RNN
</span>
</a>
<a class="next" href="internals.html">

View File

@@ -0,0 +1,210 @@
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8"/>
<meta name="viewport" content="width=device-width, initial-scale=1.0"/>
<title>
Char RNN · Flux
</title>
<script>
(function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
(i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
})(window,document,'script','https://www.google-analytics.com/analytics.js','ga');
ga('create', 'UA-36890222-9', 'auto');
ga('send', 'pageview');
</script>
<link href="https://cdnjs.cloudflare.com/ajax/libs/normalize/4.2.0/normalize.min.css" rel="stylesheet" type="text/css"/>
<link href="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/9.5.0/styles/default.min.css" rel="stylesheet" type="text/css"/>
<link href="https://fonts.googleapis.com/css?family=Lato|Ubuntu+Mono" rel="stylesheet" type="text/css"/>
<link href="https://cdnjs.cloudflare.com/ajax/libs/font-awesome/4.6.3/css/font-awesome.min.css" rel="stylesheet" type="text/css"/>
<link href="../assets/documenter.css" rel="stylesheet" type="text/css"/>
<script>
documenterBaseURL=".."
</script>
<script src="https://cdnjs.cloudflare.com/ajax/libs/require.js/2.2.0/require.min.js" data-main="../assets/documenter.js"></script>
<script src="../../versions.js"></script>
<link href="../../flux.css" rel="stylesheet" type="text/css"/>
</head>
<body>
<nav class="toc">
<h1>
Flux
</h1>
<form class="search" action="../search.html">
<select id="version-selector" onChange="window.location.href=this.value">
<option value="#" selected="selected" disabled="disabled">
Version
</option>
</select>
<input id="search-query" name="q" type="text" placeholder="Search docs"/>
</form>
<ul>
<li>
<a class="toctext" href="../index.html">
Home
</a>
</li>
<li>
<span class="toctext">
Building Models
</span>
<ul>
<li>
<a class="toctext" href="../models/basics.html">
Model Building Basics
</a>
</li>
<li>
<a class="toctext" href="../models/templates.html">
Model Templates
</a>
</li>
<li>
<a class="toctext" href="../models/recurrent.html">
Recurrence
</a>
</li>
<li>
<a class="toctext" href="../models/debugging.html">
Debugging
</a>
</li>
</ul>
</li>
<li>
<span class="toctext">
Other APIs
</span>
<ul>
<li>
<a class="toctext" href="../apis/batching.html">
Batching
</a>
</li>
<li>
<a class="toctext" href="../apis/backends.html">
Backends
</a>
</li>
</ul>
</li>
<li>
<span class="toctext">
In Action
</span>
<ul>
<li>
<a class="toctext" href="logreg.html">
Logistic Regression
</a>
</li>
<li class="current">
<a class="toctext" href="char-rnn.html">
Char RNN
</a>
<ul class="internal"></ul>
</li>
</ul>
</li>
<li>
<a class="toctext" href="../contributing.html">
Contributing &amp; Help
</a>
</li>
<li>
<a class="toctext" href="../internals.html">
Internals
</a>
</li>
</ul>
</nav>
<article id="docs">
<header>
<nav>
<ul>
<li>
In Action
</li>
<li>
<a href="char-rnn.html">
Char RNN
</a>
</li>
</ul>
<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/1dcdf666517a8b5e1ecf1e4bb685fa3a731cf5c0/docs/src/examples/char-rnn.md">
<span class="fa">
</span>
Edit on GitHub
</a>
</nav>
<hr/>
</header>
<h1>
<a class="nav-anchor" id="Char-RNN-1" href="#Char-RNN-1">
Char RNN
</a>
</h1>
<pre><code class="language-julia">using Flux
import StatsBase: wsample
nunroll = 50
nbatch = 50
getseqs(chars, alphabet) = sequences((onehot(Float32, char, alphabet) for char in chars), nunroll)
getbatches(chars, alphabet) = batches((getseqs(part, alphabet) for part in chunk(chars, nbatch))...)
input = readstring(&quot;$(homedir())/Downloads/shakespeare_input.txt&quot;)
alphabet = unique(input)
N = length(alphabet)
Xs, Ys = getbatches(input, alphabet), getbatches(input[2:end], alphabet)
basemodel = Chain(
  Input(N),
  LSTM(N, 256),
  LSTM(256, 256),
  Affine(256, N))
model = Chain(basemodel, softmax)
m = tf(unroll(model, nunroll))
@time Flux.train!(m, Xs, Ys, η = 0.1, epoch = 1)
function sample(model, n, temp = 1)
  s = [rand(alphabet)]
  m = tf(unroll(model, 1))
  for i = 1:n
    push!(s, wsample(alphabet, softmax(m(Seq((onehot(Float32, s[end], alphabet),)))[1]./temp)))
  end
  return string(s...)
end
sample(basemodel, 100)</code></pre>
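<p>
The <code>sample</code> function above draws each character from a temperature-scaled softmax over the model&#39;s raw outputs. A minimal standalone sketch of that idea (a hypothetical helper for illustration, not part of the Flux API):
</p>
<pre><code class="language-julia"># Temperature-scaled softmax: divide the logits by `temp` before
# normalizing. temp &lt; 1 sharpens the distribution, temp &gt; 1 flattens it.
function softmax_temp(logits, temp)
  e = exp.((logits .- maximum(logits)) ./ temp)
  return e ./ sum(e)
end

softmax_temp([2.0, 1.0, 0.1], 0.5)  # sharper than plain softmax</code></pre>
<p>
As <code>temp</code> approaches zero this tends towards always picking the largest logit; for large <code>temp</code> it approaches a uniform distribution, giving more varied (and noisier) samples.
</p>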
<footer>
<hr/>
<a class="previous" href="logreg.html">
<span class="direction">
Previous
</span>
<span class="title">
Logistic Regression
</span>
</a>
<a class="next" href="../contributing.html">
<span class="direction">
Next
</span>
<span class="title">
Contributing &amp; Help
</span>
</a>
</footer>
</article>
</body>
</html>

View File

@@ -102,6 +102,11 @@ Logistic Regression
</a>
<ul class="internal"></ul>
</li>
<li>
<a class="toctext" href="char-rnn.html">
Char RNN
</a>
</li>
</ul>
</li>
<li>
@@ -129,7 +134,7 @@ Logistic Regression
</a>
</li>
</ul>
<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/50d36079eb711e7ea21b0c13ed40a8132e9cccfe/docs/src/examples/logreg.md">
<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/1dcdf666517a8b5e1ecf1e4bb685fa3a731cf5c0/docs/src/examples/logreg.md">
<span class="fa">
</span>
@@ -236,12 +241,12 @@ Previous
Backends
</span>
</a>
<a class="next" href="../contributing.html">
<a class="next" href="char-rnn.html">
<span class="direction">
Next
</span>
<span class="title">
Contributing &amp; Help
Char RNN
</span>
</a>
</footer>

View File

@@ -113,6 +113,11 @@ In Action
Logistic Regression
</a>
</li>
<li>
<a class="toctext" href="examples/char-rnn.html">
Char RNN
</a>
</li>
</ul>
</li>
<li>
@@ -137,7 +142,7 @@ Home
</a>
</li>
</ul>
<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/50d36079eb711e7ea21b0c13ed40a8132e9cccfe/docs/src/index.md">
<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/1dcdf666517a8b5e1ecf1e4bb685fa3a731cf5c0/docs/src/index.md">
<span class="fa">
</span>

View File

@@ -101,6 +101,11 @@ In Action
Logistic Regression
</a>
</li>
<li>
<a class="toctext" href="examples/char-rnn.html">
Char RNN
</a>
</li>
</ul>
</li>
<li>
@@ -126,7 +131,7 @@ Internals
</a>
</li>
</ul>
<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/50d36079eb711e7ea21b0c13ed40a8132e9cccfe/docs/src/internals.md">
<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/1dcdf666517a8b5e1ecf1e4bb685fa3a731cf5c0/docs/src/internals.md">
<span class="fa">
</span>

View File

@@ -118,6 +118,11 @@ In Action
Logistic Regression
</a>
</li>
<li>
<a class="toctext" href="../examples/char-rnn.html">
Char RNN
</a>
</li>
</ul>
</li>
<li>
@@ -145,7 +150,7 @@ Model Building Basics
</a>
</li>
</ul>
<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/50d36079eb711e7ea21b0c13ed40a8132e9cccfe/docs/src/models/basics.md">
<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/1dcdf666517a8b5e1ecf1e4bb685fa3a731cf5c0/docs/src/models/basics.md">
<span class="fa">
</span>

View File

@@ -102,6 +102,11 @@ In Action
Logistic Regression
</a>
</li>
<li>
<a class="toctext" href="../examples/char-rnn.html">
Char RNN
</a>
</li>
</ul>
</li>
<li>
@@ -129,7 +134,7 @@ Debugging
</a>
</li>
</ul>
<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/50d36079eb711e7ea21b0c13ed40a8132e9cccfe/docs/src/models/debugging.md">
<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/1dcdf666517a8b5e1ecf1e4bb685fa3a731cf5c0/docs/src/models/debugging.md">
<span class="fa">
</span>

View File

@@ -102,6 +102,11 @@ In Action
Logistic Regression
</a>
</li>
<li>
<a class="toctext" href="../examples/char-rnn.html">
Char RNN
</a>
</li>
</ul>
</li>
<li>
@@ -129,7 +134,7 @@ Recurrence
</a>
</li>
</ul>
<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/50d36079eb711e7ea21b0c13ed40a8132e9cccfe/docs/src/models/recurrent.md">
<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/1dcdf666517a8b5e1ecf1e4bb685fa3a731cf5c0/docs/src/models/recurrent.md">
<span class="fa">
</span>

View File

@@ -118,6 +118,11 @@ In Action
Logistic Regression
</a>
</li>
<li>
<a class="toctext" href="../examples/char-rnn.html">
Char RNN
</a>
</li>
</ul>
</li>
<li>
@@ -145,7 +150,7 @@ Model Templates
</a>
</li>
</ul>
<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/50d36079eb711e7ea21b0c13ed40a8132e9cccfe/docs/src/models/templates.md">
<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/1dcdf666517a8b5e1ecf1e4bb685fa3a731cf5c0/docs/src/models/templates.md">
<span class="fa">
</span>

View File

@@ -101,6 +101,11 @@ In Action
Logistic Regression
</a>
</li>
<li>
<a class="toctext" href="examples/char-rnn.html">
Char RNN
</a>
</li>
</ul>
</li>
<li>

View File

@@ -232,6 +232,22 @@ var documenterSearchIndex = {"docs": [
"text": "This walkthrough example will take you through writing a multi-layer perceptron that classifies MNIST digits with high accuracy.First, we load the data using the MNIST package:using Flux, MNIST\n\ndata = [(trainfeatures(i), onehot(trainlabel(i), 0:9)) for i = 1:60_000]\ntrain = data[1:50_000]\ntest = data[50_001:60_000]The only Flux-specific function here is onehot, which takes a class label and turns it into a one-hot-encoded vector that we can use for training. For example:julia> onehot(:b, [:a, :b, :c])\n3-element Array{Int64,1}:\n 0\n 1\n 0Otherwise, the format of the data is simple enough, it's just a list of tuples from input to output. For example:julia> data[1]\n([0.0,0.0,0.0, … 0.0,0.0,0.0],[0,0,0,0,0,1,0,0,0,0])data[1][1] is a 28*28 == 784 length vector (mostly zeros due to the black background) and data[1][2] is its classification.Now we define our model, which will simply be a function from one to the other.m = Chain(\n  Input(784),\n  Affine(128), relu,\n  Affine( 64), relu,\n  Affine( 10), softmax)\n\nmodel = tf(m)We can try this out on our data already:julia> model(data[1][1])\n10-element Array{Float64,1}:\n 0.10614 \n 0.0850447\n 0.101474\n ...The model gives a probability of about 0.1 to each class which is a way of saying, \"I have no idea\". This isn't too surprising as we haven't shown it any data yet. This is easy to fix:Flux.train!(model, train, test, η = 1e-4)The training step takes about 5 minutes (to make it faster we can do smarter things like batching). If you run this code in Juno, you'll see a progress meter, which you can hover over to see the remaining computation time.Towards the end of the training process, Flux will have reported that the accuracy of the model is now about 90%. We can try it on our data again:10-element Array{Float32,1}:\n ...\n 5.11423f-7\n 0.9354 \n 3.1033f-5 \n 0.000127077\n ...Notice the class at 93%, suggesting our model is very confident about this image. We can use onecold to compare the true and predicted classes:julia> onecold(data[1][2], 0:9)\n5\n\njulia> onecold(model(data[1][1]), 0:9)\n5Success!"
},
{
"location": "examples/char-rnn.html#",
"page": "Char RNN",
"title": "Char RNN",
"category": "page",
"text": ""
},
{
"location": "examples/char-rnn.html#Char-RNN-1",
"page": "Char RNN",
"title": "Char RNN",
"category": "section",
"text": "using Flux\nimport StatsBase: wsample\n\nnunroll = 50\nnbatch = 50\n\ngetseqs(chars, alphabet) = sequences((onehot(Float32, char, alphabet) for char in chars), nunroll)\ngetbatches(chars, alphabet) = batches((getseqs(part, alphabet) for part in chunk(chars, nbatch))...)\n\ninput = readstring(\"$(homedir())/Downloads/shakespeare_input.txt\")\nalphabet = unique(input)\nN = length(alphabet)\n\nXs, Ys = getbatches(input, alphabet), getbatches(input[2:end], alphabet)\n\nbasemodel = Chain(\n  Input(N),\n  LSTM(N, 256),\n  LSTM(256, 256),\n  Affine(256, N))\n\nmodel = Chain(basemodel, softmax)\n\nm = tf(unroll(model, nunroll))\n\n@time Flux.train!(m, Xs, Ys, η = 0.1, epoch = 1)\n\nfunction sample(model, n, temp = 1)\n  s = [rand(alphabet)]\n  m = tf(unroll(model, 1))\n  for i = 1:n\n    push!(s, wsample(alphabet, softmax(m(Seq((onehot(Float32, s[end], alphabet),)))[1]./temp)))\n  end\n  return string(s...)\nend\n\nsample(basemodel, 100)"
},
{
"location": "contributing.html#",
"page": "Contributing & Help",