commit 8c9c2d4ee0
parent b3f2240130
Author: autodocs
Date: 2017-03-09 00:26:06 +00:00

    build based on 5c548fe

14 changed files with 38 additions and 38 deletions

@@ -115,7 +115,7 @@ In Action
 <ul>
 <li>
 <a class="toctext" href="../examples/logreg.html">
-Logistic Regression
+Simple MNIST
 </a>
 </li>
 <li>
@@ -150,7 +150,7 @@ Backends
 </a>
 </li>
 </ul>
-<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/15b3ce1adad9e07367e68c8c3758380b02d6a4a1/docs/src/apis/backends.md">
+<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/5c548fe1c4c9d23651dd364648cb28fc62b08855/docs/src/apis/backends.md">
 <span class="fa">
 </span>

@@ -120,7 +120,7 @@ In Action
 <ul>
 <li>
 <a class="toctext" href="../examples/logreg.html">
-Logistic Regression
+Simple MNIST
 </a>
 </li>
 <li>
@@ -155,7 +155,7 @@ Batching
 </a>
 </li>
 </ul>
-<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/15b3ce1adad9e07367e68c8c3758380b02d6a4a1/docs/src/apis/batching.md">
+<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/5c548fe1c4c9d23651dd364648cb28fc62b08855/docs/src/apis/batching.md">
 <span class="fa">
 </span>

@@ -104,7 +104,7 @@ In Action
 <ul>
 <li>
 <a class="toctext" href="../examples/logreg.html">
-Logistic Regression
+Simple MNIST
 </a>
 </li>
 <li>
@@ -139,7 +139,7 @@ Storing Models
 </a>
 </li>
 </ul>
-<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/15b3ce1adad9e07367e68c8c3758380b02d6a4a1/docs/src/apis/storage.md">
+<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/5c548fe1c4c9d23651dd364648cb28fc62b08855/docs/src/apis/storage.md">
 <span class="fa">
 </span>
@@ -198,7 +198,7 @@ Backends
 Next
 </span>
 <span class="title">
-Logistic Regression
+Simple MNIST
 </span>
 </a>
 </footer>

@@ -103,7 +103,7 @@ In Action
 <ul>
 <li>
 <a class="toctext" href="examples/logreg.html">
-Logistic Regression
+Simple MNIST
 </a>
 </li>
 <li>
@@ -136,7 +136,7 @@ Contributing &amp; Help
 </a>
 </li>
 </ul>
-<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/15b3ce1adad9e07367e68c8c3758380b02d6a4a1/docs/src/contributing.md">
+<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/5c548fe1c4c9d23651dd364648cb28fc62b08855/docs/src/contributing.md">
 <span class="fa">
 </span>

@@ -103,7 +103,7 @@ In Action
 <ul>
 <li>
 <a class="toctext" href="logreg.html">
-Logistic Regression
+Simple MNIST
 </a>
 </li>
 <li class="current">
@@ -139,7 +139,7 @@ Char RNN
 </a>
 </li>
 </ul>
-<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/15b3ce1adad9e07367e68c8c3758380b02d6a4a1/docs/src/examples/char-rnn.md">
+<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/5c548fe1c4c9d23651dd364648cb28fc62b08855/docs/src/examples/char-rnn.md">
 <span class="fa">
 </span>
@@ -252,7 +252,7 @@ end</code></pre>
 Previous
 </span>
 <span class="title">
-Logistic Regression
+Simple MNIST
 </span>
 </a>
 <a class="next" href="../contributing.html">

@@ -4,7 +4,7 @@
 <meta charset="UTF-8"/>
 <meta name="viewport" content="width=device-width, initial-scale=1.0"/>
 <title>
-Logistic Regression · Flux
+Simple MNIST · Flux
 </title>
 <script>
 (function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
@@ -103,7 +103,7 @@ In Action
 <ul>
 <li class="current">
 <a class="toctext" href="logreg.html">
-Logistic Regression
+Simple MNIST
 </a>
 <ul class="internal"></ul>
 </li>
@@ -135,11 +135,11 @@ In Action
 </li>
 <li>
 <a href="logreg.html">
-Logistic Regression
+Simple MNIST
 </a>
 </li>
 </ul>
-<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/15b3ce1adad9e07367e68c8c3758380b02d6a4a1/docs/src/examples/logreg.md">
+<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/5c548fe1c4c9d23651dd364648cb28fc62b08855/docs/src/examples/logreg.md">
 <span class="fa">
 </span>
@@ -149,8 +149,8 @@ Logistic Regression
 <hr/>
 </header>
 <h1>
-<a class="nav-anchor" id="Logistic-Regression-with-MNIST-1" href="#Logistic-Regression-with-MNIST-1">
-Logistic Regression with MNIST
+<a class="nav-anchor" id="Recognising-MNIST-Digits-1" href="#Recognising-MNIST-Digits-1">
+Recognising MNIST Digits
 </a>
 </h1>
 <p>
@@ -196,7 +196,7 @@ Now we define our model, which will simply be a function from one to the other.
 Affine( 64), relu,
 Affine( 10), softmax)
-model = tf(m)</code></pre>
+model = mxnet(m) # Convert to MXNet</code></pre>
 <p>
 We can try this out on our data already:
 </p>

@@ -115,7 +115,7 @@ In Action
 <ul>
 <li>
 <a class="toctext" href="examples/logreg.html">
-Logistic Regression
+Simple MNIST
 </a>
 </li>
 <li>
@@ -147,7 +147,7 @@ Home
 </a>
 </li>
 </ul>
-<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/15b3ce1adad9e07367e68c8c3758380b02d6a4a1/docs/src/index.md">
+<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/5c548fe1c4c9d23651dd364648cb28fc62b08855/docs/src/index.md">
 <span class="fa">
 </span>

@@ -103,7 +103,7 @@ In Action
 <ul>
 <li>
 <a class="toctext" href="examples/logreg.html">
-Logistic Regression
+Simple MNIST
 </a>
 </li>
 <li>
@@ -136,7 +136,7 @@ Internals
 </a>
 </li>
 </ul>
-<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/15b3ce1adad9e07367e68c8c3758380b02d6a4a1/docs/src/internals.md">
+<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/5c548fe1c4c9d23651dd364648cb28fc62b08855/docs/src/internals.md">
 <span class="fa">
 </span>

@@ -120,7 +120,7 @@ In Action
 <ul>
 <li>
 <a class="toctext" href="../examples/logreg.html">
-Logistic Regression
+Simple MNIST
 </a>
 </li>
 <li>
@@ -155,7 +155,7 @@ Model Building Basics
 </a>
 </li>
 </ul>
-<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/15b3ce1adad9e07367e68c8c3758380b02d6a4a1/docs/src/models/basics.md">
+<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/5c548fe1c4c9d23651dd364648cb28fc62b08855/docs/src/models/basics.md">
 <span class="fa">
 </span>

@@ -104,7 +104,7 @@ In Action
 <ul>
 <li>
 <a class="toctext" href="../examples/logreg.html">
-Logistic Regression
+Simple MNIST
 </a>
 </li>
 <li>
@@ -139,7 +139,7 @@ Debugging
 </a>
 </li>
 </ul>
-<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/15b3ce1adad9e07367e68c8c3758380b02d6a4a1/docs/src/models/debugging.md">
+<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/5c548fe1c4c9d23651dd364648cb28fc62b08855/docs/src/models/debugging.md">
 <span class="fa">
 </span>

@@ -104,7 +104,7 @@ In Action
 <ul>
 <li>
 <a class="toctext" href="../examples/logreg.html">
-Logistic Regression
+Simple MNIST
 </a>
 </li>
 <li>
@@ -139,7 +139,7 @@ Recurrence
 </a>
 </li>
 </ul>
-<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/15b3ce1adad9e07367e68c8c3758380b02d6a4a1/docs/src/models/recurrent.md">
+<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/5c548fe1c4c9d23651dd364648cb28fc62b08855/docs/src/models/recurrent.md">
 <span class="fa">
 </span>

@@ -120,7 +120,7 @@ In Action
 <ul>
 <li>
 <a class="toctext" href="../examples/logreg.html">
-Logistic Regression
+Simple MNIST
 </a>
 </li>
 <li>
@@ -155,7 +155,7 @@ Model Templates
 </a>
 </li>
 </ul>
-<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/15b3ce1adad9e07367e68c8c3758380b02d6a4a1/docs/src/models/templates.md">
+<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/5c548fe1c4c9d23651dd364648cb28fc62b08855/docs/src/models/templates.md">
 <span class="fa">
 </span>

@@ -103,7 +103,7 @@ In Action
 <ul>
 <li>
 <a class="toctext" href="examples/logreg.html">
-Logistic Regression
+Simple MNIST
 </a>
 </li>
 <li>

@@ -234,18 +234,18 @@ var documenterSearchIndex = {"docs": [
 {
 "location": "examples/logreg.html#",
-"page": "Logistic Regression",
-"title": "Logistic Regression",
+"page": "Simple MNIST",
+"title": "Simple MNIST",
 "category": "page",
 "text": ""
 },
 {
-"location": "examples/logreg.html#Logistic-Regression-with-MNIST-1",
-"page": "Logistic Regression",
-"title": "Logistic Regression with MNIST",
+"location": "examples/logreg.html#Recognising-MNIST-Digits-1",
+"page": "Simple MNIST",
+"title": "Recognising MNIST Digits",
 "category": "section",
-"text": "This walkthrough example will take you through writing a multi-layer perceptron that classifies MNIST digits with high accuracy.First, we load the data using the MNIST package:using Flux, MNIST\n\ndata = [(trainfeatures(i), onehot(trainlabel(i), 0:9)) for i = 1:60_000]\ntrain = data[1:50_000]\ntest = data[50_001:60_000]The only Flux-specific function here is onehot, which takes a class label and turns it into a one-hot-encoded vector that we can use for training. For example:julia> onehot(:b, [:a, :b, :c])\n3-element Array{Int64,1}:\n 0\n 1\n 0Otherwise, the format of the data is simple enough, it's just a list of tuples from input to output. For example:julia> data[1]\n([0.0,0.0,0.0, … 0.0,0.0,0.0],[0,0,0,0,0,1,0,0,0,0])data[1][1] is a 28*28 == 784 length vector (mostly zeros due to the black background) and data[1][2] is its classification.Now we define our model, which will simply be a function from one to the other.m = Chain(\n Input(784),\n Affine(128), relu,\n Affine( 64), relu,\n Affine( 10), softmax)\n\nmodel = tf(m)We can try this out on our data already:julia> model(data[1][1])\n10-element Array{Float64,1}:\n 0.10614 \n 0.0850447\n 0.101474\n ...The model gives a probability of about 0.1 to each class which is a way of saying, \"I have no idea\". This isn't too surprising as we haven't shown it any data yet. This is easy to fix:Flux.train!(model, train, test, η = 1e-4)The training step takes about 5 minutes (to make it faster we can do smarter things like batching). If you run this code in Juno, you'll see a progress meter, which you can hover over to see the remaining computation time.Towards the end of the training process, Flux will have reported that the accuracy of the model is now about 90%. We can try it on our data again:10-element Array{Float32,1}:\n ...\n 5.11423f-7\n 0.9354 \n 3.1033f-5 \n 0.000127077\n ...Notice the class at 93%, suggesting our model is very confident about this image. We can use onecold to compare the true and predicted classes:julia> onecold(data[1][2], 0:9)\n5\n\njulia> onecold(model(data[1][1]), 0:9)\n5Success!"
+"text": "This walkthrough example will take you through writing a multi-layer perceptron that classifies MNIST digits with high accuracy.First, we load the data using the MNIST package:using Flux, MNIST\n\ndata = [(trainfeatures(i), onehot(trainlabel(i), 0:9)) for i = 1:60_000]\ntrain = data[1:50_000]\ntest = data[50_001:60_000]The only Flux-specific function here is onehot, which takes a class label and turns it into a one-hot-encoded vector that we can use for training. For example:julia> onehot(:b, [:a, :b, :c])\n3-element Array{Int64,1}:\n 0\n 1\n 0Otherwise, the format of the data is simple enough, it's just a list of tuples from input to output. For example:julia> data[1]\n([0.0,0.0,0.0, … 0.0,0.0,0.0],[0,0,0,0,0,1,0,0,0,0])data[1][1] is a 28*28 == 784 length vector (mostly zeros due to the black background) and data[1][2] is its classification.Now we define our model, which will simply be a function from one to the other.m = Chain(\n Input(784),\n Affine(128), relu,\n Affine( 64), relu,\n Affine( 10), softmax)\n\nmodel = mxnet(m) # Convert to MXNetWe can try this out on our data already:julia> model(data[1][1])\n10-element Array{Float64,1}:\n 0.10614 \n 0.0850447\n 0.101474\n ...The model gives a probability of about 0.1 to each class which is a way of saying, \"I have no idea\". This isn't too surprising as we haven't shown it any data yet. This is easy to fix:Flux.train!(model, train, test, η = 1e-4)The training step takes about 5 minutes (to make it faster we can do smarter things like batching). If you run this code in Juno, you'll see a progress meter, which you can hover over to see the remaining computation time.Towards the end of the training process, Flux will have reported that the accuracy of the model is now about 90%. We can try it on our data again:10-element Array{Float32,1}:\n ...\n 5.11423f-7\n 0.9354 \n 3.1033f-5 \n 0.000127077\n ...Notice the class at 93%, suggesting our model is very confident about this image. We can use onecold to compare the true and predicted classes:julia> onecold(data[1][2], 0:9)\n5\n\njulia> onecold(model(data[1][1]), 0:9)\n5Success!"
 },
 {