build based on 90616a3

This commit is contained in:
autodocs 2017-01-18 23:26:14 +00:00
parent 45391f6f86
commit 6c6bcdb846
8 changed files with 9 additions and 9 deletions


@@ -104,7 +104,7 @@ Contributing & Help
</a>
</li>
</ul>
<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/d6bc10e3be6f43cec501ff6720cc2386284d0963/docs/src/contributing.md">
<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/90616a3c5a98d2f1ea35c7ba046df0ffd8358130/docs/src/contributing.md">
<span class="fa">
</span>


@@ -107,7 +107,7 @@ Logistic Regression
</a>
</li>
</ul>
<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/d6bc10e3be6f43cec501ff6720cc2386284d0963/docs/src/examples/logreg.md">
<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/90616a3c5a98d2f1ea35c7ba046df0ffd8358130/docs/src/examples/logreg.md">
<span class="fa">
</span>


@@ -104,7 +104,7 @@ Home
</a>
</li>
</ul>
<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/d6bc10e3be6f43cec501ff6720cc2386284d0963/docs/src/index.md">
<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/90616a3c5a98d2f1ea35c7ba046df0ffd8358130/docs/src/index.md">
<span class="fa">
</span>


@@ -104,7 +104,7 @@ Internals
</a>
</li>
</ul>
<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/d6bc10e3be6f43cec501ff6720cc2386284d0963/docs/src/internals.md">
<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/90616a3c5a98d2f1ea35c7ba046df0ffd8358130/docs/src/internals.md">
<span class="fa">
</span>


@@ -133,7 +133,7 @@ First Steps
</a>
</li>
</ul>
<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/d6bc10e3be6f43cec501ff6720cc2386284d0963/docs/src/models/basics.md">
<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/90616a3c5a98d2f1ea35c7ba046df0ffd8358130/docs/src/models/basics.md">
<span class="fa">
</span>
@@ -266,7 +266,7 @@ mymodel2(x2) # [0.187935, 0.232237, 0.169824, 0.230589, 0.179414]</code></pre>
Affine(5, 5), σ,
Affine(5, 5), softmax)</code></pre>
<p>
-You now know understand enough to take a look at the
+You now know enough to take a look at the
<a href="../examples/logreg.html">
logistic regression
</a>
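
The basics.md hunk above carries the one prose change in this commit (dropping the stray "understand"); its surrounding context shows the Chain example that page walks through. As a quick reference, here is a minimal runnable sketch of that example, assuming the old Flux API the page describes (Affine, σ, softmax, Chain); the 5-element input is an illustrative placeholder.

# Minimal sketch of the "Combining Models" example from the basics page,
# assuming the old Flux API shown above (Affine, σ, softmax, Chain).
using Flux

affine1 = Affine(5, 5)
affine2 = Affine(5, 5)

# Explicit composition: each layer's output feeds the next.
mymodel1(x) = softmax(affine2(σ(affine1(x))))

# Chain simply calls the given functions in sequence, so this is equivalent.
mymodel2 = Chain(affine1, σ, affine2, softmax)

# Layers can also be defined inline, with no predefined affine1/affine2.
mymodel3 = Chain(
  Affine(5, 5), σ,
  Affine(5, 5), softmax)

x = rand(5)                 # illustrative 5-element input
mymodel1(x) == mymodel2(x)  # same sequence of calls, same output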


@@ -107,7 +107,7 @@ Debugging
</a>
</li>
</ul>
<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/d6bc10e3be6f43cec501ff6720cc2386284d0963/docs/src/models/debugging.md">
<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/90616a3c5a98d2f1ea35c7ba046df0ffd8358130/docs/src/models/debugging.md">
<span class="fa">
</span>


@@ -107,7 +107,7 @@ Recurrence
</a>
</li>
</ul>
<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/d6bc10e3be6f43cec501ff6720cc2386284d0963/docs/src/models/recurrent.md">
<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/90616a3c5a98d2f1ea35c7ba046df0ffd8358130/docs/src/models/recurrent.md">
<span class="fa">
</span>


@@ -53,7 +53,7 @@ var documenterSearchIndex = {"docs": [
"page": "First Steps",
"title": "Combining Models",
"category": "section",
"text": "... Inflating Graviton Zeppelins ...A more complex model usually involves many basic layers like affine, where we use the output of one layer as the input to the next:mymodel1(x) = softmax(affine2(σ(affine1(x))))\nmymodel1(x1) # [0.187935, 0.232237, 0.169824, 0.230589, 0.179414]This syntax is again a little unwieldy for larger networks, so Flux provides another template of sorts to create the function for us:mymodel2 = Chain(affine1, σ, affine2, softmax)\nmymodel2(x2) # [0.187935, 0.232237, 0.169824, 0.230589, 0.179414]mymodel2 is exactly equivalent to mymodel1 because it simply calls the provided functions in sequence. We don't have to predefine the affine layers and can also write this as:mymodel3 = Chain(\n Affine(5, 5), σ,\n Affine(5, 5), softmax)You now know understand enough to take a look at the logistic regression example, if you haven't already."
"text": "... Inflating Graviton Zeppelins ...A more complex model usually involves many basic layers like affine, where we use the output of one layer as the input to the next:mymodel1(x) = softmax(affine2(σ(affine1(x))))\nmymodel1(x1) # [0.187935, 0.232237, 0.169824, 0.230589, 0.179414]This syntax is again a little unwieldy for larger networks, so Flux provides another template of sorts to create the function for us:mymodel2 = Chain(affine1, σ, affine2, softmax)\nmymodel2(x2) # [0.187935, 0.232237, 0.169824, 0.230589, 0.179414]mymodel2 is exactly equivalent to mymodel1 because it simply calls the provided functions in sequence. We don't have to predefine the affine layers and can also write this as:mymodel3 = Chain(\n Affine(5, 5), σ,\n Affine(5, 5), softmax)You now know enough to take a look at the logistic regression example, if you haven't already."
},
{