build based on f5ff563

Author: autodocs
Date:   2017-03-04 14:07:25 +00:00
Parent: 4470a9cace
Commit: 14414ac216

13 changed files with 14 additions and 14 deletions

@@ -150,7 +150,7 @@ Backends
 </a>
 </li>
 </ul>
-<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/304644ed0ec394879126f65f73a6a11ee5fd094a/docs/src/apis/backends.md">
+<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/f5ff563f0a75c95dfb001e97a7567a47ed9aeeea/docs/src/apis/backends.md">
 <span class="fa">
 </span>

@@ -155,7 +155,7 @@ Batching
 </a>
 </li>
 </ul>
-<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/304644ed0ec394879126f65f73a6a11ee5fd094a/docs/src/apis/batching.md">
+<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/f5ff563f0a75c95dfb001e97a7567a47ed9aeeea/docs/src/apis/batching.md">
 <span class="fa">
 </span>

@@ -139,7 +139,7 @@ Storing Models
 </a>
 </li>
 </ul>
-<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/304644ed0ec394879126f65f73a6a11ee5fd094a/docs/src/apis/storage.md">
+<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/f5ff563f0a75c95dfb001e97a7567a47ed9aeeea/docs/src/apis/storage.md">
 <span class="fa">
 </span>

@@ -136,7 +136,7 @@ Contributing &amp; Help
 </a>
 </li>
 </ul>
-<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/304644ed0ec394879126f65f73a6a11ee5fd094a/docs/src/contributing.md">
+<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/f5ff563f0a75c95dfb001e97a7567a47ed9aeeea/docs/src/contributing.md">
 <span class="fa">
 </span>

@@ -139,7 +139,7 @@ Char RNN
 </a>
 </li>
 </ul>
-<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/304644ed0ec394879126f65f73a6a11ee5fd094a/docs/src/examples/char-rnn.md">
+<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/f5ff563f0a75c95dfb001e97a7567a47ed9aeeea/docs/src/examples/char-rnn.md">
 <span class="fa">
 </span>

@@ -139,7 +139,7 @@ Logistic Regression
 </a>
 </li>
 </ul>
-<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/304644ed0ec394879126f65f73a6a11ee5fd094a/docs/src/examples/logreg.md">
+<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/f5ff563f0a75c95dfb001e97a7567a47ed9aeeea/docs/src/examples/logreg.md">
 <span class="fa">
 </span>

@@ -147,7 +147,7 @@ Home
 </a>
 </li>
 </ul>
-<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/304644ed0ec394879126f65f73a6a11ee5fd094a/docs/src/index.md">
+<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/f5ff563f0a75c95dfb001e97a7567a47ed9aeeea/docs/src/index.md">
 <span class="fa">
 </span>

@@ -136,7 +136,7 @@ Internals
 </a>
 </li>
 </ul>
-<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/304644ed0ec394879126f65f73a6a11ee5fd094a/docs/src/internals.md">
+<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/f5ff563f0a75c95dfb001e97a7567a47ed9aeeea/docs/src/internals.md">
 <span class="fa">
 </span>

@@ -155,7 +155,7 @@ Model Building Basics
 </a>
 </li>
 </ul>
-<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/304644ed0ec394879126f65f73a6a11ee5fd094a/docs/src/models/basics.md">
+<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/f5ff563f0a75c95dfb001e97a7567a47ed9aeeea/docs/src/models/basics.md">
 <span class="fa">
 </span>
@@ -229,7 +229,7 @@ softmax(affine2(x1)) # [0.125361, 0.246448, 0.21966, 0.124596, 0.283935]</code><
 <p>
 We just created two separate
 <code>Affine</code>
-layers, and each contains its own version of
+layers, and each contains its own (randomly initialised) version of
 <code>W</code>
 and
 <code>b</code>

@@ -139,7 +139,7 @@ Debugging
 </a>
 </li>
 </ul>
-<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/304644ed0ec394879126f65f73a6a11ee5fd094a/docs/src/models/debugging.md">
+<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/f5ff563f0a75c95dfb001e97a7567a47ed9aeeea/docs/src/models/debugging.md">
 <span class="fa">
 </span>

@@ -139,7 +139,7 @@ Recurrence
 </a>
 </li>
 </ul>
-<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/304644ed0ec394879126f65f73a6a11ee5fd094a/docs/src/models/recurrent.md">
+<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/f5ff563f0a75c95dfb001e97a7567a47ed9aeeea/docs/src/models/recurrent.md">
 <span class="fa">
 </span>

@@ -155,7 +155,7 @@ Model Templates
 </a>
 </li>
 </ul>
-<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/304644ed0ec394879126f65f73a6a11ee5fd094a/docs/src/models/templates.md">
+<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/f5ff563f0a75c95dfb001e97a7567a47ed9aeeea/docs/src/models/templates.md">
 <span class="fa">
 </span>

@@ -53,7 +53,7 @@ var documenterSearchIndex = {"docs": [
 "page": "Model Building Basics",
 "title": "The Model",
 "category": "section",
-"text": "... Initialising Photon Beams ...The core concept in Flux is the model. A model (or \"layer\") is simply a function with parameters. For example, in plain Julia code, we could define the following function to represent a logistic regression (or simple neural network):W = randn(3,5)\nb = randn(3)\naffine(x) = W * x + b\n\nx1 = rand(5) # [0.581466,0.606507,0.981732,0.488618,0.415414]\ny1 = softmax(affine(x1)) # [0.32676,0.0974173,0.575823]affine is simply a function which takes some vector x1 and outputs a new one y1. For example, x1 could be data from an image and y1 could be predictions about the content of that image. However, affine isn't static. It has parameters W and b, and if we tweak those parameters we'll tweak the result hopefully to make the predictions more accurate.This is all well and good, but we usually want to have more than one affine layer in our network; writing out the above definition to create new sets of parameters every time would quickly become tedious. For that reason, we want to use a template which creates these functions for us:affine1 = Affine(5, 5)\naffine2 = Affine(5, 5)\n\nsoftmax(affine1(x1)) # [0.167952, 0.186325, 0.176683, 0.238571, 0.23047]\nsoftmax(affine2(x1)) # [0.125361, 0.246448, 0.21966, 0.124596, 0.283935]We just created two separate Affine layers, and each contains its own version of W and b, leading to a different result when called with our data. It's easy to define templates like Affine ourselves (see templates), but Flux provides Affine out of the box, so we'll use that for now."
+"text": "... Initialising Photon Beams ...The core concept in Flux is the model. A model (or \"layer\") is simply a function with parameters. For example, in plain Julia code, we could define the following function to represent a logistic regression (or simple neural network):W = randn(3,5)\nb = randn(3)\naffine(x) = W * x + b\n\nx1 = rand(5) # [0.581466,0.606507,0.981732,0.488618,0.415414]\ny1 = softmax(affine(x1)) # [0.32676,0.0974173,0.575823]affine is simply a function which takes some vector x1 and outputs a new one y1. For example, x1 could be data from an image and y1 could be predictions about the content of that image. However, affine isn't static. It has parameters W and b, and if we tweak those parameters we'll tweak the result hopefully to make the predictions more accurate.This is all well and good, but we usually want to have more than one affine layer in our network; writing out the above definition to create new sets of parameters every time would quickly become tedious. For that reason, we want to use a template which creates these functions for us:affine1 = Affine(5, 5)\naffine2 = Affine(5, 5)\n\nsoftmax(affine1(x1)) # [0.167952, 0.186325, 0.176683, 0.238571, 0.23047]\nsoftmax(affine2(x1)) # [0.125361, 0.246448, 0.21966, 0.124596, 0.283935]We just created two separate Affine layers, and each contains its own (randomly initialised) version of W and b, leading to a different result when called with our data. It's easy to define templates like Affine ourselves (see templates), but Flux provides Affine out of the box, so we'll use that for now."
 },
 {
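
The substantive change in this build is the wording fix above: each Affine instance carries its own randomly initialised version of W and b, which is why affine1 and affine2 give different outputs on the same input. As a minimal sketch of that idea in plain Julia (hypothetical code for illustration only; the diff does not show Flux's actual Affine definition):

# Hypothetical sketch of an Affine-style template in plain Julia.
# Flux provides Affine out of the box; this only illustrates why two
# instances of the same template behave differently.
struct Affine
    W::Matrix{Float64}
    b::Vector{Float64}
end

# Each construction draws fresh random parameters.
Affine(in::Integer, out::Integer) = Affine(randn(out, in), randn(out))

# Calling an instance applies the affine map W*x + b.
(a::Affine)(x) = a.W * x .+ a.b

affine1 = Affine(5, 5)
affine2 = Affine(5, 5)
x1 = rand(5)
affine1(x1) == affine2(x1)  # false: each instance owns its own W and b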