build based on b2b2c20

autodocs 2017-02-21 17:19:13 +00:00
parent 6463994559
commit 4ea75ed5e4
11 changed files with 12 additions and 12 deletions

@@ -129,7 +129,7 @@ Backends
</a>
</li>
</ul>
<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/b49daa367f493907788a45c468d018f27c7bce71/docs/src/apis/backends.md">
<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/b2b2c209c69764598896b6784fa3095d244c0e85/docs/src/apis/backends.md">
<span class="fa">
</span>

@@ -145,7 +145,7 @@ Batching
</a>
</li>
</ul>
<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/b49daa367f493907788a45c468d018f27c7bce71/docs/src/apis/batching.md">
<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/b2b2c209c69764598896b6784fa3095d244c0e85/docs/src/apis/batching.md">
<span class="fa">
</span>
@@ -298,7 +298,7 @@ Right now, the
<ul>
<li>
<p>
Code can be written in a batch-agnostic way, i.e. as if working with a single data point, with batching happening independently.
Code can be written in a batch-agnostic way or be generic across batching setups. Code works with a single data point, and batching happens independently.
</p>
</li>
<li>

@@ -126,7 +126,7 @@ Contributing &amp; Help
</a>
</li>
</ul>
<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/b49daa367f493907788a45c468d018f27c7bce71/docs/src/contributing.md">
<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/b2b2c209c69764598896b6784fa3095d244c0e85/docs/src/contributing.md">
<span class="fa">
</span>

@@ -129,7 +129,7 @@ Logistic Regression
</a>
</li>
</ul>
<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/b49daa367f493907788a45c468d018f27c7bce71/docs/src/examples/logreg.md">
<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/b2b2c209c69764598896b6784fa3095d244c0e85/docs/src/examples/logreg.md">
<span class="fa">
</span>

@@ -132,7 +132,7 @@ Home
</a>
</li>
</ul>
<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/b49daa367f493907788a45c468d018f27c7bce71/docs/src/index.md">
<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/b2b2c209c69764598896b6784fa3095d244c0e85/docs/src/index.md">
<span class="fa">
</span>

@@ -126,7 +126,7 @@ Internals
</a>
</li>
</ul>
<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/b49daa367f493907788a45c468d018f27c7bce71/docs/src/internals.md">
<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/b2b2c209c69764598896b6784fa3095d244c0e85/docs/src/internals.md">
<span class="fa">
</span>

@@ -145,7 +145,7 @@ Model Building Basics
</a>
</li>
</ul>
<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/b49daa367f493907788a45c468d018f27c7bce71/docs/src/models/basics.md">
<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/b2b2c209c69764598896b6784fa3095d244c0e85/docs/src/models/basics.md">
<span class="fa">
</span>

@@ -129,7 +129,7 @@ Debugging
</a>
</li>
</ul>
<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/b49daa367f493907788a45c468d018f27c7bce71/docs/src/models/debugging.md">
<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/b2b2c209c69764598896b6784fa3095d244c0e85/docs/src/models/debugging.md">
<span class="fa">
</span>

@@ -129,7 +129,7 @@ Recurrence
</a>
</li>
</ul>
<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/b49daa367f493907788a45c468d018f27c7bce71/docs/src/models/recurrent.md">
<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/b2b2c209c69764598896b6784fa3095d244c0e85/docs/src/models/recurrent.md">
<span class="fa">
</span>

@@ -145,7 +145,7 @@ Model Templates
</a>
</li>
</ul>
<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/b49daa367f493907788a45c468d018f27c7bce71/docs/src/models/templates.md">
<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/b2b2c209c69764598896b6784fa3095d244c0e85/docs/src/models/templates.md">
<span class="fa">
</span>

@@ -173,7 +173,7 @@ var documenterSearchIndex = {"docs": [
"page": "Batching",
"title": "Future Work",
"category": "section",
"text": "The design of batching is still a fairly early work in progress, though it's used in a few places in the system. For example, all Flux models expect to be given Batch objects which are unwrapped into raw arrays for the computation. Models will convert their arguments if necessary, so it's convenient to call a model with a single data point like f([1,2,3]).Right now, the Batch or Seq types always stack along the left-most dimension. In future, this will be customisable, and Flux will provide implementations of common functions that are generic across the batch dimension. This brings the following benefits:Code can be written in a batch-agnostic way, i.e. as if working with a single data point, with batching happening independently.\nAutomatic batching can be done with correctness assured, reducing programmer errors when manipulating dimensions.\nOptimisations, like switching batch dimensions, can be expressed by the programmer with compiler support; fewer code changes are required and optimisations are guaranteed not to break the model.\nThis also opens the door for more automatic optimisations, e.g. having the compiler explore the search base of possible batching combinations."
"text": "The design of batching is still a fairly early work in progress, though it's used in a few places in the system. For example, all Flux models expect to be given Batch objects which are unwrapped into raw arrays for the computation. Models will convert their arguments if necessary, so it's convenient to call a model with a single data point like f([1,2,3]).Right now, the Batch or Seq types always stack along the left-most dimension. In future, this will be customisable, and Flux will provide implementations of common functions that are generic across the batch dimension. This brings the following benefits:Code can be written in a batch-agnostic way or be generic across batching setups. Code works with a single data point, and batching happens independently.\nAutomatic batching can be done with correctness assured, reducing programmer errors when manipulating dimensions.\nOptimisations, like switching batch dimensions, can be expressed by the programmer with compiler support; fewer code changes are required and optimisations are guaranteed not to break the model.\nThis also opens the door for more automatic optimisations, e.g. having the compiler explore the search base of possible batching combinations."
},
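The search-index text in the hunk above describes the calling convention this build documents: Flux models expect Batch objects, unwrap them to raw arrays for the computation, and convert a single data point such as f([1,2,3]) automatically. As a purely illustrative Julia sketch of that convention, not Flux's actual implementation (ToyModel and its field are hypothetical names), accepting either a single data point or a batch stacked along the left-most dimension could look like this:

# Illustrative sketch only: a toy model callable on either one data point or
# a batch of points stacked along dimension 1, mirroring the convention
# described in the batching text above. Not Flux's real implementation.
struct ToyModel
    W::Matrix{Float64}
end

# Single data point: a plain vector, like f([1,2,3]) in the text.
(m::ToyModel)(x::AbstractVector) = m.W * x

# Batch: each row is one data point, so the result keeps one row per point.
(m::ToyModel)(xs::AbstractMatrix) = permutedims(m.W * permutedims(xs))

m = ToyModel(randn(2, 3))
m([1.0, 2.0, 3.0])              # one data point  -> length-2 output
m([1.0 2.0 3.0; 4.0 5.0 6.0])   # batch of two points -> 2x2 output, one row per point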
{