build based on 8c1affd
parent fa31ba1ea3
commit b2700920bc
@@ -88,7 +88,18 @@ Batching
 <a class="toctext" href="backends.html">
 Backends
 </a>
-<ul class="internal"></ul>
+<ul class="internal">
+<li>
+<a class="toctext" href="#Basic-Usage-1">
+Basic Usage
+</a>
+</li>
+<li>
+<a class="toctext" href="#Native-Integration-1">
+Native Integration
+</a>
+</li>
+</ul>
 </li>
 </ul>
 </li>
@@ -129,7 +140,7 @@ Backends
 </a>
 </li>
 </ul>
-<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/08b67d9b76d93cf4c7ae971a4e5cf9ba07a7df69/docs/src/apis/backends.md">
+<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/8c1affd9c75844766c5433c6ef3d39dfe6324edd/docs/src/apis/backends.md">
 <span class="fa">
 
 </span>
@@ -139,12 +150,74 @@ Backends
 <hr/>
 </header>
 <h1>
-<a class="nav-anchor" id="Batching-1" href="#Batching-1">
-Batching
+<a class="nav-anchor" id="Backends-1" href="#Backends-1">
+Backends
 </a>
 </h1>
+<h2>
+<a class="nav-anchor" id="Basic-Usage-1" href="#Basic-Usage-1">
+Basic Usage
+</a>
+</h2>
+<pre><code class="language-julia">model = Chain(Affine(10, 20), σ, Affine(20, 15), softmax)
+xs = rand(10)</code></pre>
 <p>
-[WIP]
+Currently, Flux's pure-Julia backend has no optimisations. This means that calling
 </p>
+<pre><code class="language-julia">model(rand(10)) #> [0.0650, 0.0655, ...]</code></pre>
+<p>
+directly won't have great performance. In order to support a computationally intensive training process, we rely on a backend like MXNet or TensorFlow.
+</p>
+<p>
+This is easy to do. Just call either
+<code>mxnet</code>
+or
+<code>tf</code>
+on a model to convert it to a model of that kind:
+</p>
+<pre><code class="language-julia">mxmodel = mxnet(model, (10, 1))
+mxmodel(xs) #> [0.0650, 0.0655, ...]
+# or
+tfmodel = tf(model)
+tfmodel(xs) #> [0.0650, 0.0655, ...]</code></pre>
+<p>
+These new models look and feel exactly like every other model in Flux, including returning the same result when you call them, and can be trained as usual using
+<code>Flux.train!()</code>
+. The difference is that the computation is being carried out by a backend, which will usually give a large speedup.
+</p>
+<h2>
+<a class="nav-anchor" id="Native-Integration-1" href="#Native-Integration-1">
+Native Integration
+</a>
+</h2>
+<p>
+Flux aims to provide high-level APIs that work well across backends, but in some cases you may want to take advantage of features specific to a given backend. In these cases it's easy to "drop down" and use the backend's API directly, where appropriate. For example:
+</p>
+<pre><code class="language-julia">using MXNet
+Flux.loadmx()
+
+mxmodel = mx.FeedForward(model)</code></pre>
+<p>
+This returns a standard
+<code>mx.FeedForward</code>
+instance, just like you might have created using MXNet's usual API. You can then use this with MXNet's data provider implementation, custom optimisers, or distributed training processes.
+</p>
+<p>
+The same goes for TensorFlow, where it's easy to create a
+<code>Tensor</code>
+object:
+</p>
+<pre><code class="language-julia">using TensorFlow
+Flux.loadtf()
+
+x = placeholder(Float32)
+y = Tensor(model, x)</code></pre>
+<p>
+This makes it easy to take advantage of Flux's model description and debugging tools while also getting the benefit of the work put into these backends. You can check out how this looks with the integration examples
+<a href="https://github.com/MikeInnes/Flux.jl/tree/master/examples">
+here
+</a>
+.
+</p>
 <footer>
 <hr/>
@@ -145,7 +145,7 @@ Batching
 </a>
 </li>
 </ul>
-<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/08b67d9b76d93cf4c7ae971a4e5cf9ba07a7df69/docs/src/apis/batching.md">
+<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/8c1affd9c75844766c5433c6ef3d39dfe6324edd/docs/src/apis/batching.md">
 <span class="fa">
 
 </span>
@@ -126,7 +126,7 @@ Contributing & Help
 </a>
 </li>
 </ul>
-<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/08b67d9b76d93cf4c7ae971a4e5cf9ba07a7df69/docs/src/contributing.md">
+<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/8c1affd9c75844766c5433c6ef3d39dfe6324edd/docs/src/contributing.md">
 <span class="fa">
 
 </span>
@@ -129,7 +129,7 @@ Logistic Regression
 </a>
 </li>
 </ul>
-<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/08b67d9b76d93cf4c7ae971a4e5cf9ba07a7df69/docs/src/examples/logreg.md">
+<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/8c1affd9c75844766c5433c6ef3d39dfe6324edd/docs/src/examples/logreg.md">
 <span class="fa">
 
 </span>
@@ -132,7 +132,7 @@ Home
 </a>
 </li>
 </ul>
-<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/08b67d9b76d93cf4c7ae971a4e5cf9ba07a7df69/docs/src/index.md">
+<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/8c1affd9c75844766c5433c6ef3d39dfe6324edd/docs/src/index.md">
 <span class="fa">
 
 </span>
@@ -126,7 +126,7 @@ Internals
 </a>
 </li>
 </ul>
-<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/08b67d9b76d93cf4c7ae971a4e5cf9ba07a7df69/docs/src/internals.md">
+<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/8c1affd9c75844766c5433c6ef3d39dfe6324edd/docs/src/internals.md">
 <span class="fa">
 
 </span>
@@ -145,7 +145,7 @@ Model Building Basics
 </a>
 </li>
 </ul>
-<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/08b67d9b76d93cf4c7ae971a4e5cf9ba07a7df69/docs/src/models/basics.md">
+<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/8c1affd9c75844766c5433c6ef3d39dfe6324edd/docs/src/models/basics.md">
 <span class="fa">
 
 </span>
@@ -129,7 +129,7 @@ Debugging
 </a>
 </li>
 </ul>
-<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/08b67d9b76d93cf4c7ae971a4e5cf9ba07a7df69/docs/src/models/debugging.md">
+<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/8c1affd9c75844766c5433c6ef3d39dfe6324edd/docs/src/models/debugging.md">
 <span class="fa">
 
 </span>
@@ -129,7 +129,7 @@ Recurrence
 </a>
 </li>
 </ul>
-<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/08b67d9b76d93cf4c7ae971a4e5cf9ba07a7df69/docs/src/models/recurrent.md">
+<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/8c1affd9c75844766c5433c6ef3d39dfe6324edd/docs/src/models/recurrent.md">
 <span class="fa">
 
 </span>
@@ -145,7 +145,7 @@ Model Templates
 </a>
 </li>
 </ul>
-<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/08b67d9b76d93cf4c7ae971a4e5cf9ba07a7df69/docs/src/models/templates.md">
+<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/8c1affd9c75844766c5433c6ef3d39dfe6324edd/docs/src/models/templates.md">
 <span class="fa">
 
 </span>
@@ -185,11 +185,27 @@ var documenterSearchIndex = {"docs": [
 },
 
 {
-    "location": "apis/backends.html#Batching-1",
+    "location": "apis/backends.html#Backends-1",
     "page": "Backends",
-    "title": "Batching",
+    "title": "Backends",
     "category": "section",
-    "text": "[WIP]"
+    "text": ""
 },
 
+{
+    "location": "apis/backends.html#Basic-Usage-1",
+    "page": "Backends",
+    "title": "Basic Usage",
+    "category": "section",
+    "text": "model = Chain(Affine(10, 20), σ, Affine(20, 15), softmax)\nxs = rand(10)Currently, Flux's pure-Julia backend has no optimisations. This means that callingmodel(rand(10)) #> [0.0650, 0.0655, ...]directly won't have great performance. In order to support a computationally intensive training process, we rely on a backend like MXNet or TensorFlow.This is easy to do. Just call either mxnet or tf on a model to convert it to a model of that kind:mxmodel = mxnet(model, (10, 1))\nmxmodel(xs) #> [0.0650, 0.0655, ...]\n# or\ntfmodel = tf(model)\ntfmodel(xs) #> [0.0650, 0.0655, ...]These new models look and feel exactly like every other model in Flux, including returning the same result when you call them, and can be trained as usual using Flux.train!(). The difference is that the computation is being carried out by a backend, which will usually give a large speedup."
+},
+
+{
+    "location": "apis/backends.html#Native-Integration-1",
+    "page": "Backends",
+    "title": "Native Integration",
+    "category": "section",
+    "text": "Flux aims to provide high-level APIs that work well across backends, but in some cases you may want to take advantage of features specific to a given backend. In these cases it's easy to \"drop down\" and use the backend's API directly, where appropriate. For example:using MXNet\nFlux.loadmx()\n\nmxmodel = mx.FeedForward(model)This returns a standard mx.FeedForward instance, just like you might have created using MXNet's usual API. You can then use this with MXNet's data provider implementation, custom optimisers, or distributed training processes.The same goes for TensorFlow, where it's easy to create a Tensor object:using TensorFlow\nFlux.loadtf()\n\nx = placeholder(Float32)\ny = Tensor(model, x)This makes it easy to take advantage of Flux's model description and debugging tools while also getting the benefit of the work put into these backends. You can check out how this looks with the integration examples here."
+},
+
+ {
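Taken together, the snippets in the diffed page above describe a single workflow: build a model with the pure-Julia API, convert it to a backend-accelerated model, or drop down to the backend's native API. A consolidated sketch follows, assuming the era's Flux.jl API exactly as quoted in the diff (`Affine`, `σ`, `mxnet`, `tf`, `Flux.loadmx`); it is illustrative only and will not run against modern Flux.

```julia
using Flux  # assumes the Flux.jl version this documentation build describes

# 1. Describe the model with the pure-Julia API (unoptimised, fine for prototyping).
model = Chain(Affine(10, 20), σ, Affine(20, 15), softmax)
xs = rand(10)

# 2. Convert it to a backend for a fast training loop.
#    mxnet needs the input shape; tf infers it.
mxmodel = mxnet(model, (10, 1))
tfmodel = tf(model)

# Both behave like the original model and return the same results.
mxmodel(xs)
tfmodel(xs)

# 3. Or drop down to the backend's native API when you need
#    backend-specific features (data providers, custom optimisers, ...).
using MXNet
Flux.loadmx()
native = mx.FeedForward(model)  # a plain mx.FeedForward, usable with MXNet's own tooling
```

The design choice the page is making: the high-level model description stays backend-agnostic, and conversion (`mxnet`/`tf`) or native construction (`mx.FeedForward`, `Tensor`) is an explicit, separate step.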