</script><link href="https://cdnjs.cloudflare.com/ajax/libs/normalize/4.2.0/normalize.min.css" rel="stylesheet" type="text/css"/><link href="https://fonts.googleapis.com/css?family=Lato|Roboto+Mono" rel="stylesheet" type="text/css"/><link href="https://cdnjs.cloudflare.com/ajax/libs/font-awesome/4.6.3/css/font-awesome.min.css" rel="stylesheet" type="text/css"/><link href="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/9.12.0/styles/default.min.css" rel="stylesheet" type="text/css"/><script>documenterBaseURL=".."</script><script src="https://cdnjs.cloudflare.com/ajax/libs/require.js/2.2.0/require.min.js" data-main="../assets/documenter.js"></script><script src="../siteinfo.js"></script><script src="../../versions.js"></script><link href="../assets/documenter.css" rel="stylesheet" type="text/css"/><link href="../../flux.css" rel="stylesheet" type="text/css"/></head><body><nav class="toc"><h1>Flux</h1><select id="version-selector" onChange="window.location.href=this.value" style="visibility: hidden"></select><form class="search" id="search-form" action="../search.html"><input id="search-query" name="q" type="text" placeholder="Search docs"/></form><ul><li><a class="toctext" href="../index.html">Home</a></li><li><span class="toctext">Building Models</span><ul><li><a class="toctext" href="basics.html">Basics</a></li><li><a class="toctext" href="recurrence.html">Recurrence</a></li><li class="current"><a class="toctext" href="layers.html">Model Reference</a><ul class="internal"><li><a class="toctext" href="#Layers-1">Layers</a></li><li><a class="toctext" href="#Recurrent-Cells-1">Recurrent Cells</a></li><li><a class="toctext" href="#Activation-Functions-1">Activation Functions</a></li></ul></li></ul></li><li><span class="toctext">Training Models</span><ul><li><a class="toctext" href="../training/optimisers.html">Optimisers</a></li><li><a class="toctext" href="../training/training.html">Training</a></li></ul></li><li><a class="toctext" href="../data/onehot.html">One-Hot 
Encoding</a></li><li><a class="toctext" href="../gpu.html">GPU Support</a></li><li><a class="toctext" href="../community.html">Community</a></li></ul></nav><article id="docs"><header><nav><ul><li>Building Models</li><li><a href="layers.html">Model Reference</a></li></ul><a class="edit-page" href="https://github.com/FluxML/Flux.jl/blob/master/docs/src/models/layers.md"><span class="fa"></span> Edit on GitHub</a></nav><hr/><div id="topbar"><span>Model Reference</span><a class="fa fa-bars" href="#"></a></div></header><h2><a class="nav-anchor" id="Layers-1" href="#Layers-1">Layers</a></h2><p>These core layers form the foundation of almost all neural networks.</p><section class="docstring"><div class="docstring-header"><a class="docstring-binding" id="Flux.Chain" href="#Flux.Chain"><code>Flux.Chain</code></a> — <span class="docstring-category">Type</span>.</div><div><pre><code class="language-none">Chain(layers...)</code></pre><p>Chain multiple layers / functions together, so that they are called in sequence on a given input.</p><pre><code class="language-none">m = Chain(x -> x^2, x -> x+1)
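# For example (a worked case, not from the original docstring): m(5) == (5^2) + 1 == 26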
m(x) == m[2](m[1](x))</code></pre><p><code>Chain</code> also supports indexing and slicing, e.g. <code>m[2]</code> or <code>m[1:end-1]</code>. <code>m[1:3](x)</code> will calculate the output of the first three layers.</p></div><a class="source-link" target="_blank" href="https://github.com/FluxML/Flux.jl/blob/fd249b773e6ba1b6f0846a6c2400475f348df87b/src/layers/basic.jl#L1-L16">source</a></section><section class="docstring"><div class="docstring-header"><a class="docstring-binding" id="Flux.Dense" href="#Flux.Dense"><code>Flux.Dense</code></a> — <span class="docstring-category">Type</span>.</div><div><pre><code class="language-none">Dense(in::Integer, out::Integer, σ = identity)</code></pre><p>Creates a traditional <code>Dense</code> layer with parameters <code>W</code> and <code>b</code>.</p><pre><code class="language-none">y = σ.(W * x .+ b)</code></pre><p>The input <code>x</code> must be a vector of length <code>in</code>, or a batch of vectors represented as an <code>in × N</code> matrix. The output <code>y</code> will be a vector of length <code>out</code>, or a batch represented as an <code>out × N</code> matrix.</p><pre><code class="language-none">julia> d = Dense(5, 2)
Dense(5, 2)

julia> d(rand(5))
Tracked 2-element Array{Float64,1}:
  0.00257447
 -0.00449443</code></pre></div><a class="source-link" target="_blank" href="https://github.com/FluxML/Flux.jl/blob/fd249b773e6ba1b6f0846a6c2400475f348df87b/src/layers/basic.jl#L38-L55">source</a></section><h2><a class="nav-anchor" id="Recurrent-Cells-1" href="#Recurrent-Cells-1">Recurrent Cells</a></h2><p>Much like the core layers above, these cells can be used to process sequence data (as well as other kinds of structured data).</p><pre><code class="language-none">RNN
LSTM
Recur</code></pre><h2><a class="nav-anchor" id="Activation-Functions-1" href="#Activation-Functions-1">Activation Functions</a></h2><p>Non-linearities that go between layers of your model. Most of these functions are defined in <a href="https://github.com/FluxML/NNlib.jl">NNlib</a> but are available by default in Flux.</p><p>Note that, unless otherwise stated, activation functions operate on scalars. To apply them to an array you can call <code>σ.(xs)</code>, <code>relu.(xs)</code> and so on.</p><section class="docstring"><div class="docstring-header"><a class="docstring-binding" id="NNlib.σ" href="#NNlib.σ"><code>NNlib.σ</code></a> — <span class="docstring-category">Function</span>.</div><div><pre><code class="language-none">σ(x) = 1 / (1 + exp(-x))</code></pre><p>Classic <a href="https://en.wikipedia.org/wiki/Sigmoid_function">sigmoid</a> activation function.</p></div><a class="source-link" target="_blank" href="https://github.com/FluxML/NNlib.jl/blob/e4b48c1f41b2786ae5d1efef1ba54ff82eeeb49c/src/activation.jl#L1-L6">source</a></section><section class="docstring"><div class="docstring-header"><a class="docstring-binding" id="NNlib.relu" href="#NNlib.relu"><code>NNlib.relu</code></a> — <span class="docstring-category">Function</span>.</div><div><pre><code class="language-none">relu(x) = max(0, x)</code></pre><p><a href="https://en.wikipedia.org/wiki/Rectifier_(neural_networks)">Rectified Linear Unit</a> activation function.</p></div><a class="source-link" target="_blank" href="https://github.com/FluxML/NNlib.jl/blob/e4b48c1f41b2786ae5d1efef1ba54ff82eeeb49c/src/activation.jl#L12-L17">source</a></section><section class="docstring"><div class="docstring-header"><a class="docstring-binding" id="NNlib.leakyrelu" href="#NNlib.leakyrelu"><code>NNlib.leakyrelu</code></a> — <span class="docstring-category">Function</span>.</div><div><pre><code class="language-none">leakyrelu(x) = max(0.01x, x)</code></pre><p>Leaky <a href="https://en.wikipedia.org/wiki/Rectifier_(neural_networks)">Rectified Linear Unit</a> 
activation function.</p><p>You can also specify the coefficient explicitly, e.g. <code>leakyrelu(x, 0.01)</code>.</p></div><a class="source-link" target="_blank" href="https://github.com/FluxML/NNlib.jl/blob/e4b48c1f41b2786ae5d1efef1ba54ff82eeeb49c/src/activation.jl#L20-L27">source</a></section><section class="docstring"><div class="docstring-header"><a class="docstring-binding" id="NNlib.elu" href="#NNlib.elu"><code>NNlib.elu</code></a> — <span class="docstring-category">Function</span>.</div><div><pre><code class="language-none">elu(x; α = 1) = x > 0 ? x : α * (exp(x) - one(x))</code></pre><p>Exponential Linear Unit activation function. See <a href="https://arxiv.org/abs/1511.07289">Fast and Accurate Deep Network Learning by Exponential Linear Units</a>.</p></div><a class="source-link" target="_blank" href="https://github.com/FluxML/NNlib.jl/blob/e4b48c1f41b2786ae5d1efef1ba54ff82eeeb49c/src/activation.jl#L30-L35">source</a></section><section class="docstring"><div class="docstring-header"><a class="docstring-binding" id="NNlib.swish" href="#NNlib.swish"><code>NNlib.swish</code></a> — <span class="docstring-category">Function</span>.</div><div><pre><code class="language-none">swish(x) = x * σ(x)</code></pre><p>Self-gated activation function.</p><p>See <a href="https://arxiv.org/pdf/1710.05941.pdf">Swish: a Self-Gated Activation Function</a>.</p></div><a class="source-link" target="_blank" href="https://github.com/FluxML/NNlib.jl/blob/e4b48c1f41b2786ae5d1efef1ba54ff82eeeb49c/src/activation.jl#L38-L44">source</a></section><footer><hr/><a class="previous" href="recurrence.html"><span class="direction">Previous</span><span class="title">Recurrence</span></a><a class="next" href="../training/optimisers.html"><span class="direction">Next</span><span class="title">Optimisers</span></a></footer></article></body></html>