<!DOCTYPE html>
<html lang="en"><head><meta charset="UTF-8"/><meta name="viewport" content="width=device-width, initial-scale=1.0"/><title>NNlib · Flux</title><script>(function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
(i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
})(window,document,'script','https://www.google-analytics.com/analytics.js','ga');
ga('create', 'UA-36890222-9', 'auto');
ga('send', 'pageview');
</script><link href="https://cdnjs.cloudflare.com/ajax/libs/normalize/4.2.0/normalize.min.css" rel="stylesheet" type="text/css"/><link href="https://fonts.googleapis.com/css?family=Lato|Roboto+Mono" rel="stylesheet" type="text/css"/><link href="https://cdnjs.cloudflare.com/ajax/libs/font-awesome/4.6.3/css/font-awesome.min.css" rel="stylesheet" type="text/css"/><link href="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/9.12.0/styles/default.min.css" rel="stylesheet" type="text/css"/><script>documenterBaseURL="../.."</script><script src="https://cdnjs.cloudflare.com/ajax/libs/require.js/2.2.0/require.min.js" data-main="../../assets/documenter.js"></script><script src="../../siteinfo.js"></script><script src="../../../versions.js"></script><link href="../../assets/documenter.css" rel="stylesheet" type="text/css"/><link href="../../assets/flux.css" rel="stylesheet" type="text/css"/></head><body><nav class="toc"><h1>Flux</h1><select id="version-selector" onChange="window.location.href=this.value" style="visibility: hidden"></select><form class="search" id="search-form" action="../../search/"><input id="search-query" name="q" type="text" placeholder="Search docs"/></form><ul><li><a class="toctext" href="../../">Home</a></li><li><span class="toctext">Building Models</span><ul><li><a class="toctext" href="../basics/">Basics</a></li><li><a class="toctext" href="../recurrence/">Recurrence</a></li><li><a class="toctext" href="../regularisation/">Regularisation</a></li><li><a class="toctext" href="../layers/">Model Reference</a></li><li class="current"><a class="toctext" href>NNlib</a><ul class="internal"><li><a class="toctext" href="#Activation-Functions-1">Activation Functions</a></li><li><a class="toctext" href="#Softmax-1">Softmax</a></li><li><a class="toctext" href="#Pooling-1">Pooling</a></li><li><a class="toctext" href="#Convolution-1">Convolution</a></li></ul></li></ul></li><li><span class="toctext">Handling Data</span><ul><li><a class="toctext" href="../../data/onehot/">One-Hot Encoding</a></li><li><a class="toctext" href="../../data/dataloader/">DataLoader</a></li></ul></li><li><span class="toctext">Training Models</span><ul><li><a class="toctext" href="../../training/optimisers/">Optimisers</a></li><li><a class="toctext" href="../../training/training/">Training</a></li></ul></li><li><a class="toctext" href="../../gpu/">GPU Support</a></li><li><a class="toctext" href="../../saving/">Saving & Loading</a></li><li><a class="toctext" href="../../ecosystem/">The Julia Ecosystem</a></li><li><a class="toctext" href="../../performance/">Performance Tips</a></li><li><a class="toctext" href="../../community/">Community</a></li></ul></nav><article id="docs"><header><nav><ul><li>Building Models</li><li><a href>NNlib</a></li></ul><a class="edit-page" href="https://github.com/FluxML/Flux.jl/blob/master/docs/src/models/nnlib.md"><span class="fa"></span> Edit on GitHub</a></nav><hr/><div id="topbar"><span>NNlib</span><a class="fa fa-bars" href="#"></a></div></header><h1><a class="nav-anchor" id="NNlib-1" href="#NNlib-1">NNlib</a></h1><p>Flux re-exports all of the functions exported by the <a href="https://github.com/FluxML/NNlib.jl">NNlib</a> package.</p><h2><a class="nav-anchor" id="Activation-Functions-1" href="#Activation-Functions-1">Activation Functions</a></h2><p>Non-linearities that go between layers of your model. Note that, unless otherwise stated, activation functions operate on scalars. 
To apply them to an array you can call <code>σ.(xs)</code>, <code>relu.(xs)</code> and so on.</p><section class="docstring"><div class="docstring-header"><a class="docstring-binding" id="NNlib.elu" href="#NNlib.elu"><code>NNlib.elu</code></a> — <span class="docstring-category">Function</span>.</div><div><div><pre><code class="language-julia">elu(x, α = 1) =
x > 0 ? x : α * (exp(x) - 1)</code></pre><p>Exponential Linear Unit activation function. See <a href="https://arxiv.org/abs/1511.07289">Fast and Accurate Deep Network Learning by Exponential Linear Units</a>. You can also specify the coefficient explicitly, e.g. <code>elu(x, 1)</code>.</p></div></div></section><section class="docstring"><div class="docstring-header"><a class="docstring-binding" id="NNlib.gelu" href="#NNlib.gelu"><code>NNlib.gelu</code></a> — <span class="docstring-category">Function</span>.</div><div><div><pre><code class="language-julia">gelu(x) = 0.5x*(1 + tanh(√(2/π)*(x + 0.044715x^3)))</code></pre><p><a href="https://arxiv.org/pdf/1606.08415.pdf">Gaussian Error Linear Unit</a> activation function.</p></div></div></section><section class="docstring"><div class="docstring-header"><a class="docstring-binding" id="NNlib.leakyrelu" href="#NNlib.leakyrelu"><code>NNlib.leakyrelu</code></a> — <span class="docstring-category">Function</span>.</div><div><div><pre><code class="language-julia">leakyrelu(x) = max(0.01x, x)</code></pre><p>Leaky <a href="https://en.wikipedia.org/wiki/Rectifier_(neural_networks)">Rectified Linear Unit</a> activation function. You can also specify the coefficient explicitly, e.g. <code>leakyrelu(x, 0.01)</code>.</p></div></div></section><section class="docstring"><div class="docstring-header"><a class="docstring-binding" id="NNlib.logcosh" href="#NNlib.logcosh"><code>NNlib.logcosh</code></a> — <span class="docstring-category">Function</span>.</div><div><div><pre><code class="language-julia">logcosh(x)</code></pre><p>Return <code>log(cosh(x))</code> which is computed in a numerically stable way.</p></div></div></section><section class="docstring"><div class="docstring-header"><a class="docstring-binding" id="NNlib.logsigmoid" href="#NNlib.logsigmoid"><code>NNlib.logsigmoid</code></a> — <span class="docstring-category">Function</span>.</div><div><div><pre><code class="language-julia">logσ(x)</code></pre><p>Return <code>log(σ(x))</code> which is computed in a numerically stable way.</p><pre><code class="language-none">julia> logσ(0)
-0.6931471805599453
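
# For illustration only: computing log(σ(x)) directly loses the answer for
# large x, because σ(100) rounds to exactly 1.0 in Float64, while logσ(100)
# keeps the true value (see the broadcast example below).
julia> log(σ(100))
0.0
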
julia> logσ.([-100, -10, 100])
3-element Array{Float64,1}:
-100.0
-10.000045398899218
-3.720075976020836e-44</code></pre></div></div></section><section class="docstring"><div class="docstring-header"><a class="docstring-binding" id="NNlib.sigmoid" href="#NNlib.sigmoid"><code>NNlib.sigmoid</code></a> — <span class="docstring-category">Function</span>.</div><div><div><pre><code class="language-julia">σ(x) = 1 / (1 + exp(-x))</code></pre><p>Classic <a href="https://en.wikipedia.org/wiki/Sigmoid_function">sigmoid</a> activation function.</p></div></div></section><section class="docstring"><div class="docstring-header"><a class="docstring-binding" id="NNlib.relu" href="#NNlib.relu"><code>NNlib.relu</code></a> — <span class="docstring-category">Function</span>.</div><div><div><pre><code class="language-julia">relu(x) = max(0, x)</code></pre><p><a href="https://en.wikipedia.org/wiki/Rectifier_(neural_networks)">Rectified Linear Unit</a> activation function.</p></div></div></section><section class="docstring"><div class="docstring-header"><a class="docstring-binding" id="NNlib.selu" href="#NNlib.selu"><code>NNlib.selu</code></a> — <span class="docstring-category">Function</span>.</div><div><div><pre><code class="language-julia">selu(x) = λ * (x ≥ 0 ? x : α * (exp(x) - 1))
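# λ and α below are fixed constants taken from the Self-Normalizing Neural
# Networks paper; they are chosen so that activations keep roughly zero mean
# and unit variance as they pass through the network.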
λ ≈ 1.0507
α ≈ 1.6733</code></pre><p>Scaled exponential linear units. See <a href="https://arxiv.org/pdf/1706.02515.pdf">Self-Normalizing Neural Networks</a>.</p></div></div></section><section class="docstring"><div class="docstring-header"><a class="docstring-binding" id="NNlib.softplus" href="#NNlib.softplus"><code>NNlib.softplus</code></a> — <span class="docstring-category">Function</span>.</div><div><div><pre><code class="language-julia">softplus(x) = log(exp(x) + 1)</code></pre><p>See <a href="http://proceedings.mlr.press/v15/glorot11a/glorot11a.pdf">Deep Sparse Rectifier Neural Networks</a>.</p></div></div></section><section class="docstring"><div class="docstring-header"><a class="docstring-binding" id="NNlib.softsign" href="#NNlib.softsign"><code>NNlib.softsign</code></a> — <span class="docstring-category">Function</span>.</div><div><div><pre><code class="language-julia">softsign(x) = x / (1 + |x|)</code></pre><p>See <a href="http://www.iro.umontreal.ca/~lisa/publications2/index.php/attachments/single/205">Quadratic Polynomials Learn Better Image Features</a>.</p></div></div></section><section class="docstring"><div class="docstring-header"><a class="docstring-binding" id="NNlib.swish" href="#NNlib.swish"><code>NNlib.swish</code></a> — <span class="docstring-category">Function</span>.</div><div><div><pre><code class="language-julia">swish(x) = x * σ(x)</code></pre><p>Self-gated activation function. See <a href="https://arxiv.org/pdf/1710.05941.pdf">Swish: a Self-Gated Activation Function</a>.</p></div></div></section><h2><a class="nav-anchor" id="Softmax-1" href="#Softmax-1">Softmax</a></h2><section class="docstring"><div class="docstring-header"><a class="docstring-binding" id="NNlib.softmax" href="#NNlib.softmax"><code>NNlib.softmax</code></a> — <span class="docstring-category">Function</span>.</div><div><div><pre><code class="language-julia">softmax(xs) = exp.(xs) ./ sum(exp.(xs))</code></pre><p><a href="https://en.wikipedia.org/wiki/Softmax_function">Softmax</a> takes log-probabilities (any real vector) and returns a probability distribution that sums to 1.</p><p>If given a matrix it will by default (<code>dims=1</code>) treat it as a batch of vectors, with each column independent. Keyword <code>dims=2</code> will instead treat rows independently, etc.</p><pre><code class="language-none">julia> softmax([1,2,3.])
3-element Array{Float64,1}:
0.0900306
0.244728
0.665241</code></pre></div></div></section><section class="docstring"><div class="docstring-header"><a class="docstring-binding" id="NNlib.logsoftmax" href="#NNlib.logsoftmax"><code>NNlib.logsoftmax</code></a> — <span class="docstring-category">Function</span>.</div><div><div><pre><code class="language-julia">logsoftmax(xs) = log.(exp.(xs) ./ sum(exp.(xs)))</code></pre><p>Computes the log of softmax in a more numerically stable way than directly taking <code>log.(softmax(xs))</code>. Commonly used in computing cross entropy loss.</p></div></div></section><h2><a class="nav-anchor" id="Pooling-1" href="#Pooling-1">Pooling</a></h2><div class="admonition warning"><div class="admonition-title">Missing docstring.</div><div class="admonition-text"><p>Missing docstring for <code>NNlib.maxpool</code>. Check Documenter's build log for details.</p></div></div><div class="admonition warning"><div class="admonition-title">Missing docstring.</div><div class="admonition-text"><p>Missing docstring for <code>NNlib.meanpool</code>. Check Documenter's build log for details.</p></div></div><h2><a class="nav-anchor" id="Convolution-1" href="#Convolution-1">Convolution</a></h2><div class="admonition warning"><div class="admonition-title">Missing docstring.</div><div class="admonition-text"><p>Missing docstring for <code>NNlib.conv</code>. Check Documenter's build log for details.</p></div></div><div class="admonition warning"><div class="admonition-title">Missing docstring.</div><div class="admonition-text"><p>Missing docstring for <code>NNlib.depthwiseconv</code>. Check Documenter's build log for details.</p></div></div><footer><hr/><a class="previous" href="../layers/"><span class="direction">Previous</span><span class="title">Model Reference</span></a><a class="next" href="../../data/onehot/"><span class="direction">Next</span><span class="title">One-Hot Encoding</span></a></footer></article></body></html>