build based on f8a18ef
parent c0e1497b26
commit 269f750375
@@ -104,7 +104,7 @@ Contributing & Help
 </a>
 </li>
 </ul>
-<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/2ae4a23efe49056eb55914c2984478747eb002c8/docs/src/contributing.md">
+<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/f8a18ef3c5c5ec9801b3aca1c321eaea07a5c004/docs/src/contributing.md">
 <span class="fa">
 
 </span>
@@ -107,7 +107,7 @@ Logistic Regression
 </a>
 </li>
 </ul>
-<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/2ae4a23efe49056eb55914c2984478747eb002c8/docs/src/examples/logreg.md">
+<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/f8a18ef3c5c5ec9801b3aca1c321eaea07a5c004/docs/src/examples/logreg.md">
 <span class="fa">
 
 </span>
@@ -110,7 +110,7 @@ Home
 </a>
 </li>
 </ul>
-<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/2ae4a23efe49056eb55914c2984478747eb002c8/docs/src/index.md">
+<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/f8a18ef3c5c5ec9801b3aca1c321eaea07a5c004/docs/src/index.md">
 <span class="fa">
 
 </span>
@@ -104,7 +104,7 @@ Internals
 </a>
 </li>
 </ul>
-<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/2ae4a23efe49056eb55914c2984478747eb002c8/docs/src/internals.md">
+<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/f8a18ef3c5c5ec9801b3aca1c321eaea07a5c004/docs/src/internals.md">
 <span class="fa">
 
 </span>
@@ -128,7 +128,7 @@ Model Building Basics
 </a>
 </li>
 </ul>
-<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/2ae4a23efe49056eb55914c2984478747eb002c8/docs/src/models/basics.md">
+<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/f8a18ef3c5c5ec9801b3aca1c321eaea07a5c004/docs/src/models/basics.md">
 <span class="fa">
 
 </span>
@@ -393,7 +393,7 @@ You may recognise this as being equivalent to
 </p>
 <pre><code class="language-julia">Chain(
   Affine(10, 20), σ
-  Affine(20, 15)), softmax</code></pre>
+  Affine(20, 15), softmax)</code></pre>
 <p>
 given that it's just a sequence of calls. For simple networks
 <code>Chain</code>
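The hunk above fixes a misplaced closing parenthesis: in the old snippet, `softmax` sat outside the `Chain(...)` call entirely. A minimal sketch of the corrected expression, assuming the `Affine` layer, `σ`, and `softmax` exported by Flux at this commit (the old `@net`-era API, not today's Flux); note the comma after `σ`, which the rendered snippet still omits:

```julia
# Sketch only: Affine, σ and softmax are assumed from the @net-era Flux API.
using Flux

model = Chain(
    Affine(10, 20), σ,       # comma after σ added here; the docs snippet lacks it
    Affine(20, 15), softmax) # softmax now correctly inside the Chain call

x = rand(10)   # Affine(10, 20) expects a 10-element input
model(x)       # 15-element output; softmax normalises it to sum to 1
```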
@@ -107,7 +107,7 @@ Debugging
 </a>
 </li>
 </ul>
-<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/2ae4a23efe49056eb55914c2984478747eb002c8/docs/src/models/debugging.md">
+<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/f8a18ef3c5c5ec9801b3aca1c321eaea07a5c004/docs/src/models/debugging.md">
 <span class="fa">
 
 </span>
@@ -107,7 +107,7 @@ Recurrence
 </a>
 </li>
 </ul>
-<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/2ae4a23efe49056eb55914c2984478747eb002c8/docs/src/models/recurrent.md">
+<a class="edit-page" href="https://github.com/MikeInnes/Flux.jl/tree/f8a18ef3c5c5ec9801b3aca1c321eaea07a5c004/docs/src/models/recurrent.md">
 <span class="fa">
 
 </span>
@@ -77,7 +77,7 @@ var documenterSearchIndex = {"docs": [
     "page": "Model Building Basics",
     "title": "Sub-Templates",
     "category": "section",
-    "text": "@net models can contain sub-models as well as just array parameters:@net type TLP\n first\n second\n function (x)\n l1 = σ(first(x))\n l2 = softmax(second(l1))\n end\nendJust as above, this is roughly equivalent to writing:type TLP\n first\n second\nend\n\nfunction (self::TLP)(x)\n l1 = σ(self.first)\n l2 = softmax(self.second(l1))\nendClearly, the first and second parameters are not arrays here, but should be models themselves, and produce a result when called with an input array x. The Affine layer fits the bill so we can instantiate TLP with two of them:model = TLP(Affine(10, 20),\n Affine(20, 15))\nx1 = rand(20)\nmodel(x1) # [0.057852,0.0409741,0.0609625,0.0575354 ...You may recognise this as being equivalent toChain(\n Affine(10, 20), σ\n Affine(20, 15)), softmaxgiven that it's just a sequence of calls. For simple networks Chain is completely fine, although the @net version is more powerful as we can (for example) reuse the output l1 more than once."
+    "text": "@net models can contain sub-models as well as just array parameters:@net type TLP\n first\n second\n function (x)\n l1 = σ(first(x))\n l2 = softmax(second(l1))\n end\nendJust as above, this is roughly equivalent to writing:type TLP\n first\n second\nend\n\nfunction (self::TLP)(x)\n l1 = σ(self.first)\n l2 = softmax(self.second(l1))\nendClearly, the first and second parameters are not arrays here, but should be models themselves, and produce a result when called with an input array x. The Affine layer fits the bill so we can instantiate TLP with two of them:model = TLP(Affine(10, 20),\n Affine(20, 15))\nx1 = rand(20)\nmodel(x1) # [0.057852,0.0409741,0.0609625,0.0575354 ...You may recognise this as being equivalent toChain(\n Affine(10, 20), σ\n Affine(20, 15), softmax)given that it's just a sequence of calls. For simple networks Chain is completely fine, although the @net version is more powerful as we can (for example) reuse the output l1 more than once."
 },
 
 {
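The changed "text" entry embeds the Sub-Templates example with literal \n escapes, which makes the one-character fix hard to see. Reconstructed as a readable sketch under the same assumptions as above (the `@net`-era Flux API, in which `@net type` builds a callable model type):

```julia
# Reconstruction of the flattened example in the search-index entry above.
using Flux

@net type TLP
    first
    second
    function (x)
        l1 = σ(first(x))
        l2 = softmax(second(l1))
    end
end

# first and second can be any sub-models; Affine fits the bill:
model = TLP(Affine(10, 20), Affine(20, 15))
x1 = rand(10)  # the indexed text uses rand(20), a likely typo given Affine(10, 20)
model(x1)      # e.g. [0.057852, 0.0409741, 0.0609625, 0.0575354, ...]
```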