From 269f750375b26aca937aa7832d8692d0d7b3c266 Mon Sep 17 00:00:00 2001
From: autodocs
Fix the Chain example in the Sub-Templates section of the model-building
docs: the closing parenthesis was misplaced, leaving softmax outside the
Chain(...) call.

-  Affine(20, 15)), softmax
+  Affine(20, 15), softmax)
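---
Note: the corrected snippet, written out as a runnable sketch. This
assumes the @net-era Flux API that these docs describe (Affine(in, out),
Chain, σ, softmax), so treat it as illustrative rather than canonical.

    using Flux  # assumes the historical @net-era Flux release

    # The fixed Chain keeps every layer and activation inside one call:
    mlp = Chain(
      Affine(10, 20), σ,
      Affine(20, 15), softmax)

    # Equivalent @net model from the Sub-Templates section:
    @net type TLP
      first
      second
      function (x)
        l1 = σ(first(x))
        l2 = softmax(second(l1))
      end
    end
    model = TLP(Affine(10, 20), Affine(20, 15))

    x1 = rand(10)  # Affine(10, 20) expects a length-10 input
    mlp(x1)        # length-15 output; softmax makes it sum to 1
    model(x1)      # same sequence of calls, so the same shape of result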
diff --git a/latest/search_index.js b/latest/search_index.js
index 39fc8930..4b6c9a8d 100644
--- a/latest/search_index.js
+++ b/latest/search_index.js
@@ -77,7 +77,7 @@ var documenterSearchIndex = {"docs": [
"page": "Model Building Basics",
"title": "Sub-Templates",
"category": "section",
- "text": "@net models can contain sub-models as well as just array parameters:@net type TLP\n first\n second\n function (x)\n l1 = σ(first(x))\n l2 = softmax(second(l1))\n end\nendJust as above, this is roughly equivalent to writing:type TLP\n first\n second\nend\n\nfunction (self::TLP)(x)\n l1 = σ(self.first)\n l2 = softmax(self.second(l1))\nendClearly, the first and second parameters are not arrays here, but should be models themselves, and produce a result when called with an input array x. The Affine layer fits the bill so we can instantiate TLP with two of them:model = TLP(Affine(10, 20),\n Affine(20, 15))\nx1 = rand(20)\nmodel(x1) # [0.057852,0.0409741,0.0609625,0.0575354 ...You may recognise this as being equivalent toChain(\n Affine(10, 20), σ\n Affine(20, 15)), softmaxgiven that it's just a sequence of calls. For simple networks Chain is completely fine, although the @net version is more powerful as we can (for example) reuse the output l1 more than once."
+ "text": "@net models can contain sub-models as well as just array parameters:@net type TLP\n first\n second\n function (x)\n l1 = σ(first(x))\n l2 = softmax(second(l1))\n end\nendJust as above, this is roughly equivalent to writing:type TLP\n first\n second\nend\n\nfunction (self::TLP)(x)\n l1 = σ(self.first)\n l2 = softmax(self.second(l1))\nendClearly, the first and second parameters are not arrays here, but should be models themselves, and produce a result when called with an input array x. The Affine layer fits the bill so we can instantiate TLP with two of them:model = TLP(Affine(10, 20),\n Affine(20, 15))\nx1 = rand(20)\nmodel(x1) # [0.057852,0.0409741,0.0609625,0.0575354 ...You may recognise this as being equivalent toChain(\n Affine(10, 20), σ\n Affine(20, 15), softmax)given that it's just a sequence of calls. For simple networks Chain is completely fine, although the @net version is more powerful as we can (for example) reuse the output l1 more than once."
},
{