diff --git a/latest/contributing.html b/latest/contributing.html
index 2ee0dba5..cc76f763 100644
--- a/latest/contributing.html
+++ b/latest/contributing.html
@@ -104,7 +104,7 @@ Contributing & Help
diff --git a/latest/examples/logreg.html b/latest/examples/logreg.html
index 2a320fa5..b6ac00c4 100644
--- a/latest/examples/logreg.html
+++ b/latest/examples/logreg.html
@@ -107,7 +107,7 @@ Logistic Regression
diff --git a/latest/index.html b/latest/index.html
index dc815f77..d4b59afb 100644
--- a/latest/index.html
+++ b/latest/index.html
@@ -110,7 +110,7 @@ Home
diff --git a/latest/internals.html b/latest/internals.html
index a292efab..4a854111 100644
--- a/latest/internals.html
+++ b/latest/internals.html
@@ -104,7 +104,7 @@ Internals
diff --git a/latest/models/basics.html b/latest/models/basics.html
index 45c898f2..663988b8 100644
--- a/latest/models/basics.html
+++ b/latest/models/basics.html
@@ -128,7 +128,7 @@ Model Building Basics
@@ -393,7 +393,7 @@ You may recognise this as being equivalent to

 Chain(
   Affine(10, 20), σ
-  Affine(20, 15)), softmax
+  Affine(20, 15), softmax)
 
 given that it's just a sequence of calls. For simple networks Chain
diff --git a/latest/models/debugging.html b/latest/models/debugging.html
index d1210a96..c309513e 100644
--- a/latest/models/debugging.html
+++ b/latest/models/debugging.html
@@ -107,7 +107,7 @@ Debugging
diff --git a/latest/models/recurrent.html b/latest/models/recurrent.html
index ed83912d..ba5ccfef 100644
--- a/latest/models/recurrent.html
+++ b/latest/models/recurrent.html
@@ -107,7 +107,7 @@ Recurrence
diff --git a/latest/search_index.js b/latest/search_index.js
index 39fc8930..4b6c9a8d 100644
--- a/latest/search_index.js
+++ b/latest/search_index.js
@@ -77,7 +77,7 @@ var documenterSearchIndex = {"docs": [
     "page": "Model Building Basics",
     "title": "Sub-Templates",
     "category": "section",
-    "text": "@net models can contain sub-models as well as just array parameters:@net type TLP\n first\n second\n function (x)\n l1 = σ(first(x))\n l2 = softmax(second(l1))\n end\nendJust as above, this is roughly equivalent to writing:type TLP\n first\n second\nend\n\nfunction (self::TLP)(x)\n l1 = σ(self.first)\n l2 = softmax(self.second(l1))\nendClearly, the first and second parameters are not arrays here, but should be models themselves, and produce a result when called with an input array x. The Affine layer fits the bill so we can instantiate TLP with two of them:model = TLP(Affine(10, 20),\n Affine(20, 15))\nx1 = rand(20)\nmodel(x1) # [0.057852,0.0409741,0.0609625,0.0575354 ...You may recognise this as being equivalent toChain(\n Affine(10, 20), σ\n Affine(20, 15)), softmaxgiven that it's just a sequence of calls. For simple networks Chain is completely fine, although the @net version is more powerful as we can (for example) reuse the output l1 more than once."
+    "text": "@net models can contain sub-models as well as just array parameters:@net type TLP\n first\n second\n function (x)\n l1 = σ(first(x))\n l2 = softmax(second(l1))\n end\nendJust as above, this is roughly equivalent to writing:type TLP\n first\n second\nend\n\nfunction (self::TLP)(x)\n l1 = σ(self.first)\n l2 = softmax(self.second(l1))\nendClearly, the first and second parameters are not arrays here, but should be models themselves, and produce a result when called with an input array x. The Affine layer fits the bill so we can instantiate TLP with two of them:model = TLP(Affine(10, 20),\n Affine(20, 15))\nx1 = rand(20)\nmodel(x1) # [0.057852,0.0409741,0.0609625,0.0575354 ...You may recognise this as being equivalent toChain(\n Affine(10, 20), σ\n Affine(20, 15), softmax)given that it's just a sequence of calls. For simple networks Chain is completely fine, although the @net version is more powerful as we can (for example) reuse the output l1 more than once."
 },
 
 {
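
For reference, below is a self-contained sketch in plain Julia of the TLP/Chain equivalence the patched passage describes. Everything in it is an assumed stand-in rather than the library's API: Affine, σ, softmax, and chain are minimal hypothetical definitions, not Flux's own. The sketch also corrects two slips that remain in the docs text even after this patch: the Chain example still needs a comma after σ to be valid Julia, and the struct version should apply σ(self.first(x)) rather than σ(self.first).

    σ(x) = 1 ./ (1 .+ exp.(-x))            # elementwise logistic sigmoid
    softmax(x) = exp.(x) ./ sum(exp.(x))   # normalise into a probability vector

    # Hypothetical stand-in for the docs' Affine layer: y = W*x .+ b.
    struct Affine
        W::Matrix{Float64}
        b::Vector{Float64}
    end
    Affine(in::Int, out::Int) = Affine(randn(out, in), randn(out))
    (a::Affine)(x) = a.W * x .+ a.b

    # The @net TLP model written as a plain callable struct, with the
    # missing argument restored: σ(self.first(x)), not σ(self.first).
    struct TLP
        first
        second
    end
    (self::TLP)(x) = softmax(self.second(σ(self.first(x))))

    # A minimal Chain stand-in: apply each function in sequence.
    chain(fs...) = x -> foldl((acc, f) -> f(acc), fs; init = x)

    first_layer  = Affine(10, 20)
    second_layer = Affine(20, 15)
    model   = TLP(first_layer, second_layer)
    chained = chain(first_layer, σ, second_layer, softmax)  # note the comma after σ

    x1 = rand(10)                     # 10 inputs, to match Affine(10, 20)
    @assert model(x1) ≈ chained(x1)   # both give the same 15-element output

Because the two forms share the same layer objects, they compute identical outputs, which is exactly the equivalence the docs assert; the @net version only pulls ahead when an intermediate result such as l1 must be reused.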