From 3a739023793f2036c8f08910619419f0efb29bd7 Mon Sep 17 00:00:00 2001
From: kleskjr
Date: Fri, 25 May 2018 13:54:17 +0200
Subject: [PATCH 1/2] Solves Issue #262

Makes running the basic examples smoother (Issue #262). crossentropy is
not exported by Flux, so either an explicit reference or an explicit
export is needed to run the examples.
---
 docs/src/models/regularisation.md | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/docs/src/models/regularisation.md b/docs/src/models/regularisation.md
index d4325a53..6abbcada 100644
--- a/docs/src/models/regularisation.md
+++ b/docs/src/models/regularisation.md
@@ -8,14 +8,14 @@ For example, say we have a simple regression.
 ```julia
 m = Dense(10, 5)
-loss(x, y) = crossentropy(softmax(m(x)), y)
+loss(x, y) = Flux.crossentropy(softmax(m(x)), y)
 ```
 
 We can regularise this by taking the (L2) norm of the parameters, `m.W`
 and `m.b`.
 
 ```julia
 penalty() = vecnorm(m.W) + vecnorm(m.b)
-loss(x, y) = crossentropy(softmax(m(x)), y) + penalty()
+loss(x, y) = Flux.crossentropy(softmax(m(x)), y) + penalty()
 ```
 
 When working with layers, Flux provides the `params` function to grab all
@@ -39,7 +39,7 @@ m = Chain(
   Dense(28^2, 128, relu),
   Dense(128, 32, relu),
   Dense(32, 10), softmax)
-loss(x, y) = crossentropy(m(x), y) + sum(vecnorm, params(m))
+loss(x, y) = Flux.crossentropy(m(x), y) + sum(vecnorm, params(m))
 loss(rand(28^2), rand(10))
 ```
 

From dd3af0c9f7a2fed73f6918ee92bebaee0640d942 Mon Sep 17 00:00:00 2001
From: kleskjr
Date: Tue, 5 Jun 2018 14:30:14 +0200
Subject: [PATCH 2/2] add 'using Flux: crossentropy'

Following the suggestion from MikeInnes to use 'using Flux: crossentropy'
instead of 'Flux.crossentropy'.
---
 docs/src/models/regularisation.md | 7 ++++---
 1 file changed, 4 insertions(+), 3 deletions(-)

diff --git a/docs/src/models/regularisation.md b/docs/src/models/regularisation.md
index 6abbcada..70d06348 100644
--- a/docs/src/models/regularisation.md
+++ b/docs/src/models/regularisation.md
@@ -7,15 +7,16 @@ add the result to the overall loss.
 For example, say we have a simple regression.
 ```julia
+using Flux: crossentropy
 m = Dense(10, 5)
-loss(x, y) = Flux.crossentropy(softmax(m(x)), y)
+loss(x, y) = crossentropy(softmax(m(x)), y)
 ```
 
 We can regularise this by taking the (L2) norm of the parameters, `m.W`
 and `m.b`.
 
 ```julia
 penalty() = vecnorm(m.W) + vecnorm(m.b)
-loss(x, y) = Flux.crossentropy(softmax(m(x)), y) + penalty()
+loss(x, y) = crossentropy(softmax(m(x)), y) + penalty()
 ```
 
 When working with layers, Flux provides the `params` function to grab all
@@ -39,7 +40,7 @@ m = Chain(
   Dense(28^2, 128, relu),
   Dense(128, 32, relu),
   Dense(32, 10), softmax)
-loss(x, y) = Flux.crossentropy(m(x), y) + sum(vecnorm, params(m))
+loss(x, y) = crossentropy(m(x), y) + sum(vecnorm, params(m))
 loss(rand(28^2), rand(10))
 ```
 
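Note for reviewers: with both patches applied, the final docs example should run as-is from a fresh session. Below is a quick smoke test of the end state — a minimal sketch assuming Flux of this era on Julia 0.6, where `vecnorm` is still in `Base` (on Julia 0.7+ one would use `norm` from `LinearAlgebra` instead):

```julia
# Smoke test for the patched regularisation example (assumes Flux on
# Julia 0.6, where `vecnorm` is available without an import).
using Flux
using Flux: crossentropy  # not exported by Flux, hence the explicit import

m = Chain(
  Dense(28^2, 128, relu),
  Dense(128, 32, relu),
  Dense(32, 10), softmax)

# L2-penalise every parameter at once; `params(m)` collects the
# weights and biases of all layers in the chain.
loss(x, y) = crossentropy(m(x), y) + sum(vecnorm, params(m))

loss(rand(28^2), rand(10))  # should return a scalar rather than error
```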