Solves Issue #262

Makes running the basic examples smoother (Issue #262).
`crossentropy` is not exported by Flux, so either a qualified reference (`Flux.crossentropy`) or an explicit export must be added for the examples to run.
This commit is contained in:
kleskjr 2018-05-25 13:54:17 +02:00 committed by GitHub
parent e92f840510
commit 3a73902379
1 changed file with 3 additions and 3 deletions

@@ -8,14 +8,14 @@ For example, say we have a simple regression.
 ```julia
 m = Dense(10, 5)
-loss(x, y) = crossentropy(softmax(m(x)), y)
+loss(x, y) = Flux.crossentropy(softmax(m(x)), y)
 ```
 We can regularise this by taking the (L2) norm of the parameters, `m.W` and `m.b`.
 ```julia
 penalty() = vecnorm(m.W) + vecnorm(m.b)
-loss(x, y) = crossentropy(softmax(m(x)), y) + penalty()
+loss(x, y) = Flux.crossentropy(softmax(m(x)), y) + penalty()
 ```
 When working with layers, Flux provides the `params` function to grab all
@@ -39,7 +39,7 @@ m = Chain(
   Dense(128, 32, relu),
   Dense(32, 10), softmax)
-loss(x, y) = crossentropy(m(x), y) + sum(vecnorm, params(m))
+loss(x, y) = Flux.crossentropy(m(x), y) + sum(vecnorm, params(m))
 loss(rand(28^2), rand(10))
 ```
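With the qualified calls in place, the second docs example runs end to end. A minimal sketch of the corrected snippet, assuming the Flux API of this era (2018), where `vecnorm` comes from Julia Base and `params` is exported by Flux; the `28^2` input size and layer widths are taken from the diff context:

```julia
using Flux

# Small MLP from the docs example.
m = Chain(
  Dense(28^2, 128, relu),
  Dense(128, 32, relu),
  Dense(32, 10), softmax)

# `crossentropy` is not exported by Flux, so the qualified call
# `Flux.crossentropy` avoids `UndefVarError: crossentropy not defined`.
# `sum(vecnorm, params(m))` adds an L2 penalty over every parameter array.
loss(x, y) = Flux.crossentropy(m(x), y) + sum(vecnorm, params(m))

loss(rand(28^2), rand(10))
```

The alternative mentioned in the commit message would be `using Flux: crossentropy` at the top of the example instead of qualifying each call.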