add 'using Flux: crossentropy'

Following the suggestion from MikeInnes to use 'using Flux: crossentropy' instead of 'Flux.crossentropy'.
This commit is contained in:
kleskjr 2018-06-05 14:30:14 +02:00 committed by GitHub
parent 3a73902379
commit dd3af0c9f7
1 changed file with 4 additions and 3 deletions


@@ -7,15 +7,16 @@ add the result to the overall loss.
 For example, say we have a simple regression.
 ```julia
+using Flux: crossentropy
 m = Dense(10, 5)
-loss(x, y) = Flux.crossentropy(softmax(m(x)), y)
+loss(x, y) = crossentropy(softmax(m(x)), y)
 ```
 We can regularise this by taking the (L2) norm of the parameters, `m.W` and `m.b`.
 ```julia
 penalty() = vecnorm(m.W) + vecnorm(m.b)
-loss(x, y) = Flux.crossentropy(softmax(m(x)), y) + penalty()
+loss(x, y) = crossentropy(softmax(m(x)), y) + penalty()
 ```
 When working with layers, Flux provides the `params` function to grab all
@@ -39,7 +40,7 @@ m = Chain(
   Dense(128, 32, relu),
   Dense(32, 10), softmax)
-loss(x, y) = Flux.crossentropy(m(x), y) + sum(vecnorm, params(m))
+loss(x, y) = crossentropy(m(x), y) + sum(vecnorm, params(m))
 loss(rand(28^2), rand(10))
 ```
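
For reference, a minimal sketch of the second example as it reads after this change. The first layer of the `Chain` is not shown in the hunk; `Dense(28^2, 128, relu)` is assumed here only because the input is `rand(28^2)`. Note also that `vecnorm` is the Julia 0.6-era name used by this version of the docs (it became `LinearAlgebra.norm` in Julia 1.0), so this assumes a 2018-era Flux/Julia setup:

```julia
using Flux
using Flux: crossentropy  # the explicit import this commit introduces

# Assumed first layer (not visible in the diff hunk); 28^2 matches the
# rand(28^2) input used in the final line.
m = Chain(
  Dense(28^2, 128, relu),
  Dense(128, 32, relu),
  Dense(32, 10), softmax)

# L2-regularised loss: crossentropy can now be called unqualified
# because of the `using Flux: crossentropy` import above.
loss(x, y) = crossentropy(m(x), y) + sum(vecnorm, params(m))

loss(rand(28^2), rand(10))
```

The import style only changes how the function is referenced; `Flux.crossentropy(...)` and the unqualified `crossentropy(...)` call the same function.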