rename crossentropy loss

This commit is contained in:
Mike J Innes 2017-10-17 17:36:18 +01:00
parent 1800c8f523
commit 6dff8ca8d3
3 changed files with 5 additions and 3 deletions

@@ -30,7 +30,7 @@ loss(x, y) = Flux.mse(m(x), y)
 Flux.train!(loss, data, opt)
 ```
-The loss will almost always be defined in terms of some *cost function* that measures the distance of the prediction `m(x)` from the target `y`. Flux has several of these built in, like `mse` for mean squared error or `logloss` for cross entropy loss, but you can calculate it however you want.
+The loss will almost always be defined in terms of some *cost function* that measures the distance of the prediction `m(x)` from the target `y`. Flux has several of these built in, like `mse` for mean squared error or `crossentropy` for cross entropy loss, but you can calculate it however you want.
 ## Datasets

@@ -2,5 +2,7 @@
 mse(ŷ, y) = sum((ŷ .- y).^2)/length(y)
-logloss(ŷ::AbstractVecOrMat, y::AbstractVecOrMat) =
+crossentropy(ŷ::AbstractVecOrMat, y::AbstractVecOrMat) =
   -sum(y .* log.(ŷ)) / size(y, 2)
+
+@deprecate logloss(x, y) crossentropy(x, y)
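For reference, the renamed loss can be exercised by hand. This is a minimal sketch, not part of the commit: `crossentropy` is redefined locally so the snippet runs without Flux, mirroring the definition in the diff above.

```julia
# Local copy of the definition from the diff, so this runs standalone.
crossentropy(ŷ::AbstractVecOrMat, y::AbstractVecOrMat) =
    -sum(y .* log.(ŷ)) / size(y, 2)

ŷ = [0.5 0.25; 0.5 0.75]   # predicted class probabilities (each column sums to 1)
y = [1.0 0.0; 0.0 1.0]     # one-hot targets, one column per sample
loss = crossentropy(ŷ, y)  # -(log(0.5) + log(0.75)) / 2 ≈ 0.4904
```

Note the normalisation by `size(y, 2)`: the loss is averaged over samples (columns), not over all matrix entries.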

@@ -15,7 +15,7 @@ gradtest(f, dims...) = gradtest(f, rand.(dims)...)
 @test gradtest(x -> softmax(x).*(1:3), (3,5))
 @test gradtest(Flux.mse, rand(5,5), rand(5, 5))
-@test gradtest(Flux.logloss, rand(5,5), rand(5, 5))
+@test gradtest(Flux.crossentropy, rand(5,5), rand(5, 5))
 @test gradtest(x -> x', rand(5))