rename crossentropy loss
parent 1800c8f523 · commit 6dff8ca8d3
@@ -30,7 +30,7 @@ loss(x, y) = Flux.mse(m(x), y)
 Flux.train!(loss, data, opt)
 ```
 
-The loss will almost always be defined in terms of some *cost function* that measures the distance of the prediction `m(x)` from the target `y`. Flux has several of these built in, like `mse` for mean squared error or `logloss` for cross entropy loss, but you can calculate it however you want.
+The loss will almost always be defined in terms of some *cost function* that measures the distance of the prediction `m(x)` from the target `y`. Flux has several of these built in, like `mse` for mean squared error or `crossentropy` for cross entropy loss, but you can calculate it however you want.
 
 ## Datasets
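For context on the docs change above: the paragraph now points at `crossentropy` instead of the old `logloss` name. A minimal sketch of how a loss built on it looks in practice (the model, sizes, and data below are illustrative, not part of this commit):

```julia
using Flux

# Illustrative model and dummy data, only to show the wiring the docs describe.
m = Dense(4, 2)                             # hypothetical single layer
x = rand(4, 8)                              # 8 samples with 4 features each
y = Flux.onehotbatch(rand(1:2, 8), 1:2)     # one-hot targets over 2 classes

# The loss is an ordinary function built on one of the built-in cost functions.
loss(x, y) = Flux.crossentropy(softmax(m(x)), y)

loss(x, y)   # a scalar that can be differentiated w.r.t. the model parameters
```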
@@ -2,5 +2,7 @@
 
 mse(ŷ, y) = sum((ŷ .- y).^2)/length(y)
 
-logloss(ŷ::AbstractVecOrMat, y::AbstractVecOrMat) =
+crossentropy(ŷ::AbstractVecOrMat, y::AbstractVecOrMat) =
   -sum(y .* log.(ŷ)) / size(y, 2)
+
+@deprecate logloss(x, y) crossentropy(x, y)
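A quick hand check of the renamed definition (plain Julia, no Flux needed; the numbers are made up). The added `@deprecate` line keeps `logloss(x, y)` callable but emits a warning steering callers to `crossentropy`:

```julia
# The same formula as in the hunk above: -sum(y .* log.(ŷ)) / size(y, 2)
crossentropy(ŷ, y) = -sum(y .* log.(ŷ)) / size(y, 2)

# Two samples (columns), three classes (rows): ŷ columns sum to 1, y is one-hot.
ŷ = [0.7 0.2;
     0.2 0.5;
     0.1 0.3]
y = [1.0 0.0;
     0.0 1.0;
     0.0 0.0]

crossentropy(ŷ, y)   # (-log(0.7) - log(0.5)) / 2 ≈ 0.525
```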
@@ -15,7 +15,7 @@ gradtest(f, dims...) = gradtest(f, rand.(dims)...)
 @test gradtest(x -> softmax(x).*(1:3), (3,5))
 
 @test gradtest(Flux.mse, rand(5,5), rand(5, 5))
-@test gradtest(Flux.logloss, rand(5,5), rand(5, 5))
+@test gradtest(Flux.crossentropy, rand(5,5), rand(5, 5))
 
 @test gradtest(x -> x', rand(5))
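The `gradtest` helper itself is not part of this diff; as a rough idea of the kind of check these `@test` lines perform, here is a generic finite-difference gradient check (a self-contained sketch, not Flux's actual test utility):

```julia
# Compare a supplied gradient against central finite differences, elementwise.
function finite_diff_check(f, grad, x; eps=1e-5, tol=1e-4)
    g = grad(x)
    for i in eachindex(x)
        xp = copy(x); xp[i] += eps
        xm = copy(x); xm[i] -= eps
        numeric = (f(xp) - f(xm)) / (2eps)
        abs(numeric - g[i]) < tol || return false
    end
    return true
end

# Example: sum of squares, whose gradient is 2x.
f(x) = sum(abs2, x)
gradf(x) = 2 .* x
finite_diff_check(f, gradf, rand(5, 5))   # true
```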