Changes to docs

thebhatman 2019-10-09 14:53:03 +05:30
parent ec886c8ce8
commit 96a23c295c
2 changed files with 8 additions and 7 deletions

@@ -66,7 +66,7 @@ LayerNorm
 GroupNorm
 ```

-## In-built loss functions:
+## Loss functions:
 ```@docs
 mse
 crossentropy

@@ -51,10 +51,10 @@ function normalise(x::AbstractArray; dims=1)
 end

 """
-    Kullback Leibler Divergence(KL Divergence)
+    kldivergence(ŷ, y)
 KLDivergence is a measure of how much one probability distribution is different from the other.
 It is always non-negative and zero only when both the distributions are equal everywhere.
-https://en.wikipedia.org/wiki/Kullback%E2%80%93Leibler_divergence
+[KL Divergence](https://en.wikipedia.org/wiki/Kullback%E2%80%93Leibler_divergence).
 """
 function kldivergence(ŷ, y)
   entropy = sum(y .* log.(y)) *1 //size(y,2)
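To make the docstring above concrete, here is a minimal sketch of the same computation, assuming the standard definition KL(y ‖ ŷ) = Σ y log(y / ŷ) split into an entropy term and a cross-entropy term; the `kldivergence_demo` name and the example distributions are illustrative only and not part of the commit.

```julia
# Sketch only: mirrors the entropy + cross-entropy split used above,
# with the same 1 // size(y, 2) per-observation (per-column) averaging.
function kldivergence_demo(ŷ, y)
  entropy = sum(y .* log.(y)) * 1 // size(y, 2)         # Σ y log y
  cross_entropy = -sum(y .* log.(ŷ)) * 1 // size(y, 2)  # −Σ y log ŷ
  return entropy + cross_entropy                        # ≥ 0, ≈ 0 when ŷ == y
end

y = [0.1 0.6 0.3]'   # one probability distribution per column
ŷ = [0.2 0.5 0.3]'
kldivergence_demo(ŷ, y)   # small positive value
kldivergence_demo(y, y)   # ≈ 0
```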
@@ -63,14 +63,15 @@ function kldivergence(ŷ, y)
 end

 """
-    Poisson Loss function
+    poisson(ŷ, y)
 Poisson loss function is a measure of how the predicted distribution diverges from the expected distribution.
-https://isaacchanghau.github.io/post/loss_functions/
+[Poisson Loss](https://peltarion.com/knowledge-center/documentation/modeling-view/build-an-ai-model/loss-functions/poisson).
 """
 poisson(ŷ, y) = sum(ŷ .- y .* log.(ŷ)) *1 // size(y,2)
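A short usage sketch for the Poisson loss defined on the line above; `poisson_demo` restates that one-liner so the snippet runs standalone, and the rates/counts below are made-up values.

```julia
# ŷ holds predicted (positive) rates, y the observed counts;
# the loss Σ (ŷ .- y .* log.(ŷ)) is averaged over the columns.
poisson_demo(ŷ, y) = sum(ŷ .- y .* log.(ŷ)) * 1 // size(y, 2)

ŷ = [0.5 2.0 3.5]'   # predicted rates for one observation
y = [1.0 2.0 3.0]'   # observed counts
poisson_demo(ŷ, y)                          # ≈ 1.55
poisson_demo(y, y) <= poisson_demo(ŷ, y)    # true: ŷ == y minimises the loss
```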
"""
Hinge Loss function
Measures the loss given the prediction and true labels y(containing 1 or -1). This is usually used for measuring whether two inputs are similar or dissimilar
hinge(, y)
Measures the loss given the prediction and true labels y(containing 1 or -1).
[Hinge Loss](https://en.wikipedia.org/wiki/Hinge_loss).
"""
hinge(, y) = sum(max.(0, 1 .- .* y)) *1 // size(y,2)
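Similarly, a hedged sketch of the hinge loss added above; `hinge_demo` restates the definition so it runs on its own, with made-up scores and ±1 labels.

```julia
# y must contain +1/-1 labels and ŷ raw scores; any prediction with
# margin ŷ * y ≥ 1 contributes zero loss.
hinge_demo(ŷ, y) = sum(max.(0, 1 .- ŷ .* y)) * 1 // size(y, 2)

y = [1.0 -1.0 1.0]   # true ±1 labels, one observation per column
ŷ = [2.5 -0.3 0.4]   # raw model scores
hinge_demo(ŷ, y)      # (0 + 0.7 + 0.6) / 3 ≈ 0.43
```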