Added reference links to loss functions

parent c4d12e57fe
commit 6f078857be
@@ -59,6 +59,7 @@ end
 Kullback-Leibler Divergence (KL Divergence)
 KL Divergence is a measure of how much one probability distribution differs from another.
 It is always non-negative, and zero only when both distributions are equal everywhere.
+https://en.wikipedia.org/wiki/Kullback%E2%80%93Leibler_divergence
 """
 function kldivergence(ŷ, y)
   entropy = sum(y .* log.(y)) *1 //size(y,2)
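Below is a hypothetical standalone sketch of the quantity the docstring describes, not this file's implementation (the rest of the function body is truncated in the hunk above); the name `kl` and the sample matrices are made up for illustration.

# Hypothetical sketch: KL(y ‖ ŷ) = Σ y·log(y) − Σ y·log(ŷ), averaged over
# the columns (the batch dimension), matching the docstring above.
kl(ŷ, y) = sum(y .* log.(y) .- y .* log.(ŷ)) * 1 // size(y, 2)

p = [0.1 0.4; 0.9 0.6]   # target distributions, one per column
q = [0.2 0.3; 0.8 0.7]   # predicted distributions

println(kl(q, p))   # small positive value: q differs from p
println(kl(p, p))   # 0.0: the divergence is zero when both distributions are equal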
@@ -69,6 +70,7 @@ end
 """
 Poisson Loss function
 Poisson loss function is a measure of how the predicted distribution diverges from the expected distribution.
+https://isaacchanghau.github.io/post/loss_functions/
 """
 poisson(ŷ, y) = sum(ŷ .- y .* log.(ŷ)) *1 // size(y,2)
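As a usage sketch, the one-line definition above can be exercised directly; the rates and counts below are made up for illustration, and the `1 // size(y,2)` factor (a Julia `Rational`) averages the loss over the batch columns.

# Usage sketch for the poisson loss above: ŷ holds predicted rates and
# y observed counts, one sample per column of each matrix.
poisson(ŷ, y) = sum(ŷ .- y .* log.(ŷ)) *1 // size(y,2)

y = [1.0 3.0; 2.0 0.0]   # observed counts, one sample per column
ŷ = [1.2 2.5; 1.8 0.4]   # predicted rates (must be positive)

println(poisson(ŷ, y))   # scalar loss; smaller when ŷ tracks y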