Added reference links to loss functions

thebhatman 2019-03-26 03:15:28 +05:30
parent c4d12e57fe
commit 6f078857be


@@ -59,6 +59,7 @@ end
    Kullback Leibler Divergence (KL Divergence)
KLDivergence is a measure of how much one probability distribution differs from another.
It is always non-negative, and zero only when the two distributions are equal everywhere.
+https://en.wikipedia.org/wiki/Kullback%E2%80%93Leibler_divergence
"""
function kldivergence(ŷ, y)
  entropy = sum(y .* log.(y)) * 1 // size(y, 2)
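
For context, a minimal usage sketch, not part of the diff above; it assumes a Flux build containing this code, with the loss reachable as Flux.kldivergence:

using Flux  # softmax is re-exported by Flux from NNlib

# Column-wise probability distributions: 5 classes, 10 samples.
y = softmax(randn(5, 10))   # target distribution
ŷ = softmax(randn(5, 10))   # predicted distribution

# KL(y ‖ ŷ) = Σ y .* log.(y ./ ŷ), averaged over the batch dimension;
# the function computes it as the y-entropy term plus crossentropy(ŷ, y).
loss = Flux.kldivergence(ŷ, y)

# The divergence of a distribution from itself is zero.
@assert isapprox(Flux.kldivergence(y, y), 0; atol = 1e-6)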
@@ -69,6 +70,7 @@ end
"""
Poisson Loss function
The Poisson loss function measures how much the predicted distribution diverges from the expected distribution.
+https://isaacchanghau.github.io/post/loss_functions/
"""
poisson(ŷ, y) = sum(ŷ .- y .* log.(ŷ)) * 1 // size(y, 2)
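
Likewise, a minimal usage sketch for the Poisson loss, not part of the diff; it assumes the loss is reachable as Flux.poisson:

using Flux

# Predicted and observed rates; entries must be positive, since log.(ŷ) is taken.
ŷ = rand(3, 8) .+ 0.1
y = rand(3, 8) .+ 0.1

# Σ (ŷ .- y .* log.(ŷ)) averaged over the batch: the Poisson negative
# log-likelihood of y under rate ŷ, up to the ŷ-independent log(y!) term.
loss = Flux.poisson(ŷ, y)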