diff --git a/src/layers/stateless.jl b/src/layers/stateless.jl
index 424db1df..aaefcee9 100644
--- a/src/layers/stateless.jl
+++ b/src/layers/stateless.jl
@@ -59,6 +59,7 @@ end
 Kullback Leibler Divergence(KL Divergence)
 KLDivergence is a measure of how much one probability distribution is different from the other.
 It is always non-negative and zero only when both the distributions are equal everywhere.
+https://en.wikipedia.org/wiki/Kullback%E2%80%93Leibler_divergence
 """
 function kldivergence(ŷ, y)
   entropy = sum(y .* log.(y)) *1 //size(y,2)
@@ -69,6 +70,7 @@ end
 """
 Poisson Loss function
 Poisson loss function is a measure of how the predicted distribution diverges from the expected distribution.
+https://isaacchanghau.github.io/post/loss_functions/
 """
 poisson(ŷ, y) = sum(ŷ .- y .* log.(ŷ)) *1 // size(y,2)
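For context on the code these links document: `kldivergence(ŷ, y)` computes KL(y ‖ ŷ) = Σ y·log(y/ŷ) by splitting it into Σ y·log(y) (the negative Shannon entropy of `y`) plus the cross entropy between `ŷ` and `y`, averaged over the `size(y, 2)` columns. A minimal standalone sketch follows; the `crossentropy` helper and the lines after `entropy` fall outside the hunk above, so their bodies here are assumptions mirroring Flux's definitions rather than part of this patch:

```julia
# Standalone sketch of `kldivergence`. The `crossentropy` helper and the two
# lines after `entropy` sit outside the hunk above; they are assumptions
# mirroring Flux's definitions, not part of this patch.
crossentropy(ŷ, y) = -sum(y .* log.(ŷ)) * 1 // size(y, 2)

function kldivergence(ŷ, y)
  entropy = sum(y .* log.(y)) * 1 // size(y, 2)  # Σ y·log(y): negative Shannon entropy
  cross_entropy = crossentropy(ŷ, y)             # assumed continuation of the function
  return entropy + cross_entropy
end

# Each column is a probability distribution (sums to 1).
y = [0.3 0.5; 0.7 0.5]  # target distributions
ŷ = [0.4 0.6; 0.6 0.4]  # predicted distributions

kldivergence(ŷ, y)  # > 0: the distributions differ
kldivergence(y, y)  # 0: vanishes exactly when both distributions are equal everywhere
```

Note the `*1 // size(y,2)` idiom in the patched source: `//` binds tighter than `*` in Julia, so this multiplies by the `Rational` `1//N` rather than attempting `Float64 // Int`.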
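Likewise, `poisson(ŷ, y)` is the Poisson negative log-likelihood with the `log(y!)` term dropped (it is constant in `ŷ`), again averaged over columns. A usage sketch with made-up rates, reusing the one-line definition from the diff:

```julia
poisson(ŷ, y) = sum(ŷ .- y .* log.(ŷ)) * 1 // size(y, 2)

# ŷ holds predicted event rates (must be positive), y the observed counts.
y = [1.0 4.0; 2.0 0.0]
ŷ = [0.9 4.2; 2.1 0.3]

poisson(ŷ, y)  # small when the predicted rates match the observed counts
```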