From 6f078857beda49e7f1d565cc7e4dded6c55db3d0 Mon Sep 17 00:00:00 2001
From: thebhatman
Date: Tue, 26 Mar 2019 03:15:28 +0530
Subject: [PATCH] Added reference links to loss functions

---
 src/layers/stateless.jl | 2 ++
 1 file changed, 2 insertions(+)

diff --git a/src/layers/stateless.jl b/src/layers/stateless.jl
index 424db1df..aaefcee9 100644
--- a/src/layers/stateless.jl
+++ b/src/layers/stateless.jl
@@ -59,6 +59,7 @@ end
 Kullback Leibler Divergence(KL Divergence)
 KLDivergence is a measure of how much one probability distribution is different from the other.
 It is always non-negative and zero only when both the distributions are equal everywhere.
+https://en.wikipedia.org/wiki/Kullback%E2%80%93Leibler_divergence
 """
 function kldivergence(ŷ, y)
   entropy = sum(y .* log.(y)) *1 //size(y,2)
@@ -69,6 +70,7 @@ end
 """
 Poisson Loss function
 Poisson loss function is a measure of how the predicted distribution diverges from the expected distribution.
+https://isaacchanghau.github.io/post/loss_functions/
 """
 poisson(ŷ, y) = sum(ŷ .- y .* log.(ŷ)) *1 // size(y,2)
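
For quick reference, here is a minimal usage sketch of the two loss functions whose docstrings this patch annotates. It assumes the definitions in src/layers/stateless.jl as shown in the hunks above (kldivergence computing the entropy term plus a cross-entropy term, i.e. KL(y || ŷ) per column, and poisson as written on the last context line), and that both are reachable under the Flux module; the numeric results are approximate.

    using Flux

    # Two discrete distributions over four outcomes (one column = one observation).
    y = [0.1, 0.2, 0.3, 0.4]        # target distribution
    ŷ = [0.25, 0.25, 0.25, 0.25]    # uniform prediction

    # KL divergence: non-negative, and zero only when both distributions agree everywhere.
    Flux.kldivergence(ŷ, y)   # ≈ 0.106
    Flux.kldivergence(y, y)   # ≈ 0.0

    # Poisson loss: sum(ŷ .- y .* log.(ŷ)), divided by the number of columns (observations).
    Flux.poisson(ŷ, y)        # ≈ 2.386

Calling through the module (Flux.kldivergence) sidesteps the question of whether these names are exported in a given Flux version; only the definitions shown in the patch are assumed.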