From 85a9493722442ad77ec575b57600cc4328f36c38 Mon Sep 17 00:00:00 2001
From: AzamatB
Date: Sat, 14 Mar 2020 15:42:00 +0600
Subject: [PATCH] Fix typo in the docstrings of AlphaDropout

---
 src/layers/normalise.jl | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/src/layers/normalise.jl b/src/layers/normalise.jl
index 163dac12..3828748f 100644
--- a/src/layers/normalise.jl
+++ b/src/layers/normalise.jl
@@ -69,7 +69,7 @@ A dropout layer. It is used in Self-Normalizing Neural Networks.
 (https://papers.nips.cc/paper/6698-self-normalizing-neural-networks.pdf)
 The AlphaDropout layer ensures that mean and variance of activations remains the same as before.
 
-Does nothing to the input once [`testmode!`](@ref) is false.
+Does nothing to the input once [`testmode!`](@ref) is true.
 """
 mutable struct AlphaDropout{F}
   p::F
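
For context, a minimal sketch of the behaviour the corrected docstring describes, assuming Flux is installed; the variable names `m`, `x`, and `y` are illustrative, not from the patch. Once the layer is put in test mode via `testmode!`, `AlphaDropout` passes its input through unchanged; the dropout is only applied in training mode.

```julia
using Flux

m = AlphaDropout(0.5)   # alpha dropout layer with drop probability p = 0.5 (illustrative value)
x = randn(Float32, 10)

testmode!(m)            # put the layer in test mode
@assert m(x) == x       # in test mode the layer does nothing to the input

trainmode!(m)           # switch back to training mode
y = m(x)                # alpha dropout is now applied, so y generally differs from x
```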