Fix typo in the docstrings of AlphaDropout

AzamatB 2020-03-14 15:42:00 +06:00 committed by GitHub
parent 5e09113586
commit 85a9493722
1 changed file with 1 addition and 1 deletion

@@ -69,7 +69,7 @@ A dropout layer. It is used in Self-Normalizing Neural Networks.
(https://papers.nips.cc/paper/6698-self-normalizing-neural-networks.pdf)
The AlphaDropout layer ensures that mean and variance of activations remains the same as before.
-Does nothing to the input once [`testmode!`](@ref) is false.
+Does nothing to the input once [`testmode!`](@ref) is true.
"""
mutable struct AlphaDropout{F}
p::F
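
To illustrate the corrected docstring, here is a minimal usage sketch assuming Flux's `AlphaDropout`, `testmode!`, and `trainmode!` as they existed around this commit (the exact API may differ in other Flux versions): in test mode the layer passes its input through unchanged, while in training mode it drops and rescales activations.

```julia
using Flux

# Hypothetical sketch of the behaviour described by the corrected docstring.
d = AlphaDropout(0.5f0)   # AlphaDropout layer with drop probability p = 0.5
x = randn(Float32, 10)

testmode!(d)              # force test mode
d(x) == x                 # true: in test mode the layer does nothing to the input

trainmode!(d)             # force training mode
y = d(x)                  # activations are dropped and rescaled so that mean and
                          # variance stay roughly the same as before
```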