learning rate as default arg

Mike J Innes 2017-10-18 17:07:49 +01:00
parent e82428bb83
commit 07ad7cfa40


@@ -43,7 +43,7 @@ Nesterov(ps, ρ; decay = 0) =
 optimiser. Parameters other than learning rate don't need tuning. Often a good
 choice for recurrent networks.
 """
-RMSProp(ps; η = 0.001, ρ = 0.9, ϵ = 1e-8, decay = 0) =
+RMSProp(ps, η = 0.001; ρ = 0.9, ϵ = 1e-8, decay = 0) =
   optimiser(ps, p -> rmsprop(p; η = η, ρ = ρ, ϵ = ϵ), p -> invdecay(p, decay), p -> descent(p, 1))
 """