diff --git a/docs/src/training/optimisers.md b/docs/src/training/optimisers.md
index 5ed083ee..25cfd3ce 100644
--- a/docs/src/training/optimisers.md
+++ b/docs/src/training/optimisers.md
@@ -140,3 +140,16 @@ ExpDecay
 InvDecay
 WeightDecay
 ```
+
+## Gradient Clipping
+
+Gradient clipping is useful when training recurrent neural networks, which are prone to the exploding gradient problem. An example usage is
+
+```julia
+opt = Optimiser(ClipValue(1e-3), ADAM(1e-3))
+```
+
+```@docs
+ClipValue
+ClipNorm
+```
\ No newline at end of file