From d0e8a9ff71fabc4a32d6606908236f79e44ca4a2 Mon Sep 17 00:00:00 2001
From: Adarshkumar712
Date: Tue, 3 Mar 2020 22:07:05 +0530
Subject: [PATCH] Updated activation functions in NNlib doc

---
 docs/src/models/nnlib.md | 12 +++++++++++-
 1 file changed, 11 insertions(+), 1 deletion(-)

diff --git a/docs/src/models/nnlib.md b/docs/src/models/nnlib.md
index 6dbfd4f4..7ede2682 100644
--- a/docs/src/models/nnlib.md
+++ b/docs/src/models/nnlib.md
@@ -7,17 +7,27 @@ Flux re-exports all of the functions exported by the [NNlib](https://github.com/
 Non-linearities that go between layers of your model. Note that, unless otherwise stated, activation functions operate on scalars. To apply them to an array you can call `σ.(xs)`, `relu.(xs)` and so on.
 
 ```@docs
+NNlib.celu
 NNlib.elu
 NNlib.gelu
+NNlib.hardsigmoid
+NNlib.hardtanh
 NNlib.leakyrelu
+NNlib.lisht
 NNlib.logcosh
 NNlib.logsigmoid
+NNlib.mish
 NNlib.relu
+NNlib.relu6
+NNlib.rrelu
 NNlib.selu
 NNlib.sigmoid
 NNlib.softplus
+NNlib.softshrink
 NNlib.softsign
 NNlib.swish
+NNlib.tanhshrink
+NNlib.trelu
 ```
 
 ## Softmax
@@ -48,4 +58,4 @@ NNlib.batched_mul
 NNlib.batched_mul!
 NNlib.batched_adjoint
 NNlib.batched_transpose
-```
\ No newline at end of file
+```
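
Note: the doc paragraph touched by the first hunk says activations operate on scalars and are applied to arrays via broadcasting. A minimal sketch of that usage, assuming only that NNlib is loaded; the input array `xs` is made up for illustration:

```julia
using NNlib  # provides the activation functions documented in this patch

xs = randn(Float32, 3, 4)  # hypothetical input array

# Activations are scalar functions; broadcast with `.` to apply them elementwise.
relu.(xs)   # zeroes out negative entries
mish.(xs)   # one of the activations added to the docs above
celu.(xs)   # uses celu's default α = 1
```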
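Likewise, a small sketch of the batched operations listed in the second hunk, assuming NNlib's batched API as of early 2020; the array sizes are made up:

```julia
using NNlib

A = randn(Float32, 3, 4, 5)  # 5 matrices of size 3×4, stacked along dim 3
B = randn(Float32, 4, 2, 5)  # 5 matrices of size 4×2

C  = batched_mul(A, B)        # size (3, 2, 5): one matrix product per batch slice
At = batched_transpose(A)     # lazy transpose of each slice, size (4, 3, 5)
```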