Updated activation functions in NNlib doc

Adarshkumar712 2020-03-03 22:07:05 +05:30
parent 4acc907723
commit d0e8a9ff71
1 changed file with 11 additions and 1 deletion


@@ -7,17 +7,27 @@ Flux re-exports all of the functions exported by the [NNlib](https://github.com/
 Non-linearities that go between layers of your model. Note that, unless otherwise stated, activation functions operate on scalars. To apply them to an array you can call `σ.(xs)`, `relu.(xs)` and so on.
 
 ```@docs
+NNlib.celu
 NNlib.elu
 NNlib.gelu
+NNlib.hardsigmoid
+NNlib.hardtanh
 NNlib.leakyrelu
+NNlib.lisht
 NNlib.logcosh
 NNlib.logsigmoid
+NNlib.mish
 NNlib.relu
+NNlib.relu6
+NNlib.rrelu
 NNlib.selu
 NNlib.sigmoid
 NNlib.softplus
+NNlib.softshrink
 NNlib.softsign
 NNlib.swish
+NNlib.tanhshrink
+NNlib.trelu
 ```
 
 ## Softmax
@@ -48,4 +58,4 @@ NNlib.batched_mul
 NNlib.batched_mul!
 NNlib.batched_adjoint
 NNlib.batched_transpose
-```
+```
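
Not part of the commit itself, but a minimal sketch of what the paragraph in the first hunk describes: these activations are scalar functions, so they are applied to arrays via broadcasting with the dot syntax. The sample values below are made up for illustration; the function names are the ones the updated doc lists.

```julia
using NNlib  # Flux re-exports these names

xs = [-2.0, -0.5, 0.0, 0.5, 2.0]  # arbitrary sample inputs

relu.(xs)       # element-wise max(0, x)
σ.(xs)          # element-wise logistic sigmoid
leakyrelu.(xs)  # element-wise leaky ReLU
mish.(xs)       # one of the activations this commit adds to the docs
```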
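
Likewise, a hedged sketch of the batched operations the second hunk touches, assuming an NNlib version that provides them (as the doc references imply): `batched_mul` multiplies 3-d arrays slice by slice along the last dimension, and `batched_transpose` lazily transposes each slice.

```julia
using NNlib

A = rand(2, 3, 5)            # five 2×3 matrices, batched along dim 3
B = rand(3, 4, 5)            # five 3×4 matrices
C = NNlib.batched_mul(A, B)  # one matrix product per batch slice
size(C)                      # (2, 4, 5)

NNlib.batched_mul(NNlib.batched_transpose(A), A)  # five 3×3 Gram matrices
```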