Updated activation functions in NNlib doc

Adarshkumar712 2020-03-03 22:07:05 +05:30
parent 4acc907723
commit d0e8a9ff71
1 changed file with 11 additions and 1 deletion


@@ -7,17 +7,27 @@ Flux re-exports all of the functions exported by the [NNlib](https://github.com/
Non-linearities that go between layers of your model. Note that, unless otherwise stated, activation functions operate on scalars. To apply them to an array, call `σ.(xs)`, `relu.(xs)`, and so on.
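
For illustration, a minimal sketch of the broadcast usage described above (the example array `xs` is hypothetical):

```julia
using NNlib  # these functions are also re-exported by Flux

xs = [-2.0, -0.5, 0.0, 0.5, 2.0]

σ.(xs)     # elementwise sigmoid via the broadcast dot
relu.(xs)  # elementwise rectified linear unit
```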
```@docs
NNlib.celu
NNlib.elu
NNlib.gelu
NNlib.hardsigmoid
NNlib.hardtanh
NNlib.leakyrelu
NNlib.lisht
NNlib.logcosh
NNlib.logsigmoid
NNlib.mish
NNlib.relu
NNlib.relu6
NNlib.rrelu
NNlib.selu
NNlib.sigmoid
NNlib.softplus
NNlib.softshrink
NNlib.softsign
NNlib.swish
NNlib.tanhshrink
NNlib.trelu
```
## Softmax