1069: Updated activation functions in NNlib doc r=dhairyagandhi96 a=AdarshKumar712



Co-authored-by: Adarshkumar712 <Adarshkumar712.ak@gmail.com>
bors[bot] 2020-03-03 12:39:03 +00:00 committed by GitHub
commit 19a034b215
1 changed file with 11 additions and 1 deletion


@@ -7,17 +7,27 @@ Flux re-exports all of the functions exported by the [NNlib](https://github.com/
Non-linearities that go between layers of your model. Note that, unless otherwise stated, activation functions operate on scalars; to apply them to an array, broadcast them, e.g. `σ.(xs)`, `relu.(xs)`, and so on (see the short example after the list below).
```@docs
NNlib.celu
NNlib.elu
NNlib.gelu
NNlib.hardsigmoid
NNlib.hardtanh
NNlib.leakyrelu
NNlib.lisht
NNlib.logcosh
NNlib.logsigmoid
NNlib.mish
NNlib.relu
NNlib.relu6
NNlib.rrelu
NNlib.selu
NNlib.sigmoid
NNlib.softplus
NNlib.softshrink
NNlib.softsign
NNlib.swish
NNlib.tanhshrink
NNlib.trelu
```
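As a quick illustration of the broadcasting described above (a minimal sketch only; the array `xs` is an arbitrary example input, not part of this diff):

```julia
using NNlib

xs = randn(Float32, 3, 4)

# Activation functions act on scalars, so broadcast them over an array:
ys = relu.(xs)       # elementwise max(0, x)
zs = NNlib.σ.(xs)    # elementwise logistic sigmoid
```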
## Softmax
@@ -48,4 +58,4 @@ NNlib.batched_mul
NNlib.batched_mul!
NNlib.batched_adjoint
NNlib.batched_transpose
```
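For the batched operations listed in the second hunk, a minimal sketch of how `NNlib.batched_mul` behaves on 3-d arrays (the array sizes here are illustrative assumptions, not taken from the diff):

```julia
using NNlib

# batched_mul multiplies matching matrix slices: each of the 5 "pages"
# of A is multiplied by the corresponding page of B.
A = rand(Float32, 2, 3, 5)
B = rand(Float32, 3, 4, 5)
C = batched_mul(A, B)    # size(C) == (2, 4, 5)
```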