Updated activation functions in NNlib doc
parent 4acc907723
commit d0e8a9ff71
@@ -7,17 +7,27 @@ Flux re-exports all of the functions exported by the [NNlib](https://github.com/
 Non-linearities that go between layers of your model. Note that, unless otherwise stated, activation functions operate on scalars. To apply them to an array you can call `σ.(xs)`, `relu.(xs)` and so on.
 
 ```@docs
+NNlib.celu
 NNlib.elu
 NNlib.gelu
+NNlib.hardsigmoid
+NNlib.hardtanh
 NNlib.leakyrelu
+NNlib.lisht
 NNlib.logcosh
 NNlib.logsigmoid
+NNlib.mish
 NNlib.relu
+NNlib.relu6
+NNlib.rrelu
 NNlib.selu
 NNlib.sigmoid
 NNlib.softplus
+NNlib.softshrink
 NNlib.softsign
 NNlib.swish
+NNlib.tanhshrink
+NNlib.trelu
 ```
 
 ## Softmax
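As a side note, here is a minimal sketch of the broadcast usage mentioned in the prose line of the hunk above, assuming Flux is installed (it re-exports the NNlib activations); the input array `xs` is made up for illustration:

```julia
using Flux   # re-exports σ, relu, leakyrelu, … from NNlib

xs = randn(Float32, 4)     # hypothetical example input

σ.(xs)                     # element-wise sigmoid via broadcasting
relu.(xs)                  # element-wise ReLU
leakyrelu.(xs, 0.1f0)      # leakyrelu also accepts an optional slope argument
```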
@@ -48,4 +58,4 @@ NNlib.batched_mul
 NNlib.batched_mul!
 NNlib.batched_adjoint
 NNlib.batched_transpose
 ```
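For context, a hedged sketch of how the batched operations documented in this hunk fit together; the array sizes are made up, and it assumes a recent NNlib where `batched_mul` multiplies matrices slice-wise along the third dimension:

```julia
using NNlib

A = randn(Float32, 2, 3, 4)   # batch of four 2×3 matrices
B = randn(Float32, 3, 5, 4)   # batch of four 3×5 matrices

C = batched_mul(A, B)         # size (2, 5, 4): one matrix product per slice

# batched_transpose / batched_adjoint wrap an array so its first two
# dimensions act transposed (or adjointed) inside batched_mul:
D = batched_mul(batched_transpose(A), C)   # size (3, 5, 4)
```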