nnlib docs
parent 12944ae125
commit 190f48a709
@@ -1,6 +1,6 @@
-using Documenter, Flux
+using Documenter, Flux, NNlib
 
-makedocs(modules=[Flux],
+makedocs(modules=[Flux, NNlib],
   doctest = false,
   format = :html,
   analytics = "UA-36890222-9",
@@ -6,3 +6,18 @@ These core layers form the foundation of almost all neural networks.
 Chain
 Dense
 ```
+
+## Activation Functions
+
+Non-linearities that go between layers of your model. Most of these functions are defined in [NNlib](https://github.com/FluxML/NNlib.jl) but are available by default in Flux.
+
+Note that, unless otherwise stated, activation functions operate on scalars. To apply them to an array you can call `σ.(xs)`, `relu.(xs)` and so on.
+
+```@docs
+σ
+relu
+leakyrelu
+elu
+swish
+softmax
+```
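For reference, a minimal sketch of the broadcasting convention the new docs section describes, assuming Flux makes these NNlib functions available by default as the text says (the input `xs` is a made-up example, not part of the commit):

```julia
using Flux  # σ, relu, leakyrelu, elu, swish, softmax are available by default

xs = randn(5)  # hypothetical example input vector

σ.(xs)                # scalar sigmoid, applied elementwise via dot-broadcasting
relu.(xs)             # elementwise rectified linear unit
leakyrelu.(xs, 0.01)  # optional second argument sets the negative-side slope

softmax(xs)           # an "otherwise stated" case: softmax acts on the whole
                      # array, normalising it so the outputs sum to one,
                      # so no dot is needed
```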