nnlib docs
commit 190f48a709
parent 12944ae125
@@ -1,6 +1,6 @@
-using Documenter, Flux
+using Documenter, Flux, NNlib
 
-makedocs(modules=[Flux],
+makedocs(modules=[Flux, NNlib],
          doctest = false,
          format = :html,
          analytics = "UA-36890222-9",
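For context on the hunk above: Documenter only resolves `@docs` entries for modules listed in `modules`, so adding NNlib there is what lets the activation-function docstrings below render. A minimal sketch of the idea, assuming a standalone build script (the `sitename` keyword is illustrative, not part of this commit):

```julia
# Build script sketch: docstrings from every module in `modules`
# become available to `@docs` blocks in the Markdown sources.
using Documenter, Flux, NNlib

makedocs(modules = [Flux, NNlib],   # NNlib added so σ, relu, etc. resolve
         doctest = false,           # don't run doctests during the build
         format = :html,
         sitename = "Flux")         # illustrative; the real call has more options
```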
@@ -6,3 +6,18 @@ These core layers form the foundation of almost all neural networks.
 Chain
 Dense
 ```
+
+## Activation Functions
+
+Non-linearities that go between layers of your model. Most of these functions are defined in [NNlib](https://github.com/FluxML/NNlib.jl) but are available by default in Flux.
+
+Note that, unless otherwise stated, activation functions operate on scalars. To apply them to an array you can call `σ.(xs)`, `relu.(xs)` and so on.
+
+```@docs
+σ
+relu
+leakyrelu
+elu
+swish
+softmax
+```
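As a usage illustration of the broadcasting convention this new section describes (not part of the commit; `xs` is a made-up input), a minimal sketch:

```julia
using Flux  # Flux re-exports the NNlib activations documented above

xs = randn(5)     # example input vector

# Activations are scalar functions, so apply them to arrays
# elementwise via broadcasting:
σ.(xs)            # logistic sigmoid of each element
relu.(xs)         # max(0, x) elementwise
leakyrelu.(xs)    # relu with a small negative slope below zero

# softmax is the "unless otherwise stated" exception: it maps a whole
# array to a probability distribution, so call it without the dot:
softmax(xs)       # positive entries summing to 1
```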
@@ -50,8 +50,8 @@ as an `in × N` matrix. The out `y` will be a vector or batch of length `out`.
 
 julia> d(rand(5))
 Tracked 2-element Array{Float64,1}:
-0.00257447
--0.00449443
+  0.00257447
+  -0.00449443
 """
 struct Dense{F,S,T}
   σ::F
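As a usage illustration of the batching behaviour described in the docstring context above (a sketch, not part of the commit; the layer sizes follow the docstring's `d(rand(5))` example):

```julia
using Flux

d = Dense(5, 2)   # 5 inputs -> 2 outputs, identity activation by default

x = rand(5)       # one input of length `in`
y = d(x)          # length-2 output, as in the docstring example

X = rand(5, 10)   # a batch of 10 inputs as a 5 × 10 matrix
Y = d(X)          # a 2 × 10 matrix: one output column per input column
```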