fix docs
parent 838047f708
commit c76b9c7e2c

@@ -30,13 +30,25 @@ GRU
Flux.Recur
```
## Other General Purpose Layers
These are marginally more obscure than the Basic Layers.
But in contrast to the layers described in the other sections, they are not readily grouped around a particular purpose (e.g. CNNs or RNNs).
```@docs
Maxout
```
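As a rough sketch of what a maxout unit computes (following Goodfellow et al., not Flux's actual implementation), the output is the elementwise maximum over several affine maps of the input. All names below are illustrative:

```julia
# Sketch of the maxout idea: apply k affine transformations to the
# input and take the elementwise maximum of their outputs.
# These helper names are illustrative, not Flux internals.
function maxout_apply(Ws, bs, x)
    maps = [W * x .+ b for (W, b) in zip(Ws, bs)]  # k affine outputs
    return reduce((a, b) -> max.(a, b), maps)      # elementwise maximum
end

# Two 2×2 affine maps: identity and negation, both with zero bias.
Ws = [[1.0 0.0; 0.0 1.0], [-1.0 0.0; 0.0 -1.0]]
bs = [[0.0, 0.0], [0.0, 0.0]]
maxout_apply(Ws, bs, [3.0, -2.0])  # == [3.0, 2.0]
```

With these two maps the unit computes an elementwise absolute value, which hints at why maxout can learn a wide family of piecewise-linear activations.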
# Normalisation & Regularisation
These layers don't affect the structure of the network but may improve training times or reduce overfitting.
```@docs
Flux.testmode!
BatchNorm
Dropout
LayerNorm
```
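As a minimal sketch of the train/test distinction these layers share (this is not Flux's implementation, and the function name is illustrative): during training, dropout zeroes each activation with probability `p` and rescales the survivors by `1/(1 - p)`; in test mode it is the identity.

```julia
# Illustrative dropout semantics. In training mode each entry is kept
# with probability 1 - p and rescaled so the expected value is unchanged;
# in test mode the input passes through untouched.
function dropout_sketch(x, p; active = true)
    active || return x                  # test mode: identity
    mask = rand(length(x)) .> p         # keep each entry with prob 1 - p
    return (x .* mask) ./ (1 - p)       # rescale surviving entries
end

dropout_sketch(ones(4), 0.5; active = false)  # == [1.0, 1.0, 1.0, 1.0]
```

The rescaling is why a dropout layer must be told which mode it is in; forgetting to switch to test mode leaves the random masking active at inference time.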
## Activation Functions
Non-linearities that go between layers of your model. Most of these functions are defined in [NNlib](https://github.com/FluxML/NNlib.jl) but are available by default in Flux.
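For reference, two of the most common activations can be written in a few lines of plain Julia; NNlib ships optimized versions of these, so the `_sketch` names below are illustrative only:

```julia
# Reference definitions of two common activations. NNlib provides the
# real `relu` and `σ`; these sketches just show the underlying math.
relu_sketch(x) = max(zero(x), x)          # rectified linear unit
sigmoid_sketch(x) = 1 / (1 + exp(-x))     # logistic sigmoid

relu_sketch(-3.0)    # == 0.0
sigmoid_sketch(0.0)  # == 0.5
```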