This commit is contained in:
Lyndon White 2019-02-28 11:35:28 +00:00
parent 838047f708
commit c76b9c7e2c


@ -30,13 +30,25 @@ GRU
Flux.Recur
```

## Other General Purpose Layers
These are marginally more obscure than the Basic Layers.
But in contrast to the layers described in the other sections, they are not readily grouped around a particular purpose (e.g. CNNs or RNNs).

```@docs
MaxOut
```
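As a minimal sketch of how such a layer might be used (assuming the `MaxOut(f, n_alts)` constructor described in the docstring above, which builds `n_alts` internal layers by calling `f()` and takes the element-wise maximum over their outputs — the layer sizes here are illustrative only):

```julia
using Flux

# Build a MaxOut layer over 4 internal Dense layers.
# Each Dense maps a length-10 input to a length-5 output;
# the MaxOut output is the element-wise max across the 4 alternatives.
m = MaxOut(() -> Dense(10, 5), 4)

x = rand(Float32, 10)
y = m(x)   # length-5 output vector
```

Because the maximum is taken over several learned linear pieces, a MaxOut layer can approximate an arbitrary convex activation function.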
## Normalisation & Regularisation
These layers don't affect the structure of the network but may improve training times or reduce overfitting.
```@docs
Flux.testmode!
BatchNorm
Dropout
LayerNorm
```
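For instance, these layers can be slotted into a `Chain` like any other, and `Flux.testmode!` switches their behaviour for inference (a hedged sketch; the layer sizes and dropout probability are illustrative only):

```julia
using Flux

# Dropout and BatchNorm inserted between Dense layers.
m = Chain(
  Dense(28^2, 64, relu),
  Dropout(0.5),       # randomly zeroes activations during training
  BatchNorm(64),      # normalises activations over the batch
  Dense(64, 10))

# Put the model in test mode: Dropout passes inputs through
# unchanged and BatchNorm uses its accumulated statistics.
Flux.testmode!(m)
y = m(rand(Float32, 28^2))
```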
## Activation Functions
Non-linearities that go between layers of your model. Most of these functions are defined in [NNlib](https://github.com/FluxML/NNlib.jl) but are available by default in Flux.