This commit is contained in:
Lyndon White 2019-02-28 11:35:28 +00:00
parent 838047f708
commit c76b9c7e2c
1 changed file with 14 additions and 2 deletions

@@ -30,13 +30,25 @@ GRU
Flux.Recur
```
## Esoteric Layers
These are marginally more obscure layers.
## Other General Purpose Layers
These are marginally more obscure than the Basic Layers.
But in contrast to the layers described in the other sections, they are not readily grouped around a particular purpose (e.g. CNNs or RNNs).
```@docs
MaxOut
```
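For intuition, a maxout unit computes the elementwise maximum over several affine transforms of its input. Below is a minimal, framework-free sketch of that computation in plain Python; the names, shapes, and `(W, b)` representation are illustrative assumptions, not Flux's API.

```python
def affine(W, b, x):
    """One affine map W*x + b (W: list of rows, b: bias list, x: input list)."""
    return [sum(w_ij * x_j for w_ij, x_j in zip(row, x)) + b_i
            for row, b_i in zip(W, b)]

def maxout(pieces, x):
    """pieces: list of (W, b) pairs; output[i] = max over pieces of (W*x + b)[i]."""
    outputs = [affine(W, b, x) for W, b in pieces]
    return [max(vals) for vals in zip(*outputs)]

# Two linear "pieces" over a 2-d input, producing a 1-d output:
pieces = [([[1.0, 0.0]], [0.0]),    # picks out x[0]
          ([[-1.0, 0.0]], [0.0])]   # picks out -x[0]
print(maxout(pieces, [3.0, 5.0]))   # [3.0]  (max of 3 and -3)
print(maxout(pieces, [-2.0, 1.0]))  # [2.0]  (max of -2 and 2)
```

With these two pieces the unit computes `|x[0]|`, which a single linear layer cannot; adding pieces lets a maxout unit approximate any convex activation.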
# Normalisation & Regularisation
These layers don't affect the structure of the network but may improve training times or reduce overfitting.
```@docs
Flux.testmode!
BatchNorm
Dropout
LayerNorm
```
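As one concrete illustration of a regularisation layer that leaves the network's structure untouched, here is a hedged sketch of inverted dropout in plain Python. This is an assumption-laden illustration of the general technique, not Flux's `Dropout` implementation.

```python
import random

def dropout(xs, p, training=True, rng=random):
    """Inverted dropout sketch: during training, zero each element with
    probability p and rescale survivors by 1/(1 - p) so the expected
    activation is unchanged; at test time, pass the input through."""
    if not training or p == 0.0:
        return list(xs)
    keep = 1.0 - p
    return [x / keep if rng.random() >= p else 0.0 for x in xs]

xs = [1.0, 2.0, 3.0, 4.0]
print(dropout(xs, 0.5))                  # each entry is 0.0 or 2x its input
print(dropout(xs, 0.5, training=False))  # test mode: unchanged
```

The rescaling by `1/(1 - p)` is why no extra correction is needed at test time, which is the behaviour `Flux.testmode!` toggles.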
## Activation Functions
Non-linearities that go between layers of your model. Most of these functions are defined in [NNlib](https://github.com/FluxML/NNlib.jl) but are available by default in Flux.
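For intuition, a few of the most common non-linearities can be written in a couple of lines each. The plain-Python definitions below are illustrative only; in Flux these come from NNlib.

```python
import math

def sigmoid(x):
    """sigmoid(x) = 1 / (1 + e^(-x)); squashes any real input into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def relu(x):
    """Rectified linear unit: max(0, x)."""
    return max(0.0, x)

def leakyrelu(x, a=0.01):
    """Like relu, but with a small slope a for negative inputs."""
    return x if x > 0 else a * x

print(sigmoid(0.0))     # 0.5
print(relu(-3.0))       # 0.0
print(leakyrelu(-3.0))  # -0.03
```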