Merge pull request #772 from johnnychen94/patch-1

delete redundant section
Mike J Innes 2019-05-13 17:33:01 +01:00 committed by GitHub
commit 5931b93e09

@@ -38,17 +38,6 @@ But in contrast to the layers described in the other sections are not readily gr
Maxout
```
## Normalisation & Regularisation
These layers don't affect the structure of the network but may improve training times or reduce overfitting.
```@docs
Flux.testmode!
BatchNorm
Dropout
LayerNorm
```
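For context, a minimal sketch of how these layers are typically dropped into a `Chain`, assuming the Flux API of roughly this release (where `Dense(in, out, σ)` and `Flux.testmode!` are in use; later versions handle train/test mode differently):
```julia
using Flux

# BatchNorm normalises activations per feature and Dropout randomly zeroes
# activations during training; neither changes the shape of the data
# flowing through the network.
m = Chain(
  Dense(28^2, 64, relu),
  BatchNorm(64),
  Dropout(0.5),
  Dense(64, 10),
  softmax)

# Both layers behave differently at inference time: testmode! switches
# them to test behaviour (running statistics, no dropout) and back.
Flux.testmode!(m)
y = m(rand(28^2))
Flux.testmode!(m, false)
```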
## Activation Functions
Non-linearities that go between layers of your model. Most of these functions are defined in [NNlib](https://github.com/FluxML/NNlib.jl) but are available by default in Flux.
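As a brief sketch (assuming the `relu` and `σ` activations that Flux re-exports from NNlib), an activation can be passed to a layer constructor or broadcast over an array, since the functions themselves operate on scalars:
```julia
using Flux  # re-exports NNlib activations such as relu, σ (sigmoid) and tanh

# Passed to a layer constructor, the activation is applied element-wise
# to the layer's output.
m = Chain(Dense(10, 5, relu), Dense(5, 2), softmax)

# Applied directly, broadcast over the array.
x = rand(10)
relu.(x)
σ.(x)
```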