Merge pull request #772 from johnnychen94/patch-1
delete redundant section
Commit 5931b93e09
@@ -38,17 +38,6 @@ But in contrast to the layers described in the other sections are not readily gr
 Maxout
 ```
 
-# Normalisation & Regularisation
-
-These layers don't affect the structure of the network but may improve training times or reduce overfitting.
-
-```@docs
-Flux.testmode!
-BatchNorm
-Dropout
-LayerNorm
-```
-
 ## Activation Functions
 
 Non-linearities that go between layers of your model. Most of these functions are defined in [NNlib](https://github.com/FluxML/NNlib.jl) but are available by default in Flux.
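
For context (not part of the commit itself): the deleted section documented Flux's normalisation and regularisation layers, which leave a network's shape unchanged but can speed up training or reduce overfitting. A minimal sketch of how those layers are used, assuming a Flux version of the same era as this commit and the names listed in the deleted `@docs` block:

```julia
using Flux

# Dropout and BatchNorm behave differently during training and inference,
# which is why the deleted section also documented Flux.testmode!.
m = Chain(
    Dense(28^2, 64, relu),
    BatchNorm(64),   # normalise activations across the batch
    Dropout(0.5),    # randomly zero activations to reduce overfitting
    Dense(64, 10),
    softmax)

x = rand(Float32, 28^2, 16)  # a batch of 16 dummy inputs
y = m(x)                     # training-mode forward pass

Flux.testmode!(m)            # put Dropout/BatchNorm into evaluation mode
ŷ = m(x)
```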
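
The "Activation Functions" prose kept by this commit is also easy to illustrate; a short sketch, again only assuming the NNlib non-linearities that Flux re-exports (`relu`, `σ`, `tanh`, ...):

```julia
using Flux  # re-exports NNlib activations such as relu and σ

# An activation can be baked into a layer...
layer = Dense(10, 5, relu)

# ...or broadcast element-wise between layers (note the dot):
x = randn(Float32, 10)
h = σ.(layer(x))   # apply the sigmoid to each output
```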