## Basic Layers
These core layers form the foundation of almost all neural networks.
```@docs
Chain
Dense
Conv2D
```
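
For example, a simple feed-forward model can be defined by composing `Dense` layers with `Chain`. This is only an illustrative sketch; the sizes (10, 5, 2) and the `relu`/`softmax` activations are arbitrary choices.

```julia
using Flux

# A small multi-layer perceptron: `Chain` threads data through each layer
# in turn, and `Dense(in, out, σ)` is a fully connected layer.
m = Chain(
  Dense(10, 5, relu),
  Dense(5, 2),
  softmax)

x = rand(10)  # a single 10-element input
m(x)          # returns a 2-element output vector
```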
## Recurrent Layers
Much like the core layers above, but these can be used to process sequence data (as well as other kinds of structured data).

```@docs
RNN
LSTM
Flux.Recur
```
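
Recurrent layers carry state between calls, so a sequence is usually processed by calling the layer on each timestep in turn (or by broadcasting it over a vector of inputs). A minimal sketch, assuming a sequence of three 10-element inputs:

```julia
using Flux

m = LSTM(10, 5)               # a stateful LSTM layer, wrapped in Flux.Recur

xs = [rand(10) for _ in 1:3]  # a sequence of three inputs
ys = m.(xs)                   # one 5-element output per timestep

Flux.reset!(m)                # clear the hidden state before the next sequence
```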
## Activation Functions
Non-linearities that go between layers of your model. Most of these functions are defined in [NNlib](https://github.com/FluxML/NNlib.jl) but are available by default in Flux.
Note that, unless otherwise stated, activation functions operate on scalars. To apply them to an array you can call `σ.(xs)`, `relu.(xs)` and so on.
```@docs
σ
relu
leakyrelu
elu
swish
```
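
For instance, to apply an activation element-wise you broadcast it with Julia's dot syntax; layers such as `Dense` take an activation function as an argument and broadcast it for you. A small sketch:

```julia
using Flux

xs = randn(5)

σ.(xs)     # element-wise sigmoid
relu.(xs)  # element-wise rectified linear unit

# Layers broadcast the activation internally, so you pass the function itself:
Dense(10, 5, relu)
```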
## Normalisation & Regularisation
These layers don't affect the structure of the network but may improve training times or reduce overfitting.
```@docs
Flux.testmode!
BatchNorm
Dropout
LayerNorm
```
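
For example, `Dropout` and `BatchNorm` behave differently during training and testing, and `Flux.testmode!` switches a model between the two. A minimal sketch (the layer sizes and dropout probability are arbitrary):

```julia
using Flux

m = Chain(
  Dense(28^2, 64, relu),
  BatchNorm(64),   # normalise activations across the batch
  Dropout(0.5),    # randomly zero half the activations while training
  Dense(64, 10),
  softmax)

Flux.testmode!(m)         # put the model in test mode (dropout disabled)
Flux.testmode!(m, false)  # back to training mode
```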