## Basic Layers

These core layers form the foundation of almost all neural networks.

```@docs
Chain
Dense
Conv
```
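
For example, a small multi-layer perceptron can be built by composing `Dense` layers with `Chain` (a minimal sketch; the layer sizes here are arbitrary):

```julia
using Flux

# A two-layer perceptron: 10 inputs, 5 hidden units, 2 output classes.
model = Chain(
  Dense(10, 5, relu),
  Dense(5, 2),
  softmax)

x = rand(10)  # a dummy input vector
y = model(x)  # Chain applies each layer in turn, left to right
```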

## Recurrent Layers

Much like the core layers above, but these can be used to process sequence data (as well as other kinds of structured data).

```@docs
RNN
LSTM
GRU
Flux.Recur
```
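
As an illustration, a recurrent layer such as `LSTM` keeps hidden state between calls, so a sequence can be fed one timestep at a time (a sketch with arbitrary dimensions):

```julia
using Flux

m = LSTM(10, 5)  # 10 input features, 5-dimensional hidden state

seq = [rand(10) for _ = 1:3]  # a dummy sequence of three timesteps
ys  = map(m, seq)             # hidden state carries over between calls

Flux.reset!(m)  # clear the hidden state before the next sequence
```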

## Activation Functions

Non-linearities that go between layers of your model. Most of these functions are defined in NNlib but are available by default in Flux.

Note that, unless otherwise stated, activation functions operate on scalars. To apply them to an array you can call `σ.(xs)`, `relu.(xs)` and so on.

```@docs
σ
relu
leakyrelu
elu
swish
```
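
For instance, broadcasting an activation over an array applies it elementwise, and an activation can also be passed directly to a layer (a minimal sketch):

```julia
using Flux

xs = randn(5)
relu.(xs)  # elementwise relu via Julia's broadcast syntax
σ.(xs)     # likewise for the sigmoid

Dense(10, 5, leakyrelu)  # layers take an activation as their final argument
```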

## Normalisation & Regularisation

These layers don't affect the structure of the network but may improve training times or reduce overfitting.

```@docs
Flux.testmode!
BatchNorm
Dropout
LayerNorm
```
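
For example, `Dropout` is only active during training; `Flux.testmode!` toggles a model between training and test behaviour (a sketch using the API listed above, with arbitrary layer sizes):

```julia
using Flux

m = Chain(Dense(10, 5, relu), Dropout(0.5), Dense(5, 2))

Flux.testmode!(m)         # disable dropout for evaluation
y = m(rand(10))
Flux.testmode!(m, false)  # re-enable dropout for training
```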