
## Basic Layers

These core layers form the foundation of almost all neural networks.

```@docs
Chain
Dense
```
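
For example, `Chain` composes `Dense` layers into a multi-layer perceptron. A minimal sketch, with arbitrary layer sizes:

```julia
using Flux

model = Chain(
  Dense(10, 5, relu),   # 10 inputs -> 5 outputs, relu activation
  Dense(5, 2),          # 5 inputs -> 2 outputs, no activation
  softmax)              # normalise the outputs to probabilities

x = rand(Float32, 10)   # a dummy input vector
model(x)                # 2-element probability vector
```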

## Convolution and Pooling Layers

These layers are used to build convolutional neural networks (CNNs).

```@docs
Conv
MaxPool
MeanPool
DepthwiseConv
ConvTranspose
CrossCor
```
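
Flux's convolutions expect input arrays in WHCN order (width, height, channels, batch). A minimal sketch of a small CNN, with arbitrary sizes:

```julia
using Flux

# A tiny CNN for 28x28 single-channel images (all sizes are illustrative).
model = Chain(
  Conv((3, 3), 1 => 16, relu),        # 28x28x1xN -> 26x26x16xN
  MaxPool((2, 2)),                    # 26x26x16xN -> 13x13x16xN
  x -> reshape(x, :, size(x, 4)),     # flatten to 2704xN
  Dense(13 * 13 * 16, 10),
  softmax)

x = rand(Float32, 28, 28, 1, 8)       # a dummy batch of 8 images
size(model(x))                        # (10, 8)
```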

## Recurrent Layers

Much like the basic layers above, but designed to process sequence data (as well as other kinds of structured data).

```@docs
RNN
LSTM
GRU
Flux.Recur
```
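
These constructors return a stateful `Flux.Recur` wrapper that is applied one timestep at a time. A minimal sketch (the feature and hidden sizes are arbitrary):

```julia
using Flux

m = LSTM(10, 5)                          # 10 input features -> 5 hidden units

xs = [rand(Float32, 10) for _ in 1:4]    # a dummy sequence of 4 timesteps

Flux.reset!(m)                           # clear hidden state before a new sequence
ys = [m(x) for x in xs]                  # state carries over between timesteps
```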

## Other General Purpose Layers

These layers are marginally more obscure than the basic layers above. In contrast to the layers described in the other sections, they are not readily grouped around a particular purpose (e.g. CNNs or RNNs).

```@docs
Maxout
SkipConnection
```
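
For instance, `SkipConnection(layer, connection)` computes `connection(layer(x), x)`; with `+` as the connection this gives a residual block. A minimal sketch, with arbitrary sizes (which must match for `+` to apply):

```julia
using Flux

# A residual block: the output is Dense(...)(x) + x, so input and output
# sizes of the wrapped layer must agree.
block = SkipConnection(Dense(10, 10, relu), +)

x = rand(Float32, 10)
block(x)
```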

## Normalisation & Regularisation

These layers don't affect the structure of the network but may improve training times or reduce overfitting.

```@docs
BatchNorm
Dropout
Flux.dropout
AlphaDropout
LayerNorm
GroupNorm
```
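
These layers slot into a `Chain` like any other. A minimal sketch (sizes are arbitrary) interleaving `BatchNorm` and `Dropout` with `Dense` layers:

```julia
using Flux

model = Chain(
  Dense(28^2, 64),
  BatchNorm(64, relu),    # normalise the 64 activations, then apply relu
  Dropout(0.5),           # randomly zero activations during training
  Dense(64, 10),
  softmax)

x = rand(Float32, 28^2, 16)   # a dummy batch of 16 inputs
model(x)
```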

## Testmode

Many normalisation layers behave differently under training and inference (testing). By default, Flux will automatically determine when a layer evaluation is part of training or inference. Still, depending on your use case, it may be helpful to manually specify when these layers should be treated as being trained or not. For this, Flux provides `testmode!`. When called on a model (e.g. a layer or chain of layers), this function will place the model into the mode specified; `trainmode!` is its counterpart for forcing training behaviour.

```@docs
testmode!
trainmode!
```
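
A minimal sketch of toggling a model between modes (the `:auto` value hands control back to Flux's automatic detection):

```julia
using Flux

m = Chain(Dense(10, 5), Dropout(0.5))

testmode!(m)            # force inference behaviour: Dropout becomes a no-op
m(rand(Float32, 10))

trainmode!(m)           # force training behaviour: Dropout is active again
testmode!(m, :auto)     # return to automatic train/test detection
```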

## Cost Functions

```@docs
Flux.mse
Flux.crossentropy
Flux.logitcrossentropy
Flux.binarycrossentropy
Flux.logitbinarycrossentropy
Flux.kldivergence
Flux.poisson
Flux.hinge
```
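
Each of these takes a prediction `ŷ` and a target `y` and returns a scalar loss. A minimal sketch, noting that `crossentropy` expects probabilities (e.g. the output of `softmax`) while `logitcrossentropy` works on raw logits:

```julia
using Flux

y = Flux.onehotbatch([1, 3, 2], 1:3)       # one-hot targets: 3 classes, 3 samples
logits = rand(Float32, 3, 3)               # dummy raw model outputs

Flux.logitcrossentropy(logits, y)          # cross-entropy on raw logits
Flux.crossentropy(softmax(logits), y)      # the same loss, via probabilities

Flux.mse(rand(Float32, 5), rand(Float32, 5))   # mean squared error
```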