## Basic Layers
These core layers form the foundation of almost all neural networks.
```@docs
Chain
Dense
```
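For example, a simple multi-layer perceptron can be written by chaining `Dense` layers (a minimal sketch; the layer sizes here are arbitrary):

```julia
using Flux

model = Chain(
  Dense(10, 32, relu),   # 10 inputs -> 32 hidden units, relu activation
  Dense(32, 2),          # 32 hidden units -> 2 outputs
  softmax)               # normalise the outputs to probabilities

x = rand(Float32, 10)    # a single input with 10 features
model(x)                 # returns a length-2 probability vector
```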
## Convolution and Pooling Layers
These layers are used to build convolutional neural networks (CNNs).
```@docs
Conv
MaxPool
MeanPool
DepthwiseConv
ConvTranspose
CrossCor
```
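For example, a small CNN for 28×28 single-channel images might look like the following (a minimal sketch; the filter counts and sizes are arbitrary, and Flux expects image data in width × height × channels × batch order):

```julia
using Flux

cnn = Chain(
  Conv((3, 3), 1=>16, relu),       # 3×3 filters, 1 channel -> 16 channels
  MaxPool((2, 2)),                 # halve the spatial dimensions
  Conv((3, 3), 16=>32, relu),
  MaxPool((2, 2)),
  x -> reshape(x, :, size(x, 4)),  # flatten to (features, batch)
  Dense(5*5*32, 10),
  softmax)

x = rand(Float32, 28, 28, 1, 1)    # width × height × channels × batch
cnn(x)
```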
## Recurrent Layers
Much like the core layers above, but they can be used to process sequence data (as well as other kinds of structured data).
```@docs
RNN
LSTM
GRU
Flux.Recur
```
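For example, an `LSTM` can be applied to a sequence one timestep at a time; the hidden state is carried between calls and can be cleared with `Flux.reset!` (a minimal sketch; the dimensions are arbitrary):

```julia
using Flux

m = Chain(LSTM(10, 32), Dense(32, 5))     # 10-dim inputs, 32 hidden units, 5 outputs

seq = [rand(Float32, 10) for _ in 1:20]   # a sequence of 20 timesteps
outputs = [m(x) for x in seq]             # the recurrent state is carried across calls
Flux.reset!(m)                            # clear the hidden state before the next sequence
```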
## Other General Purpose Layers
These layers are marginally more obscure than the basic layers above. In contrast to the layers described in the other sections, they are not readily grouped around a particular purpose (e.g. CNNs or RNNs).
```@docs
Maxout
SkipConnection
```
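For example, `SkipConnection` wraps a block of layers and combines the block's output with its input, here using `+` for a residual-style connection (a minimal sketch; the sizes are arbitrary):

```julia
using Flux

block = Chain(Dense(64, 64, relu), Dense(64, 64))
res = SkipConnection(block, +)    # computes block(x) + x

x = rand(Float32, 64)
res(x)
```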
## Activation Functions
Non-linearities that go between layers of your model. Most of these functions are defined in [NNlib](https://github.com/FluxML/NNlib.jl) but are available by default in Flux.

Note that, unless otherwise stated, activation functions operate on scalars. To apply them to an array you can call `σ.(xs)`, `relu.(xs)` and so on.
```@docs
σ
relu
leakyrelu
elu
swish
```
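For example, broadcasting applies an activation elementwise to an array, and passing it to a layer lets the layer broadcast it for you (a minimal sketch):

```julia
using Flux

xs = randn(Float32, 5)
relu.(xs)            # apply relu elementwise
σ.(xs)               # sigmoid elementwise

Dense(10, 5, relu)   # the layer broadcasts the activation internally
```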
## Normalisation & Regularisation
These layers don't affect the structure of the network but may improve training times or reduce overfitting.
```@docs
Flux.testmode!
BatchNorm
Dropout
AlphaDropout
LayerNorm
GroupNorm
```
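For example, `BatchNorm` and `Dropout` are typically slotted between the layers of a `Chain`, and `Flux.testmode!` switches them into inference behaviour (a minimal sketch; the sizes and dropout rate are arbitrary):

```julia
using Flux

m = Chain(
  Dense(28^2, 64),
  BatchNorm(64, relu),   # normalise the 64 activations, then apply relu
  Dropout(0.5),          # randomly zero half the activations during training
  Dense(64, 10),
  softmax)

Flux.testmode!(m)        # disable dropout and use running statistics for inference
```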