## Basic Layers
These core layers form the foundation of almost all neural networks.
```@docs
Chain
Dense
```
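As a quick illustration, a multilayer perceptron can be written as a `Chain` of `Dense` layers and called like an ordinary function. The sizes and activation below are arbitrary choices, not requirements:

```julia
using Flux

# A minimal sketch of a two-layer perceptron; the sizes (10 → 32 → 2) and the
# relu activation are arbitrary choices for illustration.
model = Chain(
  Dense(10, 32, relu),
  Dense(32, 2),
  softmax)

x = rand(Float32, 10)   # a dummy input vector with 10 features
model(x)                # a length-2 vector of class probabilities
```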
## Convolution and Pooling Layers
These layers are used to build convolutional neural networks (CNNs).
```@docs
Conv
MaxPool
MeanPool
DepthwiseConv
ConvTranspose
CrossCor
```
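As a rough sketch of how these layers fit together, a small CNN can alternate `Conv` and `MaxPool` layers and finish with a `Dense` classifier. The input size (28×28, single channel), channel counts and number of classes below are illustrative assumptions only:

```julia
using Flux

# A toy CNN for 28×28 single-channel images; all sizes are illustrative.
model = Chain(
  Conv((3, 3), 1 => 16, relu),       # 3×3 filters, 1 input channel => 16 feature maps
  MaxPool((2, 2)),                   # halve each spatial dimension
  Conv((3, 3), 16 => 32, relu),
  MaxPool((2, 2)),
  x -> reshape(x, :, size(x, 4)),    # flatten to (features, batch)
  Dense(5 * 5 * 32, 10))             # 5×5×32 is the spatial size left after the layers above

x = rand(Float32, 28, 28, 1, 8)      # WHCN layout: width, height, channels, batch
size(model(x))                       # (10, 8)
```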
## Recurrent Layers
Much like the core layers above, but these can be used to process sequence data (as well as other kinds of structured data).
```@docs
RNN
LSTM
GRU
Flux.Recur
```
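These layers are stateful: they are applied to one timestep at a time and carry their hidden state forward between calls, with `Flux.reset!` clearing that state. A minimal sketch, with arbitrary feature and hidden sizes:

```julia
using Flux

m = LSTM(10, 5)                           # 10 input features, 5 hidden units (arbitrary sizes)

seq = [rand(Float32, 10) for _ in 1:6]    # a toy sequence of 6 timesteps
outputs = [m(x) for x in seq]             # the hidden state is carried between calls

Flux.reset!(m)                            # reset the state before the next sequence
```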
## Other General Purpose Layers
These layers are marginally more obscure than the Basic Layers. But in contrast to the layers described in the other sections, they are not readily grouped around a particular purpose (e.g. CNNs or RNNs).
```@docs
Maxout
SkipConnection
```
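For instance (the layer sizes below are arbitrary), `SkipConnection` combines a layer's output with its input via a user-supplied connection, and `Maxout` takes an elementwise maximum over several copies of a layer:

```julia
using Flux

# SkipConnection applies its wrapped layer, then combines the result with the
# original input using the given connection -- here elementwise addition,
# i.e. a simple residual block. Sizes are illustrative.
residual = SkipConnection(Dense(10, 10, relu), +)

# Maxout builds several alternatives (here 4 Dense layers) and returns the
# elementwise maximum of their outputs.
mo = Maxout(() -> Dense(10, 5), 4)

x = rand(Float32, 10)
residual(x)   # length-10 vector
mo(x)         # length-5 vector
```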
## Activation Functions
Non-linearities that go between layers of your model. Most of these functions are defined in [NNlib](https://github.com/FluxML/NNlib.jl) but are available by default in Flux.
Note that, unless otherwise stated, activation functions operate on scalars. To apply them to an array, broadcast them: `σ.(xs)`, `relu.(xs)`, and so on.
```@docs
σ
relu
leakyrelu
elu
swish
```
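For example, broadcasting applies an activation elementwise over an array (the values here are random and only for illustration):

```julia
using Flux

xs = randn(Float32, 5)     # some arbitrary values

σ.(xs)                     # elementwise sigmoid
relu.(xs)                  # elementwise rectifier
leakyrelu.(xs, 0.2f0)      # leakyrelu also accepts a slope for negative inputs
```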
## Normalisation & Regularisation
These layers don't affect the structure of the network but may improve training times or reduce overfitting.
```@docs
BatchNorm
Dropout
AlphaDropout
LayerNorm
GroupNorm
```
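These layers are used inside a `Chain` like any others. A minimal sketch (the sizes and dropout probability are arbitrary):

```julia
using Flux

# BatchNorm is given the number of features it normalises over; Dropout is
# given the probability of dropping an activation. All sizes are illustrative.
model = Chain(
  Dense(10, 64),
  BatchNorm(64, relu),
  Dropout(0.5),
  Dense(64, 2))

x = rand(Float32, 10, 16)   # a batch of 16 examples with 10 features each
model(x)
```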
### Testmode
Many normalisation layers behave differently under training and inference (testing). By default, Flux will automatically determine when a layer evaluation is part of training or inference. Still, depending on your use case, it may be helpful to manually specify when these layers should be treated as being trained or not. For this, Flux provides `testmode!`. When called on a model (e.g. a layer or chain of layers), this function will place the model into the mode specified.
```@docs
testmode!
```
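A brief sketch of the calling convention, assuming the three-valued `mode` described in the docstring above:

```julia
using Flux

m = Chain(Dense(10, 5), Dropout(0.5))

testmode!(m)           # force inference mode: Dropout becomes a no-op
testmode!(m, false)    # force training mode
testmode!(m, :auto)    # return to the default automatic behaviour
```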
## Cost Functions
```@docs
mse
crossentropy
logitcrossentropy
binarycrossentropy
logitbinarycrossentropy
kldivergence
poisson
hinge
```
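Each of these takes the model's predictions `ŷ` and the targets `y`. A small sketch, assuming the functions are exported from Flux as documented above and using toy values:

```julia
using Flux

ŷ = [0.1, 0.7, 0.2]    # predicted class probabilities (toy values)
y = [0.0, 1.0, 0.0]    # one-hot target

mse(ŷ, y)              # mean squared error
crossentropy(ŷ, y)     # cross entropy for probability outputs
```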