do things to docs
This commit is contained in:
parent fcc3ec471a
commit c1a33c556f
@@ -5,14 +5,16 @@ These core layers form the foundation of almost all neural networks.
```@docs
Chain
Dense
```
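
A minimal sketch of how these two layers compose, assuming a recent Flux release and its `=>` constructor syntax (older versions wrote `Dense(10, 5, relu)` instead):

```julia
using Flux

# A two-layer perceptron: 10 inputs -> 5 hidden units (relu) -> 2 outputs.
model = Chain(Dense(10 => 5, relu), Dense(5 => 2))

x = rand(Float32, 10)   # a single input vector
y = model(x)            # Chain calls each layer in sequence
size(y)                 # (2,)
```

`Chain` simply threads its input through each layer in order, so `model(x)` is equivalent to `model[2](model[1](x))`.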
## Convolution and Pooling Layers
These layers are used to build convolutional neural networks (CNNs).
```@docs
Conv
MaxPool
MeanPool
```
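
A hedged sketch of a small CNN front end, assuming Flux's WHCN (width × height × channels × batch) array layout:

```julia
using Flux

x = rand(Float32, 28, 28, 1, 1)      # one 28×28 single-channel image

m = Chain(
    Conv((3, 3), 1 => 16, relu),     # 28×28×1 -> 26×26×16 (3×3 kernel, no padding)
    MaxPool((2, 2)),                 # 26×26×16 -> 13×13×16 (halves each spatial dim)
)

size(m(x))                           # (13, 13, 16, 1)
```

`MeanPool` takes the same window argument as `MaxPool` but averages over each window instead of taking the maximum.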
## Additional Convolution Layers
```@docs
DepthwiseConv
ConvTranspose
```
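
A brief sketch of both layers, again assuming WHCN layout; the spatial arithmetic shown in the comments follows the usual convolution size formulas:

```julia
using Flux

x = rand(Float32, 13, 13, 16, 1)

# ConvTranspose upsamples: with stride 2 and a 2×2 kernel,
# each spatial dim grows as (13 - 1) * 2 + 2 = 26.
up = ConvTranspose((2, 2), 16 => 8; stride = 2)
size(up(x))    # (26, 26, 8, 1)

# DepthwiseConv convolves each input channel independently;
# the output channel count must be a multiple of the input count.
dw = DepthwiseConv((3, 3), 16 => 16)
size(dw(x))    # (11, 11, 16, 1)
```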
@@ -28,6 +30,13 @@ GRU
Flux.Recur
```
## Hipster Layers
These are marginally more obscure layers that you probably haven't heard of.
```@docs
MaxOut
```
## Activation Functions
Non-linearities that go between layers of your model. Most of these functions are defined in [NNlib](https://github.com/FluxML/NNlib.jl) but are available by default in Flux.
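
A small sketch of both ways to use them. Activation functions are plain scalar functions, so applying one to an array means broadcasting with `.`; more commonly they are passed to a layer constructor, which applies them elementwise to its output:

```julia
using Flux

x = Float32[-2, -0.5, 0, 1, 3]

relu.(x)       # broadcast elementwise: negative entries clamped to zero
sigmoid.(x)    # each value squashed into (0, 1)

# The usual pattern: hand the activation to the layer instead.
layer = Dense(5 => 3, tanh)
```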