
# NNlib

Flux re-exports all of the functions exported by the NNlib package.

## Activation Functions

Non-linearities that go between layers of your model. Note that, unless otherwise stated, activation functions operate on scalars. To apply them to an array, broadcast the function: `σ.(xs)`, `relu.(xs)`, and so on.

- `NNlib.celu`
- `NNlib.elu`
- `NNlib.gelu`
- `NNlib.hardsigmoid`
- `NNlib.hardtanh`
- `NNlib.leakyrelu`
- `NNlib.lisht`
- `NNlib.logcosh`
- `NNlib.logsigmoid`
- `NNlib.mish`
- `NNlib.relu`
- `NNlib.relu6`
- `NNlib.rrelu`
- `NNlib.selu`
- `NNlib.sigmoid`
- `NNlib.softplus`
- `NNlib.softshrink`
- `NNlib.softsign`
- `NNlib.swish`
- `NNlib.tanhshrink`
- `NNlib.trelu`
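For instance, a minimal sketch of broadcasting activations over an array (using NNlib directly; within Flux the same names are available via the re-export):

```julia
using NNlib

xs = [-2.0, -0.5, 0.0, 1.5]

relu.(xs)              # broadcast over each element: [0.0, 0.0, 0.0, 1.5]
σ.(xs)                 # elementwise sigmoid
leakyrelu.(xs, 0.02)   # some activations take extra parameters, here the negative slope
```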

## Softmax

- `NNlib.softmax`
- `NNlib.logsoftmax`
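`softmax` turns a vector of raw scores into a probability distribution; on a matrix it acts column-wise, so each column (one sample in a batch) sums to 1. `logsoftmax` computes `log.(softmax(x))` in a more numerically stable way. A small sketch:

```julia
using NNlib

x = [1.0 0.5;
     2.0 0.0;
     3.0 1.0]         # 3 class scores for a batch of 2 samples (one per column)

p = softmax(x)        # column-wise: every column of p sums to 1
sum(p; dims = 1)      # ≈ [1.0 1.0]

logsoftmax(x)         # preferable to log.(softmax(x)) for large scores
```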

## Pooling

- `NNlib.maxpool`
- `NNlib.meanpool`
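Both pooling functions expect arrays in WHCN order (width × height × channels × batch) and a tuple giving the window size; the stride defaults to the window size, so windows do not overlap. A sketch:

```julia
using NNlib

x = reshape(Float32.(1:16), 4, 4, 1, 1)  # a 4×4 single-channel "image", batch of 1

maxpool(x, (2, 2))    # non-overlapping 2×2 windows → size (2, 2, 1, 1)
meanpool(x, (2, 2))   # same shape, averaging instead of taking the max
```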

## Convolution

- `NNlib.conv`
- `NNlib.depthwiseconv`
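`conv` also works on WHCN arrays, with the kernel laid out as (width, height, input channels, output channels). There is no padding by default, so the spatial size shrinks by one less than the kernel size. A sketch:

```julia
using NNlib

x = rand(Float32, 8, 8, 3, 1)  # an 8×8 image with 3 channels, batch of 1 (WHCN)
w = rand(Float32, 3, 3, 3, 4)  # 3×3 kernel: 3 input channels → 4 output channels

y = conv(x, w)                 # no padding by default, so size(y) == (6, 6, 4, 1)
conv(x, w; pad = 1)            # zero padding preserves the spatial size: (8, 8, 4, 1)
```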

## Batched Operations

- `NNlib.batched_mul`
- `NNlib.batched_mul!`
- `NNlib.batched_adjoint`
- `NNlib.batched_transpose`
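`batched_mul` multiplies 3-dimensional arrays slice-by-slice along the last (batch) dimension, while `batched_adjoint` and `batched_transpose` wrap an array so each slice is adjointed or transposed within the product. A sketch:

```julia
using NNlib

A = rand(2, 3, 4)   # four independent 2×3 matrices
B = rand(3, 5, 4)   # four independent 3×5 matrices

C = batched_mul(A, B)   # size (2, 5, 4): C[:, :, k] == A[:, :, k] * B[:, :, k]

Bt = rand(5, 3, 4)
batched_mul(A, batched_transpose(Bt))   # transposes each slice of Bt before multiplying
```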