model.jl implements the core Flux API for forward and backward passes.
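
To make that concrete, here is a minimal sketch of what a forward/backward interface looks like for a single affine model. The names (`back`, the gradient layout) are illustrative and not model.jl's exact API.

```julia
# Illustrative only: a toy affine model with a forward pass and a hand-written
# backward pass. model.jl's real interface differs in names and details.
struct Affine
  W::Matrix{Float64}
  b::Vector{Float64}
end

(m::Affine)(x) = m.W * x .+ m.b   # forward pass

# Backward pass: given Δ, the gradient of the loss w.r.t. the output,
# return gradients w.r.t. the parameters and the input.
back(m::Affine, Δ, x) = (W = Δ * x', b = Δ, x = m.W' * Δ)

m = Affine(randn(2, 3), zeros(2))
x = rand(3)
y = m(x)                     # forward
grads = back(m, ones(2), x)  # backward
```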

The compiler folder implements the @net macro and some dataflow manipulation (like loop unrolling). The mxnet and tensorflow folders in backend each describe how to run @net code on the corresponding backend. These are the most involved parts of Flux, where the magic happens; everything else is pretty straightforward Julia code.
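
As a toy illustration of one such dataflow manipulation, loop unrolling turns a recurrence over a sequence into a fixed chain of calls that a static-graph backend can consume. The sketch below is independent of the real @net machinery; `step` and the function names are made up for the example.

```julia
# Stand-in for one RNN step.
step(h, x) = tanh.(h .+ x)

# "Rolled" form: a loop over the sequence.
function run_rolled(h, xs)
  for x in xs
    h = step(h, x)
  end
  h
end

# "Unrolled" form for a length-3 sequence: the loop is expanded into
# explicit dataflow, which is what a static graph needs.
run_unrolled3(h, x1, x2, x3) = step(step(step(h, x1), x2), x3)

h0 = zeros(2)
xs = [rand(2) for _ in 1:3]
run_rolled(h0, xs) ≈ run_unrolled3(h0, xs...)   # true
```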

layers is Flux's "standard library" of model parts: layers, activations, and cost functions. Most are implemented using @net. control.jl implements Chain and others in pure Julia. shims.jl includes some things, like convolutions, which don't have "real" implementations yet but can still compile to a backend.
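
For a sense of what the pure-Julia side looks like, here is a minimal Chain-style container (an illustrative sketch, not control.jl's actual code): calling it pipes the input through each layer in order.

```julia
# Minimal Chain-style container: layers are stored in order and the call
# operator folds the input through them.
struct Chain
  layers::Vector{Any}
end
Chain(ls...) = Chain(collect(Any, ls))

(c::Chain)(x) = foldl((acc, layer) -> layer(acc), c.layers; init = x)

double(x) = 2x
inc(x) = x + 1
m = Chain(double, inc)
m(3)   # 7
```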

dims implements the mechanisms for typing batch (and other) dimensions. This is basically standalone.
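
The idea behind typed dimensions, sketched as a toy (the actual types and names in dims may differ): wrapping a collection of samples in a Batch type lets code dispatch on "this is a batch" instead of guessing from raw array shape.

```julia
# Toy sketch only; the real types in dims may differ.
struct Batch{T}
  data::Vector{T}
end

Base.length(b::Batch) = length(b.data)
Base.getindex(b::Batch, i::Integer) = b.data[i]

# A Batch of four 3-element samples is distinct from a 3×4 matrix,
# so downstream code knows which dimension is the batch dimension.
xs = Batch([rand(3) for _ in 1:4])
length(xs)   # 4
xs[1]        # first sample, a 3-element vector
```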

data.jl and utils.jl implement misc utilities.