
`model.jl` implements the core Flux API for forward and backward passes.
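Roughly, the interface from this era looked like the sketch below; `Affine`, `back!`, and `update!` are illustrative names based on the docs of the time, not guaranteed signatures:

```julia
# Illustrative sketch of the 2017-era model interface (names assumed):
m = Affine(10, 20)     # a model is a callable object
y = m(x)               # forward pass
Δx = back!(m, Δy, x)   # backward: accumulate parameter gradients, return input gradient
update!(m, 0.1)        # apply accumulated gradients with learning rate 0.1
```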

The `compiler` folder implements the `@net` macro and some dataflow manipulation (like loop unrolling). The `mxnet` and `tensorflow` folders in `backend` each describe how to run `@net` code on those backends. These are the most involved parts of Flux, where the magic-y stuff happens; everything else is pretty straightforward Julia code.
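As a reminder of what `@net` code looked like, here is an `Affine` layer in the style of the era's docs (the field names and the convenience constructor are illustrative, not copied from the source):

```julia
# A learnable affine transform defined with @net.
# W and b become parameters; the anonymous function is the forward pass.
@net type Affine
  W
  b
  x -> x*W + b
end

# Hypothetical convenience constructor for random initialisation:
Affine(in::Integer, out::Integer) = Affine(randn(in, out), randn(1, out))
```

Because the body is captured by the macro rather than run directly, the compiler can unroll it, differentiate it, or hand it to a backend.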

`layers` is Flux's "standard library" of model parts: layers, activations, and cost functions. Most are implemented using `@net`. `control.jl` implements `Chain` and others in pure Julia. `shims.jl` includes some things, like convolutions, which don't have "real" implementations yet but can compile to a backend regardless.
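`Chain` needs no compiler support because, in spirit, it's just left-to-right function composition over a list of layers. A sketch in modern Julia syntax (not the actual `control.jl` code):

```julia
# Minimal Chain sketch: holds a list of callables and threads
# the input through each of them in turn.
struct Chain
  layers::Vector{Any}
  Chain(ls...) = new(collect(ls))
end

(c::Chain)(x) = foldl((acc, f) -> f(acc), c.layers; init = x)

# Usage: Chain(Affine(10, 20), σ, Affine(20, 5))
```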

`dims` implements the mechanisms for typing batch (and other) dimensions. This is basically standalone.
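The core idea is to record batch structure in the type, so code can dispatch on whether data is batched. A minimal hypothetical version (the real `dims` code is more elaborate):

```julia
# Hypothetical sketch: a Batch wraps an array whose first dimension
# indexes samples, making "this data is batched" visible to dispatch.
struct Batch{A<:AbstractArray}
  data::A
end

Base.length(b::Batch) = size(b.data, 1)
Base.getindex(b::Batch, i::Integer) = b.data[i, :]
```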

`data.jl` and `utils.jl` implement miscellaneous utilities.