model.jl implements the core Flux API for forward and backward passes.
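To make that concrete, here is a minimal sketch of what such an interface can look like, with hypothetical definitions: a model is something you can call for the forward pass and push gradients through for the backward pass. The back! name follows Flux's convention, but the exact signature here is an assumption.

```julia
# Illustrative sketch only; not Flux's actual definitions.
struct Affine
  W::Matrix{Float64}
  b::Vector{Float64}
end

# Forward pass: a model is callable on its input.
(m::Affine)(x::AbstractVector) = m.W * x .+ m.b

# Backward pass: given Δ = dL/dy for y = W*x .+ b, return dL/dx.
# A real implementation would also accumulate dL/dW and dL/db in
# place (hence the ! convention); this sketch only propagates Δ.
back!(m::Affine, Δ, x) = m.W' * Δ

m = Affine(rand(3, 5), rand(3))
y = m(rand(5))                   # forward
Δx = back!(m, ones(3), rand(5))  # backward
```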

The compiler folder implements the @net macro and some dataflow manipulation (like loop unrolling). The mxnet and tensorflow folders in backend each describe how to run @net code on that backend. These are the most involved parts of Flux, where the magicky stuff happens; everything else is pretty straightforward Julia code.
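For reference, an @net definition from this era of Flux looks roughly like the following (treat the exact syntax as approximate):

```julia
using Flux

# @net records the forward pass as a dataflow graph rather than just
# plain Julia code, which is what lets the backend folders compile and
# run the same definition on mxnet or tensorflow.
@net type MyAffine
  W
  b
  x -> x * W .+ b
end
```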

layers is Flux's "standard library" of model parts: layers, activations, and cost functions. Most are implemented using @net. control.jl implements Chain and others in pure Julia. shims.jl includes things like convolutions that don't have "real" implementations yet, but can still compile to a backend.
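Putting those pieces together looks something like this; Affine, relu, and softmax are used as illustrative layer and activation names from this era's API:

```julia
using Flux

# Chain pipes its input through each layer in turn, so this model is
# softmax(Affine(...)(relu(Affine(...)(x)))) for an input x.
model = Chain(
  Affine(784, 128), relu,
  Affine(128, 10), softmax)
```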

dims implements the mechanisms for typing batch (and other) dimensions. This is basically standalone.
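To illustrate the idea (a standalone sketch of the concept, not Flux's actual code): a typed batch is a wrapper that records which dimension of an array indexes samples, so the rest of the code can dispatch on that instead of guessing from raw shapes.

```julia
# Standalone sketch; not Flux's implementation. A Batch declares that
# the first dimension of its data indexes independent samples.
struct Batch{A<:AbstractMatrix}
  data::A   # size (nsamples, nfeatures)
end

nsamples(b::Batch) = size(b.data, 1)
sample(b::Batch, i) = vec(b.data[i, :])

b = Batch(rand(32, 10))   # a batch of 32 samples, 10 features each
nsamples(b)               # 32
```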

data.jl and utils.jl implement misc utilities.