core.jl implements the core Flux API for forward and backward passes.
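
As a rough sketch of the kind of interface this means (the names below are illustrative, not necessarily the exact ones core.jl uses), a model supports a forward pass, a backward pass that accumulates gradients, and a parameter update:

```julia
# Illustrative sketch only: the actual names and signatures in core.jl may differ.
abstract type Model end

# Forward pass: run the model on an input.
forward(m::Model, x) = error("unimplemented")

# Backward pass: propagate the output gradient Δ back through the model,
# accumulating parameter gradients and returning the input gradient.
back!(m::Model, Δ, x) = error("unimplemented")

# Apply the accumulated gradients, e.g. with learning rate η.
update!(m::Model, η) = m
```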

The compiler folder implements the @net macro and some dataflow manipulation (like loop unrolling). The mxnet and tensorflow folders in backend each describe how to run @net code on those backends. These are the most involved parts of Flux, where the magic happens; everything else is pretty straightforward Julia code.
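
For a flavour of what this looks like in use (taken roughly from the docs of this era, so details may have shifted), @net turns ordinary-looking Julia into a dataflow graph that the compiler can manipulate and the backends can consume:

```julia
using Flux

# A simple @net function: traced into a graph rather than run directly.
@net f(x) = x .* x

# A model type: the fields are parameters, and the trailing
# anonymous function is the forward pass.
@net type Affine
  W
  b
  x -> x * W .+ b
end
```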

layers is Flux's "standard library" of model parts: layers, activations, cost functions. Most are implemented using @net. control.jl implements Chain and others in pure Julia. shims.jl includes things like convolutions that don't have "real" implementations yet, but can compile to a backend regardless.
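
A hedged usage sketch (the layer names, activations and sizes here are illustrative): Chain simply composes layers, calling each in order, so a small model reads as:

```julia
using Flux

# Chain runs each piece in sequence; since control.jl is pure Julia,
# the pieces can be @net models or plain functions alike.
model = Chain(
  Affine(784, 128), relu,
  Affine(128, 10), softmax)

y = model(rand(784))  # forward pass through the whole chain
```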

Batches implements the mechanisms for typing batch (and other) dimensions. This is basically standalone.
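
The core idea can be sketched in a few lines of plain Julia (the field and type details below are simplified assumptions, not the actual implementation):

```julia
# Simplified sketch: a Batch wraps a collection so that "this is a batch
# of samples" is visible in the type, rather than implicit in an array
# dimension that could mean anything.
struct Batch{T} <: AbstractVector{T}
  data::Vector{T}  # one entry per sample
end

Base.size(b::Batch) = size(b.data)
Base.getindex(b::Batch, i::Int) = b.data[i]

# A batch of 32 ten-element feature vectors:
xs = Batch([rand(10) for _ = 1:32])
```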

data.jl and utils.jl implement miscellaneous utilities.