Flux.jl/src

core.jl implements the core Flux API for forward and backward passes.
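To illustrate the idea of a forward pass paired with a backward pass, here is a toy, hypothetical layer in plain Julia; this is only a sketch of the pattern, not Flux's actual core.jl API.

```julia
# Toy layer illustrating the forward/backward pattern (hypothetical, not Flux code).
mutable struct Square
  x            # input cached during the forward pass, needed for the gradient
  Square() = new(nothing)
end

# Forward: cache the input, return the elementwise square.
function forward!(l::Square, x)
  l.x = x
  return x .^ 2
end

# Backward: chain rule — d(x²)/dx = 2x, scaled by the incoming gradient Δ.
back!(l::Square, Δ) = 2 .* l.x .* Δ

l = Square()
y = forward!(l, [1.0, 2.0])   # [1.0, 4.0]
g = back!(l, [1.0, 1.0])      # [2.0, 4.0]
```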

The compiler folder implements the @net macro and some dataflow manipulation (such as loop unrolling). The mxnet and tensorflow folders inside backend each describe how to run @net code on that backend. These are the most involved parts of Flux, where the magic happens; everything else is fairly straightforward Julia code.
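Loop unrolling is a dataflow rewrite: a loop of known length is replaced by its repeated body. A toy macro below illustrates the idea in plain Julia; it is a hypothetical sketch only, not the compiler folder's implementation.

```julia
# Toy illustration of loop unrolling (hypothetical, not Flux's compiler).
# Instead of looping three times, expand to three explicit nested calls.
macro unroll3(f, x)
  :( $(esc(f))($(esc(f))($(esc(f))($(esc(x))))) )
end

inc(x) = x + 1
@unroll3(inc, 0)   # expands to inc(inc(inc(0))) == 3
```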

layers is Flux's "standard library" of model parts: layers, activations, and cost functions. Most are implemented using @net. control.jl implements Chain and other containers in pure Julia. shims.jl covers things like convolutions which don't yet have "real" implementations, but can still compile to a backend regardless.
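A container like Chain just threads its input through each layer in order. The minimal sketch below shows that composition pattern in plain Julia; the names and structure are hypothetical and not the actual control.jl implementation.

```julia
# Minimal Chain-like container (hypothetical sketch, not Flux's Chain).
struct SimpleChain
  layers::Vector{Function}
end

# Calling the chain folds the input through each layer in order.
(c::SimpleChain)(x) = foldl((acc, f) -> f(acc), c.layers; init = x)

double(x) = 2x
inc(x) = x + 1

c = SimpleChain([double, inc])
c(3)   # double(3) == 6, then inc(6) == 7
```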

Batches implements the mechanisms for typing batch (and other) dimensions. This is basically standalone.
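The idea of typing a batch dimension is to wrap the data in a type that records which dimension is the batch, rather than leaving it as a bare array. A hypothetical minimal wrapper in plain Julia, not the Batches implementation:

```julia
# Hypothetical sketch of a typed batch dimension (not Flux's Batches code).
struct Batch{T}
  data::Vector{T}   # one entry per sample; the batch dimension is now explicit in the type
end

Base.length(b::Batch) = length(b.data)
Base.getindex(b::Batch, i) = b.data[i]

b = Batch([[1, 2], [3, 4]])
length(b)   # 2 samples
b[2]        # second sample: [3, 4]
```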

data.jl and utils.jl implement miscellaneous utilities.