`core.jl` implements the core Flux API for forward and backward passes.
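
As a very rough illustration of what such an interface looks like, here is a minimal sketch: a model is a callable object, plus a `back!` method that maps an output gradient to an input gradient. The `Affine` type and the `back!` signature here are assumptions for illustration, not the file's actual definitions.

```julia
# Hypothetical sketch of a forward/backward interface (names assumed).
struct Affine
  W::Matrix{Float64}
  b::Vector{Float64}
end

# Forward pass: apply the model to an input vector.
(m::Affine)(x) = m.W * x .+ m.b

# Backward pass: given Δ = ∂loss/∂output, return ∂loss/∂input.
# (Parameter gradients are omitted to keep the sketch short.)
back!(m::Affine, Δ, x) = m.W' * Δ

m  = Affine(randn(3, 5), zeros(3))
y  = m(rand(5))                   # forward
Δx = back!(m, ones(3), rand(5))   # backward
```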

The `compiler` folder implements the `@net` macro and some dataflow manipulation, like loop unrolling. The `mxnet` and `tensorflow` folders in `backend` each describe how to run `@net` code on those backends. These are the most involved parts of Flux, where the magic happens; everything else is fairly straightforward Julia code.
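
For flavour, an `@net` definition reads like ordinary Julia; the syntax below follows the era's documentation, so treat the exact forms (particularly `@net type`) as an assumption rather than a spec.

```julia
using Flux

# A plain function lifted into a dataflow graph by @net:
@net f(x) = x .* x

# A parameterised layer: W and b are trainable fields, and the
# trailing anonymous function is the forward pass.
@net type MyAffine
  W
  b
  x -> x * W .+ b
end
```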

`layers` is Flux's "standard library" of model parts: layers, activations, and cost functions. Most are implemented using `@net`. `control.jl` implements `Chain` and friends in pure Julia. `shims.jl` covers operations like convolutions, which don't have "real" implementations yet but can still compile to a backend.
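
As a usage sketch, `Chain` simply composes layers so that each stage's output feeds the next stage's input; the layer names below (`Affine`, `relu`, `softmax`) are assumed from the standard library of the time.

```julia
using Flux

# Chain threads its input through each stage in order,
# so model(x) is roughly softmax(A2(relu(A1(x)))).
model = Chain(
  Affine(784, 128), relu,
  Affine(128, 10), softmax)

y = model(rand(784))
```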

`Batches` implements the mechanisms for typing batch (and other) dimensions. This is basically standalone.
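
Concretely, the idea is to wrap collections in types that record what the leading dimension means; the constructors below (`Batch`, `Seq`) are assumed for illustration.

```julia
# A Batch marks the leading dimension as the batch dimension;
# a Seq marks it as time. (Constructor names assumed.)
xs = Batch([rand(10), rand(10), rand(10)])  # three samples
ss = Seq([rand(5), rand(5), rand(5)])       # three timesteps
```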

`data.jl` and `utils.jl` implement miscellaneous utilities.