Flux

Relax! Flux is the ML library that doesn't make you tensor.

Flux is an experimental machine perception / ANN library for Julia. It is most similar in philosophy to the excellent Keras. Like that and other high-level ANN libraries, Flux is designed to make experimenting with novel layer types and architectures quick, without sacrificing runtime performance.

Flux has a few key differences from other libraries:

  • Flux's graph-based DSL, which provides optimisations and automatic differentiation (à la Theano), is tightly integrated with the language. This means nice syntax for your equations (σ(W*x+b), anyone?) and no unwieldy compile steps; see the first sketch below.
  • The graph DSL directly represents models, as opposed to computations, so custom architectures, and recurrent models in particular, are easy to express.
  • Those fancy features are completely optional. You can implement arbitrary functionality in an imperative, Torch-like fashion if you wish, since layers are simply objects that satisfy a small interface; see the second sketch below.
  • Flux is written in Julia, which means there's no "dropping down" to C. It's Julia all the way down, and you can prototype both high-level architectures and high-performance GPU kernels from the same language. This also makes the library itself very easy to understand and extend.
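To give a flavour of the equation-style syntax mentioned above, here is a minimal sketch in plain Julia. The names and shapes are illustrative only, not Flux's actual API: a dense layer is written exactly as its equation.

```julia
# Elementwise logistic sigmoid.
σ(x) = 1 ./ (1 .+ exp.(-x))

W = randn(10, 784)   # weights for a hypothetical 784 → 10 dense layer
b = randn(10)        # bias
x = rand(784)        # a flattened 28×28 input

y = σ(W*x + b)       # the forward pass is just the maths
```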
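And in the Torch-like direction, a layer can be an ordinary object that holds its parameters and knows how to run a forward pass. The sketch below uses a callable struct as a hypothetical illustration of such an interface; it is not necessarily the exact interface Flux defines.

```julia
# A layer is simply an object holding its parameters.
struct Dense
    W::Matrix{Float64}
    b::Vector{Float64}
end

# Convenience constructor: a randomly initialised in → out layer.
Dense(in::Integer, out::Integer) = Dense(randn(out, in), randn(out))

# Forward pass: calling the layer applies its equation to the input.
(d::Dense)(x) = tanh.(d.W * x .+ d.b)

layer = Dense(784, 10)
layer(rand(784))     # => 10-element output vector
```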