*Relax! Flux is the ML library that doesn't make you tensor.*
# Flux

## What?

Flux is a programming model for building neural networks, implemented in Julia.

## Why?

Flux is designed to be much more intuitive than traditional frameworks. For starters, that means having a simple notation for models that's as close to the mathematical description as possible (like σ(W*x + b)). But it's deeper than syntax; we also reuse concepts from regular programming languages (like the class/object distinction) to create principled semantics. Flux is fully declarative, so there's no more mental juggling of multiple execution paths as you read imperative graph-building code.
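To make the idea concrete, here is a plain-Julia sketch (illustrative only, not Flux's actual API) of how closely model code can mirror the maths: a dense layer really is just σ(W*x + b).

```julia
# Logistic sigmoid, written exactly as the maths reads.
σ(x) = 1 / (1 + exp(-x))

# A dense layer as a closure over its parameters; σ is broadcast
# elementwise over the affine map W*x + b.
dense(W, b) = x -> σ.(W*x .+ b)

W = randn(3, 2)
b = randn(3)
layer = dense(W, b)
y = layer(rand(2))   # a 3-element output vector
```

The point is that nothing separates the notation in the paper from the notation in the program.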

Flux's semantics include native support for recurrent loops, which it can automatically unroll for you, so you never have to do it by hand again.
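To see what "unrolling" buys you, here is a plain-Julia sketch (hypothetical helper names, not Flux's API) of a recurrent loop and the unrolled form that a static-graph backend ultimately needs:

```julia
σ(x) = 1 / (1 + exp(-x))

# One recurrent step: new hidden state from input x and previous state h.
step(W, U, b, h, x) = σ.(W*x .+ U*h .+ b)

# The loop form: the same step reused at every time step, sharing weights.
function rnn(W, U, b, h0, xs)
    h = h0
    for x in xs
        h = step(W, U, b, h, x)
    end
    h
end

# The equivalent unrolled form for three time steps, written by hand.
# This is the tedious expansion Flux can perform automatically.
rnn3(W, U, b, h0, x1, x2, x3) =
    step(W, U, b, step(W, U, b, step(W, U, b, h0, x1), x2), x3)
```

You write the loop; the framework derives the unrolled version.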

But it's also designed to be extremely flexible. Flux supports multiple backends (MXNet to begin with, and TensorFlow in future), transparently taking advantage of all their features rather than providing a lowest common denominator. Flux's design allows custom layer types (say, custom GPU kernels) to be implemented in pure Julia, for backends that support it.

## How?

See the [design docs](design.md).

## Is it any good?

Yes.