# Флукс

*Relax! Flux is the ML library that doesn't make you tensor.*

Join the chat at https://gitter.im/MikeInnes/Flux.jl

Flux is a library for machine learning, implemented in Julia.

At its core, Flux simply lets you run your normal Julia code on a dataflow backend like TensorFlow.

```julia
@net f(x) = x .* x
f([1, 2, 3]) == [1, 4, 9]

# Compile `f` for the TensorFlow backend and run it there instead.
f_tensorflow = tf(f)
f_tensorflow([1, 2, 3]) == [1.0, 4.0, 9.0]
```

After adding the `@net` annotation we can take advantage of the optimisations, parallelism, and GPU access that TensorFlow provides. Unlike a TensorFlow graph, `f` continues to behave like Julia code; you still get good stack traces, can step through it in the debugger, and so on.

On top of this foundation we build a set of flexible machine learning abstractions and utilities that interoperate well with other approaches like Knet. This gives you great flexibility; you can go high level or stay mathematical, write custom GPU kernels, build your own abstractions, and mix and match approaches.
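As a rough illustration of that mixing (a sketch only, reusing the `Chain`, `Input`, `Affine` and `@net` pieces shown elsewhere in this README; exact shape-inference behaviour is covered in the docs), a hand-written `@net` function can sit alongside the built-in layers:

```julia
# Purely illustrative: a custom @net function used as a layer inside Chain,
# next to built-in layers and activations.
@net scale(x) = 2 .* x

Chain(
  Input(10),
  Affine(5),
  scale,
  softmax)
```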

Check out the docs to get started. Flux is in alpha, so please open issues liberally; we would love to help you out.

## Brief Examples

A simple multi-layer perceptron for MNIST, using the high-level API:

```julia
# 784 inputs (a flattened 28×28 image), two hidden layers, 10-way softmax output.
Chain(
  Input(784),
  Affine(128), relu,
  Affine( 64), relu,
  Affine( 10), softmax)
```
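A hedged sketch of how such a chain might be used, reusing the `tf` call demonstrated above (the name `model` and the random input are illustrative only; see the docs and the MNIST example for real data handling and training):

```julia
# Illustrative only: build the chain, compile it with the TensorFlow backend,
# and apply it to one flattened 28×28 input. Training loops and data loading
# are covered in the MNIST example.
model = Chain(
  Input(784),
  Affine(128), relu,
  Affine( 64), relu,
  Affine( 10), softmax)

model_tf = tf(model)         # same `tf` call as in the squaring example above
probs = model_tf(rand(784))  # should yield a length-10 vector of class scores
```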

Define a custom recurrent layer:

```julia
@net type Recurrent
  Wxy; Wyy; by  # input and recurrent weights, plus bias
  y             # hidden state
  function (x)
    # `y{-1}` refers to the value of `y` at the previous time step.
    y = tanh( x * Wxy .+ y{-1} * Wyy .+ by )
  end
end
```
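As a rough usage sketch (the convenience constructor below is illustrative, not part of Flux; `@net type` keeps the usual field-order constructor, and `randn`/`zeros` are plain Julia):

```julia
# Hypothetical convenience constructor: fill the fields (Wxy, Wyy, by, y) in
# order with random weights, a row-vector bias and a zero initial state,
# assuming row-vector inputs of length `in`.
Recurrent(in::Integer, out::Integer) =
  Recurrent(randn(in, out), randn(out, out), randn(1, out), zeros(1, out))

r = Recurrent(10, 5)  # a recurrent layer mapping 10 inputs to 5 hidden units
```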