# Флукс

*Relax! Flux is the ML library that doesn't make you tensor.*

Build status · [Join the chat on Gitter](https://gitter.im/MikeInnes/Flux.jl)

Flux is a high-level API for machine learning, implemented in Julia.

Flux aims to provide a concise and expressive syntax for architectures that are hard to express within other frameworks. The notation should be familiar and extremely close to what you'd find in a paper or description of the model.

The current focus is on ANNs, with TensorFlow or MXNet as the backend. While Flux is at a very early, working-prototype stage, you can see what works so far in the examples folder.

## Brief Examples

A simple multi-layer perceptron for MNIST:

```julia
Chain(
  Input(784),
  Affine(128), relu,
  Affine( 64), relu,
  Affine( 10), softmax)
```
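To actually run the network, you convert it to a backend and call it like a function. A minimal sketch, assuming the `mxnet`/`tf` conversion functions from this era's docs; the exact signatures, including the input-shape argument, are assumptions:

```julia
using Flux

model = Chain(
  Input(784),
  Affine(128), relu,
  Affine( 64), relu,
  Affine( 10), softmax)

# Convert the model for execution on a backend; the `mxnet(model, shape)`
# signature is assumed from documentation of this period (`tf(model)` similarly).
model = mxnet(model, (784, 1))

# Forward pass on a random stand-in for a flattened MNIST digit.
y = model(rand(784))
```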

LSTM example (here `y{-1}` denotes the value of `y` at the previous time step):

```julia
@net type LSTM
  Wxf; Wyf; bf
  Wxi; Wyi; bi
  Wxo; Wyo; bo
  Wxc; Wyc; bc
  y; state
  function (x)
    # Gates
    forget = σ( x * Wxf + y{-1} * Wyf + bf )
    input  = σ( x * Wxi + y{-1} * Wyi + bi )
    output = σ( x * Wxo + y{-1} * Wyo + bo )
    # Candidate state, state update and output
    state′ = tanh( x * Wxc + y{-1} * Wyc + bc )
    state  = forget .* state{-1} + input .* state′
    y = output .* tanh(state)
  end
end
```

The definition can then be composed like any built-in layer:

```julia
Chain(
  Input(N),
  LSTM(N, 256),
  LSTM(256, 256),
  Affine(256, N),
  softmax)
```
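A model like this is trained the same way as the MLP above. A rough sketch, assuming the `Flux.train!` helper that examples of this era used; the exact signature, the `η` keyword, and the `train` dataset are assumptions, not guaranteed by this README:

```julia
N = 128 # hypothetical alphabet size for a character-level model

model = Chain(
  Input(N),
  LSTM(N, 256),
  LSTM(256, 256),
  Affine(256, N),
  softmax)

# `train` would hold (input, target) pairs; `Flux.train!` and its `η`
# (learning rate) keyword are assumed from examples of this period.
Flux.train!(model, train, η = 1e-4)
```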