# Flux

*Relax! Flux is the ML library that doesn't make you tensor.*

[Join the chat at https://gitter.im/MikeInnes/Flux.jl](https://gitter.im/MikeInnes/Flux.jl)

Flux is a high-level library for machine learning, implemented in Julia.

Flux is designed to get the best performance (by running on TensorFlow or MXNet) while still being intuitive to work with: you get good error messages, you can step through models with the debugger, and the notation is very close to what you'd find in a paper.

Check out the docs to get started. Flux is in alpha, so please open issues liberally; if something is broken for you, it can most likely be fixed easily, and if you're not sure how to do something, we can help.

## Brief Examples

A simple multi-layer perceptron for MNIST:

```julia
Chain(
  Input(784),
  Affine(128), relu,
  Affine( 64), relu,
  Affine( 10), softmax)
```
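To make the layer notation concrete, here is a rough plain-Julia sketch of the forward pass that `Chain` describes. Everything in it (the parameter names, the `affine` and `mlp` helpers) is illustrative, not part of Flux's API:

```julia
# Elementwise activations and a softmax over a vector.
relu(x) = max.(x, 0)
softmax(x) = (e = exp.(x .- maximum(x)); e ./ sum(e))

# An affine layer as a closure: x ↦ W*x + b.
affine(W, b) = x -> W * x .+ b

# Random parameters for a 784 → 128 → 64 → 10 network.
W1, b1 = randn(128, 784), zeros(128)
W2, b2 = randn(64, 128),  zeros(64)
W3, b3 = randn(10, 64),   zeros(10)

mlp(x) = x |> affine(W1, b1) |> relu |>
              affine(W2, b2) |> relu |>
              affine(W3, b3) |> softmax

mlp(rand(784))  # a 10-element probability vector
```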

LSTM example:

```julia
@net type LSTM
  Wxf; Wyf; bf
  Wxi; Wyi; bi
  Wxo; Wyo; bo
  Wxc; Wyc; bc
  y; state
  function (x)
    # Gates
    forget = σ( x * Wxf + y{-1} * Wyf + bf )
    input  = σ( x * Wxi + y{-1} * Wyi + bi )
    output = σ( x * Wxo + y{-1} * Wyo + bo )
    # State update and output
    state′ = tanh( x * Wxc + y{-1} * Wyc + bc )
    state  = forget .* state{-1} + input .* state′
    y = output .* tanh(state)
  end
end
```
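Here `y{-1}` and `state{-1}` refer to the layer's own output and cell state from the previous time step. For intuition, here is a rough standalone-Julia version of a single step; the `lstm_step` function, its signature, and the column-vector convention (weights multiply from the left, transposed relative to the `x * Wxf` form above) are illustrative assumptions, not Flux API:

```julia
σ(z) = 1 ./ (1 .+ exp.(-z))

# One step of the recurrence above as an ordinary function: the caller
# threads the previous output `y` and cell `state` through by hand,
# which is exactly the bookkeeping the {-1} references hide.
function lstm_step(x, y, state, p)
  forget = σ(p.Wxf * x + p.Wyf * y + p.bf)
  input  = σ(p.Wxi * x + p.Wyi * y + p.bi)
  output = σ(p.Wxo * x + p.Wyo * y + p.bo)
  state′ = tanh.(p.Wxc * x + p.Wyc * y + p.bc)
  state  = forget .* state + input .* state′
  y      = output .* tanh.(state)
  return y, state
end

# Example: input size 4, hidden size 8.
p = (Wxf = randn(8, 4), Wyf = randn(8, 8), bf = zeros(8),
     Wxi = randn(8, 4), Wyi = randn(8, 8), bi = zeros(8),
     Wxo = randn(8, 4), Wyo = randn(8, 8), bo = zeros(8),
     Wxc = randn(8, 4), Wyc = randn(8, 8), bc = zeros(8))

y, state = lstm_step(rand(4), zeros(8), zeros(8), p)
```

The `LSTM` defined with `@net` above can then be stacked like any other layer: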

```julia
Chain(
  Input(N),
  LSTM(N, 256),
  LSTM(256, 256),
  Affine(256, N),
  softmax)
```
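A hypothetical usage sketch for this stack as a character-level model. The value of `N`, the one-hot encoding, and calling the chain directly on an input are assumptions for illustration, not guaranteed parts of this API:

```julia
N = 26                       # hypothetical alphabet size (assumption)

model = Chain(
  Input(N),
  LSTM(N, 256),
  LSTM(256, 256),
  Affine(256, N),
  softmax)

x = zeros(N); x[3] = 1       # one-hot encoding of the third character
probs = model(x)             # assumed: a distribution over the next character
```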