Relax! Flux is the ML library that doesn't make you tensor

Flux


Flux is an unusually elegant machine learning library. It provides lightweight abstractions on top of Julia's native GPU and AD support, while remaining fully hackable (right down to the GPU kernels).

Define a simple model using any Julia code:

```julia
using Flux, Flux.Tracker

x, y = rand(10), rand(5) # Dummy input / output
# `track` defines parameters that we can train
W, b = track(randn(5,10)), track(randn(5))
# Transform `x` and calculate the mean squared error
loss = Flux.mse(W*x .+ b, y)
# Calculate and store gradients of `track`ed parameters
back!(loss)
Tracker.grad(W) # Get the gradient of `W` wrt the loss
```
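With gradients in hand, a parameter update is just array arithmetic. Here is a minimal sketch of one plain gradient-descent step, assuming the `track`/`back!`/`Tracker.grad` API above, and assuming that `Tracker.data` exposes a tracked parameter's underlying array (a hypothetical accessor for illustration):

```julia
using Flux, Flux.Tracker

x, y = rand(10), rand(5)
W, b = track(randn(5,10)), track(randn(5))

loss = Flux.mse(W*x .+ b, y)
back!(loss)                  # populate the gradients

η = 0.1                      # learning rate
# Assumed: `Tracker.data` returns the raw array behind a tracked parameter
Tracker.data(W) .-= η .* Tracker.grad(W)
Tracker.data(b) .-= η .* Tracker.grad(b)
```

Because the loss is a convex quadratic in `W` and `b`, a small enough step like this one always lowers it.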

Define a larger model using high-level abstractions:

```julia
using Flux

m = Chain(
  Dense(10, 32, relu),
  Dense(32, 10), softmax)

m(rand(10))
```
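A `Chain` is an ordinary callable, and the final `softmax` normalises the ten outputs into a probability distribution. A small sketch checking that, using only the API shown above:

```julia
using Flux

m = Chain(
  Dense(10, 32, relu),
  Dense(32, 10), softmax)

probs = m(rand(10))
length(probs)  # 10 outputs, one per class
sum(probs)     # ≈ 1, since softmax normalises them
```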

Mix and match the two:

```julia
using Flux, Flux.Tracker

x, y = rand(10), rand(5)
d = Dense(10, 5)
loss = Flux.mse(d(x), y)
```
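Layers built with `Dense` carry tracked parameters internally, so the same `back!` call works on the mixed model. A sketch, assuming the layer stores its weight matrix in a field named `W` (field names may differ between versions):

```julia
using Flux, Flux.Tracker

x, y = rand(10), rand(5)
d = Dense(10, 5)

loss = Flux.mse(d(x), y)
back!(loss)          # accumulate gradients into the layer's parameters
Tracker.grad(d.W)    # gradient of the weight matrix, assuming field `W`
```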

See the documentation or the model zoo for more examples.