Flux.jl/README.md

# Flux

[![Build Status](https://travis-ci.org/FluxML/Flux.jl.svg?branch=master)](https://travis-ci.org/FluxML/Flux.jl) [![](https://img.shields.io/badge/docs-stable-blue.svg)](https://fluxml.github.io/Flux.jl/stable/) [![Join the chat at https://gitter.im/FluxML](https://badges.gitter.im/FluxML/Lobby.svg)](https://gitter.im/FluxML/Lobby) [Slack](https://discourse.julialang.org/t/announcing-a-julia-slack/4866)

Flux is a library for machine learning, implemented in Julia.

At its core, Flux simply lets you run your normal Julia code on a dataflow backend such as TensorFlow.

```julia
@net f(x) = x .* x
f([1,2,3]) == [1,4,9]
f_tensorflow = tf(f)
f_tensorflow([1,2,3]) == [1.0, 4.0, 9.0]
```

After adding the `@net` annotation we can take advantage of the optimisations, parallelism, and GPU access that TensorFlow provides. Unlike a TensorFlow graph, `f` continues to behave like Julia code: you still get good stack traces, can step through it in the debugger, and so on.

On top of this foundation we build a set of flexible machine learning abstractions and utilities that interoperate well with other approaches such as [Knet](https://github.com/denizyuret/Knet.jl). This gives you great flexibility: you can go high level or stay mathematical, write custom GPU kernels, build your own abstractions, and mix and match approaches.
Check out the [docs](https://fluxml.github.io/Flux.jl/stable/) to get started. Flux is in alpha so **please open issues liberally**; we would love to help you get started.

## Brief Examples

A simple multi-layer perceptron for MNIST, using the high-level API:

```julia
Chain(
  Input(784),
  Affine(128), relu,
  Affine( 64), relu,
  Affine( 10), softmax)
```
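
For illustration only, here is a hypothetical way such a model might be applied to one input; the `model` binding and the callability of a `Chain` on a plain vector are assumptions for this sketch, not taken from this README:

```julia
# Hypothetical usage sketch, assuming a Chain model can be called like a function.
model = Chain(
  Input(784),
  Affine(128), relu,
  Affine( 64), relu,
  Affine( 10), softmax)

x = rand(784)   # stand-in for one flattened 28×28 MNIST image
ŷ = model(x)    # assumed to return a 10-element vector of class probabilities
```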

Define a custom recurrent layer:

```julia
@net type Recurrent
  Wxy; Wyy; by
  y

  function (x)
    y = tanh( x * Wxy .+ y{-1} * Wyy .+ by )
  end
end
```