Relax! Flux is the ML library that doesn't make you tensor

Flux is an elegant approach to machine learning. It's a 100% pure-Julia stack, and provides lightweight abstractions on top of Julia's native GPU and AD support. Flux makes the easy things easy while remaining fully hackable.

julia> using Pkg; Pkg.add("Flux")

See the documentation or the model zoo for examples.

If you use Flux in research, please cite the following paper:

@article{innes:2018,
  author    = {Mike Innes},
  title     = {Flux: Elegant Machine Learning with Julia},
  journal   = {Journal of Open Source Software},
  year      = {2018},
  doi       = {10.21105/joss.00602},
}

Features

Flux has powerful high-level features, and common architectures can be defined in a few lines.

using Flux
using Flux: crossentropy

model = Chain(
  Dense(768, 128, σ),
  LSTM(128, 256),
  LSTM(256, 128),
  Dense(128, 10),
  softmax)

loss(x, y) = crossentropy(model(x), y)

Flux.train!(loss, params(model), data, ADAM(...))
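
Here `data` can be any iterator of `(x, y)` pairs. As a minimal sketch with a made-up toy dataset (the sizes, labels and default `ADAM()` settings below are purely illustrative, not something Flux prescribes):

using Flux: onehot

xs = [rand(Float32, 768) for _ in 1:100]        # 100 hypothetical input vectors
ys = [onehot(rand(1:10), 1:10) for _ in 1:100]  # random labels over 10 classes
data = zip(xs, ys)

Flux.train!(loss, params(model), data, ADAM())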

Yet you can easily strip away the layers, and directly write the mathematics for your problem. Flux will seamlessly take gradients of any Julia code, so your model looks just like the paper.

W = param(randn(2, 10))
b = param(randn(2))

y(x) = σ.(W * x .+ b)
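
Continuing the snippet above, gradients come from the Tracker-based AD that ships with this release; a rough sketch with made-up data:

using Flux

x, target = rand(10), rand(2)        # hypothetical input/target pair
l = sum((y(x) .- target) .^ 2)       # any Julia code can appear in the loss
Flux.Tracker.back!(l)                # reverse-mode AD through y(x)
Flux.Tracker.grad(W)                 # gradient of l with respect to W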

If that's still not enough, you can go as deep as you want, even writing your own CUDA kernels with CUDAnative! All this can be freely mixed-and-matched in a single model or script, and it all runs interactively via Jupyter or Juno.

function gpu_add(a, b, c)
  i = (blockIdx().x-1) * blockDim().x + threadIdx().x
  c[i] = a[i] + b[i]
  return nothing
end
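
Launching the kernel takes a couple more lines. A sketch assuming CUDAnative.jl and CuArrays.jl are installed (the array size and launch configuration are arbitrary):

using CUDAnative, CuArrays

a, b = cu(rand(Float32, 1024)), cu(rand(Float32, 1024))
c = similar(a)

@cuda threads=256 blocks=4 gpu_add(a, b, c)   # 4 blocks × 256 threads cover all 1024 elements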

Unusual architectures are no problem in Flux, as you can use all the loops, control flow and even macros that you're used to. Here's a Tree RNN in 4 lines.

tree() = rand() < 0.5 ? rand(10) : (tree(), tree()) # dummy data: a leaf vector or a pair of subtrees

shrink = Dense(20, 10)
combine(a, b) = shrink([a; b])       # merge two length-10 child encodings back to length 10

model(x) = x                         # a leaf is its own encoding
model(x::Tuple) = combine(model(x[1]), model(x[2]))  # recurse over the subtrees

model(tree()) # Sample output

Despite this flexibility, Julia's advanced compiler lets us do some powerful optimisations. For example, this definition of sigmoid automatically gets fused into a single GPU kernel so it's really fast.

sigmoid(xs) = 1 ./ (1 .+ exp.(.-xs))
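
For instance, run it on a GPU array (a sketch assuming CuArrays.jl is installed; the array size is arbitrary):

using CuArrays

xs = cu(rand(10^6))   # a million-element array on the GPU
sigmoid(xs)           # the broadcasts above fuse into a single kernel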

Similarly, Flux is the first dynamic framework to support compiling to the browser and model import via formats like ONNX, both of which are thinly-veiled compiler problems.

For more on our philosophy on machine learning, check out our article On Machine Learning & Programming Languages.

Contributing & Help

For general questions and help, check out Julia's community forum.

Flux development is carried out via our GitHub issues, so feel free to open feature requests or PRs here.

For more informal discussions we'd love to have you on the Julia Slack, where we hang out in the #machine-learning channel.

Check out Metalhead.jl for common computer vision datasets and trained models.

MLDatasets.jl provides further common datasets.