From 9dbb8ffe742a393c2462a56bfdba23f0429f3e33 Mon Sep 17 00:00:00 2001 From: autodocs Date: Fri, 8 Sep 2017 22:11:50 +0000 Subject: [PATCH] build based on 366efa9 --- latest/contributing.html | 2 +- latest/index.html | 2 +- latest/models/basics.html | 4 ++-- latest/models/layers.html | 14 ++++++++++++++ latest/models/recurrence.html | 2 +- latest/search.html | 2 +- latest/search_index.js | 32 ++++++++++++++++++++++++++++++++ 7 files changed, 52 insertions(+), 6 deletions(-) create mode 100644 latest/models/layers.html diff --git a/latest/contributing.html b/latest/contributing.html index ec205dd7..32e89a4b 100644 --- a/latest/contributing.html +++ b/latest/contributing.html @@ -6,4 +6,4 @@ m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m) ga('create', 'UA-36890222-9', 'auto'); ga('send', 'pageview'); -

Contributing & Help

Contributing

If you need help, please ask on the Julia forum, the Slack (#machine-learning channel), or Flux's Gitter.

Right now, the best way to help out is to try out the examples and report any issues or missing features as you find them. The second best way is to help us spread the word, perhaps by starring the repo.

If you're interested in hacking on Flux, most of the code is pretty straightforward. Adding new layer definitions or cost functions is simple using the Flux DSL itself, and things like data utilities and training processes are all plain Julia code.

If you get stuck or need anything, let us know!

+

Contributing & Help

Contributing

If you need help, please ask on the Julia forum, the Slack (#machine-learning channel), or Flux's Gitter.

Right now, the best way to help out is to try out the examples and report any issues or missing features as you find them. The second best way is to help us spread the word, perhaps by starring the repo.

If you're interested in hacking on Flux, most of the code is pretty straightforward. Adding new layer definitions or cost functions is simple using the Flux DSL itself, and things like data utilities and training processes are all plain Julia code.

If you get stuck or need anything, let us know!

diff --git a/latest/index.html b/latest/index.html index a002bad3..7ffad0b6 100644 --- a/latest/index.html +++ b/latest/index.html @@ -6,5 +6,5 @@ m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m) ga('create', 'UA-36890222-9', 'auto'); ga('send', 'pageview'); -

Home

Flux: The Julia Machine Learning Library

Flux is a library for machine learning. It comes "batteries-included" with many useful tools built in, but also lets you use the full power of the Julia language where you need it. The whole stack is implemented in clean Julia code (right down to the GPU kernels) and any part can be tweaked to your liking.

Installation

Install Julia 0.6.0 or later, if you haven't already.

Pkg.add("Flux")
+

Home

Flux: The Julia Machine Learning Library

Flux is a library for machine learning. It comes "batteries-included" with many useful tools built in, but also lets you use the full power of the Julia language where you need it. The whole stack is implemented in clean Julia code (right down to the GPU kernels) and any part can be tweaked to your liking.

Installation

Install Julia 0.6.0 or later, if you haven't already.

Pkg.add("Flux")
 Pkg.test("Flux") # Check things installed correctly

Start with the basics. The model zoo is also a good starting point for many common kinds of models.

diff --git a/latest/models/basics.html b/latest/models/basics.html index 7b72ca18..131e3118 100644 --- a/latest/models/basics.html +++ b/latest/models/basics.html @@ -6,7 +6,7 @@ m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m) ga('create', 'UA-36890222-9', 'auto'); ga('send', 'pageview'); -

Basics

Taking Gradients

Consider a simple linear regression, which tries to predict an output array y from an input x. (It's a good idea to follow this example in the Julia REPL.)

W = rand(2, 5)
+

Basics

Taking Gradients

Consider a simple linear regression, which tries to predict an output array y from an input x. (It's a good idea to follow this example in the Julia REPL.)

W = rand(2, 5)
 b = rand(2)
 
 predict(x) = W*x .+ b
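The model above can be exercised end to end with a squared-error loss; the loss definition and the random x and y below are illustrative stand-ins, repeated here so the snippet is self-contained:

```julia
# Self-contained sketch of the linear model plus an illustrative squared-error loss
W = rand(2, 5)
b = rand(2)
predict(x) = W*x .+ b

loss(x, y) = sum((predict(x) .- y).^2)  # one common choice of loss

x, y = rand(5), rand(2)  # dummy input and target
loss(x, y)               # a non-negative scalar
```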
@@ -24,7 +24,7 @@ back!(l)

loss(x, y) returns the same number, but it's now a tracked value that records gradients as it goes along. W.data .-= grad(W) -loss(x, y) # ~ 2.5

The loss has decreased a little, meaning that our prediction x is closer to the target y. If we have some data we can already try training the model.

All deep learning in Flux, however complex, is a simple generalisation of this example. Of course, not all models look like this – they might have millions of parameters or complex control flow, and Flux provides ways to manage this complexity. Let's see what that looks like.

Building Layers

It's common to create more complex models than the linear regression above. For example, we might want to have two linear layers with a nonlinearity like sigmoid (σ) in between them. In the above style we could write this as:

W1 = param(rand(3, 5))
+loss(x, y) # ~ 2.5

The loss has decreased a little, meaning that our prediction x is closer to the target y. If we have some data we can already try training the model.

All deep learning in Flux, however complex, is a simple generalisation of this example. Of course, not all models look like this – they might have millions of parameters or complex control flow, and Flux provides ways to manage this complexity. Let's see what that looks like.

Building Layers

It's common to create more complex models than the linear regression above. For example, we might want to have two linear layers with a nonlinearity like sigmoid (σ) in between them. In the above style we could write this as:

W1 = param(rand(3, 5))
 b1 = param(rand(3))
 layer1(x) = W1 * x .+ b1
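The hunk above is cut off after layer1 by the diff; as a plain-Julia sketch, the full two-layer composition described in the text might look like the following (the inline σ and the second layer's sizes are filled in for illustration; in Flux itself these would use param and the built-in σ):

```julia
# Two linear layers with a sigmoid nonlinearity in between, in plain Julia
σ(z) = 1 / (1 + exp(-z))          # stand-in for Flux's σ

W1, b1 = rand(3, 5), rand(3)
layer1(x) = W1 * x .+ b1

W2, b2 = rand(2, 3), rand(2)
layer2(x) = W2 * x .+ b2

model(x) = layer2(σ.(layer1(x)))  # compose: layer1 → σ → layer2
model(rand(5))                    # 2-element output vector
```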
 
diff --git a/latest/models/layers.html b/latest/models/layers.html
new file mode 100644
index 00000000..6aeda847
--- /dev/null
+++ b/latest/models/layers.html
@@ -0,0 +1,14 @@
+
+Layers · Flux

Layers

Model Layers

Flux.ChainType.
Chain(layers...)

Chain multiple layers / functions together, so that they are called in sequence on a given input.

m = Chain(x -> x^2, x -> x+1)
+m(5) == 26
+
+m = Chain(Dense(10, 5), Dense(5, 2))
+x = rand(10)
+m(x) == m[2](m[1](x))

Chain also supports indexing and slicing, e.g. m[2] or m[1:end-1].

source
Flux.DenseType.
Dense(in::Integer, out::Integer, σ = identity)

Creates a traditional Dense layer with parameters W and b.

y = σ.(W * x .+ b)
source
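The equation in the Dense docstring can be checked with a plain-Julia stand-in for the forward pass (the sizes and the inline σ below are illustrative, not Flux's implementation):

```julia
# Illustrative Dense forward pass: y = σ.(W * x .+ b)
σ(z) = 1 / (1 + exp(-z))       # stand-in for Flux's σ

in_dim, out_dim = 10, 5
W, b = rand(out_dim, in_dim), rand(out_dim)
dense(x) = σ.(W * x .+ b)

y = dense(rand(in_dim))        # 5-element vector with entries in (0, 1)
```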
diff --git a/latest/models/recurrence.html b/latest/models/recurrence.html index fb3e3a68..db9e9466 100644 --- a/latest/models/recurrence.html +++ b/latest/models/recurrence.html @@ -6,4 +6,4 @@ m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m) ga('create', 'UA-36890222-9', 'auto'); ga('send', 'pageview'); - + diff --git a/latest/search.html b/latest/search.html index 901d1ecd..01197be8 100644 --- a/latest/search.html +++ b/latest/search.html @@ -6,4 +6,4 @@ m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m) ga('create', 'UA-36890222-9', 'auto'); ga('send', 'pageview'); -

Search

Search

Number of results: loading...

    +

    Search

    Search

    Number of results: loading...

diff --git a/latest/search_index.js index 85846193..ae68bd02 100644 --- a/latest/search_index.js +++ b/latest/search_index.js @@ -64,6 +64,38 @@ var documenterSearchIndex = {"docs": [ "text": "" }, +{ + "location": "models/layers.html#", + "page": "Layers", + "title": "Layers", + "category": "page", + "text": "" +}, + +{ + "location": "models/layers.html#Flux.Chain", + "page": "Layers", + "title": "Flux.Chain", + "category": "Type", + "text": "Chain(layers...)\n\nChain multiple layers / functions together, so that they are called in sequence on a given input.\n\nm = Chain(x -> x^2, x -> x+1)\nm(5) == 26\n\nm = Chain(Dense(10, 5), Dense(5, 2))\nx = rand(10)\nm(x) == m[2](m[1](x))\n\nChain also supports indexing and slicing, e.g. m[2] or m[1:end-1].\n\n\n\n" +}, + +{ + "location": "models/layers.html#Flux.Dense", + "page": "Layers", + "title": "Flux.Dense", + "category": "Type", + "text": "Dense(in::Integer, out::Integer, σ = identity)\n\nCreates a traditional Dense layer with parameters W and b.\n\ny = σ.(W * x .+ b)\n\n\n\n" +}, + +{ + "location": "models/layers.html#Model-Layers-1", + "page": "Layers", + "title": "Model Layers", + "category": "section", + "text": "Chain\nDense" +}, + { "location": "contributing.html#", "page": "Contributing & Help",