added training api changes

This commit is contained in:
Dhairya Gandhi 2018-12-01 16:59:27 +05:30
parent 1ea8c5a293
commit d412845192
1 changed file with 4 additions and 3 deletions


@@ -24,9 +24,10 @@ m = Chain(
Dense(32, 10), softmax)
loss(x, y) = Flux.mse(m(x), y)
+ps = Flux.params(m)
# later
-Flux.train!(loss, params, data, opt)
+Flux.train!(loss, ps, data, opt)
```
The objective will almost always be defined in terms of some *cost function* that measures the distance of the prediction `m(x)` from the target `y`. Flux has several of these built in, like `mse` for mean squared error or `crossentropy` for cross entropy loss, but you can calculate it however you want.
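For instance, a minimal sketch of swapping in a different cost function (not part of this diff; it reuses the `m` and `loss` names from the example above, and `crossentropy` is one of Flux's built-in losses):

```julia
using Flux

# built-in cross entropy instead of mean squared error
loss(x, y) = Flux.crossentropy(m(x), y)

# or a hand-rolled cost, e.g. mean absolute error
loss_mae(x, y) = sum(abs.(m(x) .- y)) / length(y)
```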
@@ -78,7 +79,7 @@ julia> @epochs 2 Flux.train!(...)
`train!` takes an additional argument, `cb`, that's used for callbacks so that you can observe the training process. For example:
```julia
-train!(objective, params, data, opt, cb = () -> println("training"))
+train!(objective, ps, data, opt, cb = () -> println("training"))
```
Callbacks are called for every batch of training data. You can slow this down using `Flux.throttle(f, timeout)` which prevents `f` from being called more than once every `timeout` seconds.
@@ -89,6 +90,6 @@ A more typical callback might look like this:
test_x, test_y = # ... create single batch of test data ...
evalcb() = @show(loss(test_x, test_y))
-Flux.train!(objective, data, opt,
+Flux.train!(objective, ps, data, opt,
cb = throttle(evalcb, 5))
```
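Putting the updated API together, a minimal end-to-end sketch (not part of the diff; it assumes the Flux ~0.7-era `ADAM` optimiser and uses made-up toy data):

```julia
using Flux
using Flux: throttle, mse

m = Chain(Dense(10, 32, relu), Dense(32, 10), softmax)
loss(x, y) = mse(m(x), y)
ps = Flux.params(m)          # parameters are now passed explicitly to train!

# toy data: a single (input, target) batch repeated 100 times
x, y = rand(10, 5), rand(10, 5)
data = Iterators.repeated((x, y), 100)

opt = ADAM(0.001)
evalcb() = @show(loss(x, y))

# callback throttled to at most once every 5 seconds
Flux.train!(loss, ps, data, opt, cb = throttle(evalcb, 5))
```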