add docstring to train!
This commit is contained in:
parent
6d3a2a2210
commit
041079237e
@@ -4,10 +4,19 @@ using Flux.Tracker: back!
 tocb(f) = f
 tocb(fs::AbstractVector) = () -> foreach(call, fs)
 
+"""
+    train!(loss, data, opt; cb = () -> ())
+
+For each datapoint `d` in `data` computes the gradient of `loss(d...)` through
+backpropagation and calls the optimizer `opt` and the callback `cb`
+(i.e. `opt()` and `cb()`).
+
+Multiple callbacks can be passed to `cb` as an array.
+"""
 function train!(loss, data, opt; cb = () -> ())
   cb = tocb(cb)
-  @progress for x in data
-    l = loss(x...)
+  @progress for d in data
+    l = loss(d...)
     isinf(l.data[]) && error("Loss is Inf")
     isnan(l.data[]) && error("Loss is NaN")
     back!(l)
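The `tocb` helper above normalises the `cb` keyword: a single function is passed through, while a vector of callbacks is wrapped in a closure that invokes each one. A minimal Python sketch of that dispatch pattern (a hypothetical translation, not Flux code):

```python
# Sketch of the `tocb` callback-normalisation idiom: accept either
# one zero-argument callable or a list of them, and always return
# a single zero-argument callable for the training loop to invoke.

def tocb(cb):
    """Normalise `cb` into a single zero-argument callable."""
    if callable(cb):
        return cb
    # A list of callbacks becomes one closure calling each in turn,
    # mirroring `() -> foreach(call, fs)` in the Julia version.
    return lambda: [f() for f in cb]

calls = []
cb = tocb([lambda: calls.append("a"), lambda: calls.append("b")])
cb()
print(calls)  # → ['a', 'b']
```

This lets `train!` treat the single-callback and multiple-callback cases uniformly, calling `cb()` once per datapoint either way.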