diff --git a/docs/src/models/basics.md b/docs/src/models/basics.md
index 06901d99..26c7da76 100644
--- a/docs/src/models/basics.md
+++ b/docs/src/models/basics.md
@@ -32,8 +32,6 @@ julia> gradient(f, [2, 1], [2, 0])
 But machine learning models can have *hundreds* of parameters! To handle this, Flux lets you work with collections of parameters, via `params`. You can get the gradient of all parameters used in a program without explicitly passing them in.
 
 ```jldoctest basics
-julia> using Flux
-
 julia> x = [2, 1];
 
 julia> y = [2, 0];
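
For context, the doctest this hunk trims builds toward Flux's implicit-parameters API: `gradient` called with a zero-argument closure and a `params(...)` collection, so you never pass the parameters explicitly. The sketch below is a minimal illustration of that pattern, not part of the patch; the `W`, `b`, and `loss` names are hypothetical and do not appear in this hunk, and `using Flux` is kept here only because the snippet is self-contained (the doctest drops it since the page loads Flux earlier).

```julia
using Flux

x = [2, 1]; y = [2, 0]

W = rand(2, 2); b = rand(2)            # hypothetical parameters for illustration
loss(x, y) = sum((W*x .+ b .- y).^2)   # simple squared-error loss

# Differentiate with respect to every tracked parameter,
# without passing W and b into the closure explicitly.
gs = gradient(() -> loss(x, y), params(W, b))

gs[W], gs[b]  # gradients are looked up by the parameter objects themselves
```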