# 1192: Improve `restructure` performance

r=dhairyagandhi96 a=MikeInnes

A small change, but it significantly improves performance on the following test case:

```julia
julia> VERSION
v"1.5.0-DEV.876"

julia> using Flux, DiffEqFlux, BenchmarkTools

julia> using Flux: mse

julia> fastdense = FastDense(784, 32, tanh);

julia> p = initial_params(fastdense);

julia> dense = Dense(784, 32, tanh);

julia> p, re = Flux.destructure(dense);

julia> x = rand(Float32, 784, 10);

julia> y = rand(Float32, 32, 10);

julia> @btime gradient((x, p) -> mse(fastdense(x, p), y), x, p);
  505.530 μs (87 allocations: 240.73 KiB)

julia> @btime gradient((x, p) -> mse(re(p)(x), y), x, p);
  107.796 μs (139 allocations: 340.94 KiB)
```

Co-authored-by: Mike J Innes <mike.j.innes@gmail.com>
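For context, `Flux.destructure` flattens a model's parameters into a single vector and returns a closure that rebuilds an equivalent model from any such vector — this is the `re(p)` call being benchmarked above. A minimal round-trip sketch (the small layer sizes here are arbitrary, chosen only for illustration):

```julia
using Flux

# Flatten a layer's parameters into a single vector `p`, and get back
# a function `re` that reconstructs the layer from such a vector.
m = Dense(3, 2, tanh)
p, re = Flux.destructure(m)   # p: flat parameter vector (weights then biases)

# `re(p)` rebuilds a model with the same architecture, reading its
# parameters from `p` — the rebuilt model computes identical outputs.
m2 = re(p)
x = rand(Float32, 3)
m(x) ≈ m2(x)
```

Because `re(p)` is called inside the loss, its cost (and that of its pullback) sits on the hot path of every gradient evaluation, which is why speeding it up matters.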
## README.md
Flux is an elegant approach to machine learning. It's a 100% pure-Julia stack, and provides lightweight abstractions on top of Julia's native GPU and AD support. Flux makes the easy things easy while remaining fully hackable.
```julia
] add Flux
```
See the documentation or the model zoo for examples.
If you use Flux in your research, please cite our work.