GPU Support
Support for array operations on other hardware backends, like GPUs, is provided by external packages like CuArrays. Flux is agnostic to array types, so we simply need to move model weights and data to the GPU and Flux will handle it.
For example, we can use CuArrays (with the cu converter) to run our basic example on an NVIDIA GPU. (Note that you need to build Julia 0.6 from source and have CUDA available to use CuArrays – please see the CUDAnative.jl instructions for more details.)
using CuArrays
W = cu(rand(2, 5)) # a 2×5 CuArray
b = cu(rand(2))
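Building on the arrays above, a minimal sketch of running a computation entirely on the GPU — the `predict` helper and the dummy input are illustrative, not part of the library:

```julia
using CuArrays

W = cu(rand(2, 5)) # a 2×5 CuArray
b = cu(rand(2))

# Hypothetical linear model; since W, b and x are all CuArrays,
# the matrix multiply and broadcast run on the GPU.
predict(x) = W * x .+ b

x = cu(rand(5)) # dummy input, moved to the GPU with cu
predict(x)      # returns a 2-element CuArray
```

Because Flux is agnostic to array types, the same pattern works for model weights: move them to the GPU with cu and everything downstream follows.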
Flux: The Julia Machine Learning Library
Flux is a library for machine learning. It comes "batteries-included" with many useful tools built in, but also lets you use the full power of the Julia language where you need it. The whole stack is implemented in clean Julia code (right down to the GPU kernels) and any part can be tweaked to your liking.
Installation
Install Julia 0.6.0 or later, if you haven't already.
Pkg.add("Flux")
# Optional but recommended
Pkg.update() # Keep your packages up to date
Pkg.test("Flux") # Check things installed correctly
Start with the basics. The model zoo is also a good starting point for many common kinds of models.
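As a taste of what "batteries-included" means, here is a minimal sketch of defining and applying a model, assuming the Flux 0.x-era Chain/Dense constructors; the layer sizes are arbitrary:

```julia
using Flux

# A small two-layer model: 10 inputs → 5 hidden units → 2 outputs,
# with a softmax over the final layer's activations.
m = Chain(
  Dense(10, 5, relu),
  Dense(5, 2),
  softmax)

m(rand(10)) # apply the model to a random input vector
```

Each layer is an ordinary Julia callable, so any part of the stack can be swapped for your own code.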