Mike J Innes 2018-06-26 14:25:24 +01:00
parent aa8f79f10c
commit 1490d87d83
2 changed files with 2 additions and 4 deletions


@@ -6,8 +6,6 @@
Flux is an elegant approach to machine learning. It's a 100% pure-Julia stack, and provides lightweight abstractions on top of Julia's native GPU and AD support. Flux makes the easy things easy while remaining fully hackable.
You need to build Julia 0.6 from source and have CUDA available to use Flux with GPUs; please see the [CUDAnative.jl](https://github.com/JuliaGPU/CUDAnative.jl) instructions for more details.
```julia
julia> Pkg.add("Flux")
```
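Once installed, defining and running a model takes only a few lines. A minimal sketch, assuming the `Chain` and `Dense` layers exported by Flux:

```julia
using Flux

# A tiny two-class classifier: one dense layer followed by softmax
model = Chain(Dense(10, 2), softmax)

# Apply the model to a random 10-element input vector
x = rand(10)
model(x)
```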


@@ -1,11 +1,11 @@
# GPU Support
You need to build Julia 0.6 from source and have CUDA available to use Flux with GPUs; please see the [CUDAnative.jl](https://github.com/JuliaGPU/CUDAnative.jl) instructions for more details.
Support for array operations on other hardware backends, like GPUs, is provided by external packages like [CuArrays](https://github.com/JuliaGPU/CuArrays.jl). Flux is agnostic to array types, so we simply need to move model weights and data to the GPU and Flux will handle it.
For example, we can use `CuArrays` (with the `cu` converter) to run our [basic example](models/basics.md) on an NVIDIA GPU.
(Note that you need to build Julia 0.6 from source and have CUDA available to use CuArrays; please see the [CUDAnative.jl](https://github.com/JuliaGPU/CUDAnative.jl) instructions for more details.)
```julia
using CuArrays