Commit Graph

2604 Commits

Author SHA1 Message Date
Dhairya Gandhi
49ea43e711 ZeroType => Zeros 2019-10-08 20:02:04 +05:30
bors[bot]
af0dcb2c63
Merge #882
882: Check if CUDA availability changed during init. r=MikeInnes a=maleadt

With this PR, Flux uses CUDAapi during initialization to check whether CUDA is available, and forces recompilation if that disagrees with what was decided during precompilation. This avoids the scenario where Flux is precompiled without GPU support and then can't use the GPU even after the user fixes their CUDA/GPU set-up, since fixing the set-up alone does not trigger recompilation (and we can't add precompilation dependencies on packages that don't exist).

However, we can't do the same for the case where a GPU and CUDA are present but CuArrays fails to import (checking whether it imports during `__init__` would be far too expensive, if possible at all), so this PR drops support for having CUDA/a GPU while CuArrays is broken. That's a little risky now that Flux depends on CuArrays, but the package is fairly mature and I haven't seen many recent bug reports about it failing to load.
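For illustration, a minimal sketch of the init-time check described above (the flag name `cuda_available` is invented here, and touching the package source is just one way to invalidate the precompile cache; the PR's actual mechanism may differ):
```julia
using CUDAapi

# Evaluated at precompile time: was CUDA available then?
const cuda_available = CUDAapi.has_cuda()

function __init__()
    # Re-check at load time; if availability changed, invalidate the
    # precompile cache (touching the source bumps its mtime) so the
    # next `using Flux` recompiles with the right configuration.
    if CUDAapi.has_cuda() != cuda_available
        touch(pathof(Flux))
        error("CUDA availability changed since Flux was precompiled; " *
              "please restart Julia and load Flux again.")
    end
end
```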

Fixes https://github.com/FluxML/Flux.jl/pull/852#issuecomment-538028314

cc @MikeInnes @xukai92

Co-authored-by: Tim Besard <tim.besard@gmail.com>
2019-10-08 13:24:49 +00:00
Dhairya Gandhi
95c5845e99 document bias switch 2019-10-08 17:54:01 +05:30
Dhairya Gandhi
b596faaffa tests bias switch 2019-10-08 17:18:39 +05:30
Dhairya Gandhi
040697fb2b add bias and weight kwarg 2019-10-08 17:18:19 +05:30
Dhairya Gandhi
f3904b4e04 add ZeroType back 2019-10-08 17:17:36 +05:30
Dhairya Gandhi
a1e826b888 fixes 2019-10-06 05:10:56 +05:30
Dhairya Gandhi
214f71f492 add N 2019-10-06 04:55:33 +05:30
Dhairya Gandhi
2ae3ad3b31 doc fixes 2019-10-06 04:46:13 +05:30
Dhairya Gandhi
d00f833c17 rm ZeroType 2019-10-06 04:44:50 +05:30
Dhairya Gandhi
e97d61f257 fixes 2019-10-06 04:42:26 +05:30
Dhairya Gandhi
48a305bd21 ditto remaining layers 2019-10-06 04:41:06 +05:30
Dhairya Gandhi
55ef7c1aba add weight and bias kwargs 2019-10-06 04:25:23 +05:30
Dhairya Gandhi
b503741651 expanded docstrings 2019-10-04 14:46:03 +05:30
Tim Besard
8aea15e6e0 Demote to const variables. 2019-10-03 21:28:55 +02:00
Tim Besard
2369b2b3fd Add an environment variable to disable CUDA usage. 2019-10-03 21:27:54 +02:00
Tim Besard
63d196aa37 Check if CUDA availability changed during init. 2019-10-03 20:05:32 +02:00
thebhatman
ec886c8ce8 Added docstring for hinge loss 2019-10-03 21:13:09 +05:30
Dhairya Gandhi
1fe321781b add to docs 2019-10-01 21:29:18 +05:30
Dhairya Gandhi
dced8c04e5 use ZeroType 2019-10-01 21:25:07 +05:30
bors[bot]
0d3aa8fa5e
Merge #877
877: Fix functor's `params!` to work with complex numbers r=MikeInnes a=PhilipVinc

I believe you forgot to define `params!` for complex-valued arrays.

If I'm wrong, feel free to close this.
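For context, the eventual fix (see the "Use <:Number" commit below) amounts to a method along these lines, where complex-valued arrays are collected as leaf parameters instead of being skipped (the exact signature here is an assumption):
```julia
# Any numeric array — real or complex — is a parameter leaf:
# push it onto the parameter list rather than recursing into it.
params!(p::Params, x::AbstractArray{<:Number}, seen = IdSet()) = push!(p, x)
```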

Co-authored-by: Filippo Vicentini <filippovicentini@gmail.com>
2019-10-01 15:11:55 +00:00
Manjunath Bhat
2b30319a55
Merge branch 'master' into patch-6 2019-09-30 21:05:02 +05:30
thebhatman
ec35e9cbaa Loss functions docs added in layers.md 2019-09-30 21:02:13 +05:30
thebhatman
6e289ef939 Merge branch 'patch-6' of https://github.com/thebhatman/Flux.jl into patch-6 2019-09-30 20:55:44 +05:30
Filippo Vicentini
606fe58854
Use <:Number 2019-09-29 12:33:02 +02:00
Filippo Vicentini
14e94c291e
Make it actually work 2019-09-29 12:28:01 +02:00
Filippo Vicentini
d91677f651
Fix params! to work with complex numbers 2019-09-29 12:23:41 +02:00
Dhairya Gandhi
8013c728b1 clearer optimiser docstrings 2019-09-28 16:09:00 +05:30
Dhairya Gandhi
0175485a80 fixup 2019-09-27 22:08:25 +05:30
Dhairya Gandhi
8bb0db7d0c opt docstrings 2019-09-27 22:04:53 +05:30
Dhairya Gandhi
32ac71734d optimiser interface docs 2019-09-27 21:43:59 +05:30
Dhairya Gandhi
a98a1b8bb5 fixes 2019-09-27 21:43:39 +05:30
bors[bot]
e2b93bc78a
Merge #874
874: Move CUDNN wrappers to CuArrays r=MikeInnes a=MikeInnes



Co-authored-by: Tim Besard <tim.besard@gmail.com>
Co-authored-by: Mike Innes <mike.j.innes@gmail.com>
2019-09-27 14:05:37 +00:00
Mike Innes
b90b02872f Merge branch 'master' into tb/cuarrays_dnn 2019-09-27 14:58:32 +01:00
Mike Innes
e287982b78 use CuArrays master 2019-09-27 14:55:30 +01:00
Mike Innes
691a29cf32 cudnn bug is fixed 2019-09-27 14:15:58 +01:00
Dhairya Gandhi
a801fcb9e7 docstrings 2019-09-27 12:07:55 +05:30
Dhairya Gandhi
9f2ac8fdef ditto remaining conv layers 2019-09-27 12:04:27 +05:30
Dhairya Gandhi
5ea6a33f44 make bias optional 2019-09-27 11:48:12 +05:30
Mike Innes
46bc8e5e64 move pullbacks to CuArrays 2019-09-26 17:14:18 +01:00
bors[bot]
12bc06136d
Merge #870
870: Fix printing of SkipConnection r=MikeInnes a=mcabbott

Before:
```
julia> SkipConnection(Dense(2,2),+)
SkipConnection(Error showing value of type SkipConnection:
ERROR: MethodError: no method matching iterate(::Dense{typeof(identity),TrackedArray{…,Array{Float32,2}},TrackedArray{…,Array{Float32,1}}})

julia> SkipConnection(Chain(Dense(2,3), Dense(3,2), LayerNorm(2)),+)
SkipConnection(Dense(2, 3), Dense(3, 2), LayerNorm(2))

julia> SkipConnection(Dense(2, 3), Dense(3, 2), LayerNorm(2))
ERROR: MethodError: no method matching SkipConnection(::Dense{typeof(identity),TrackedArray{…,Array{Float32,2}},TrackedArray{…,Array{Float32,1}}}, ::Dense{typeof(identity),TrackedArray{…,Array{Float32,2}},TrackedArray{…,Array{Float32,1}}}, ::LayerNorm{TrackedArray{…,Array{Float32,1}}})
```
After:
```
julia> SkipConnection(Dense(2,2),+)
SkipConnection(Dense(2, 2), +)

julia> SkipConnection(Chain(Dense(2,3), Dense(3,2), LayerNorm(2)),+)
SkipConnection(Chain(Dense(2, 3), Dense(3, 2), LayerNorm(2)), +)

julia> SkipConnection(Dense(2,2), (a,b) -> a .+ b./2)
SkipConnection(Dense(2, 2), #9)
```
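The fix amounts to a `Base.show` method along these lines (a sketch; the field names `layers` and `connection` are assumed from context):
```julia
function Base.show(io::IO, b::SkipConnection)
  # Print the wrapped layer(s) and the combining function explicitly,
  # instead of relying on iteration (which Dense doesn't support).
  print(io, "SkipConnection(", b.layers, ", ", b.connection, ")")
end
```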

Co-authored-by: Michael Abbott <32575566+mcabbott@users.noreply.github.com>
2019-09-25 14:09:28 +00:00
Michael Abbott
806e0c5c57 line 2019-09-25 15:20:13 +02:00
Michael Abbott
4245d9acad eg 2019-09-25 15:18:40 +02:00
Michael Abbott
2de84ce79f simplify 2019-09-25 13:59:32 +02:00
Michael Abbott
1a1a96571a +Chain 2019-09-25 13:47:29 +02:00
Michael Abbott
19830c71b1 fix printing of SkipConnection 2019-09-25 13:37:01 +02:00
bors[bot]
acb6a89245
Merge #865
865: Functor r=MikeInnes a=MikeInnes

This refactors our current `@treelike` infrastructure. It somewhat formalises what we're doing around the idea of a Flux model as a functor, i.e. something that can be mapped over.

This is much more flexible than what we had before, and avoids some issues. It allows layers to have state that isn't mappable; it allows for dispatch when walking the tree, which means layers like `BatchNorm` can have non-trainable parameters; and it also allows for zipped mapping like `fmap(+, xs, ys)`, which isn't implemented yet but will be useful for the new optimisers work.

The main downside is that the term `functor` has previously been used in the Julia community as a malapropism for "thing that behaves like a function", but hopefully this can start to reduce that usage.
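As a rough sketch of the functor idea, not the PR's exact code (the names `functor`, `isleaf`, and `fmap` follow the PR's terminology; the `Affine` example is invented):
```julia
# A functor exposes its children plus a function that rebuilds the
# value from (possibly modified) children. Leaves have no children.
functor(x) = ((), _ -> x)
functor(t::Tuple) = (t, identity)

isleaf(x) = functor(x)[1] === ()

# Map f over every leaf, rebuilding the structure on the way back up.
fmap(f, x) = isleaf(x) ? f(x) : begin
    children, rebuild = functor(x)
    rebuild(map(c -> fmap(f, c), children))
end

# An invented layer type, made mappable by defining `functor` for it:
struct Affine; W; b; end
functor(a::Affine) = ((a.W, a.b), c -> Affine(c...))

fmap(x -> 2 .* x, Affine([1.0 2.0], [0.0]))  # doubles both W and b
```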

Co-authored-by: Mike Innes <mike.j.innes@gmail.com>
2019-09-24 16:36:10 +00:00
bors[bot]
d57636fd48
Merge #861
861: GPU CI maintenance r=dhairyagandhi96 a=dhairyagandhi96



Co-authored-by: Dhairya Gandhi <dhairya@juliacopmuting.com>
2019-09-24 16:06:13 +00:00
Dhairya Gandhi
ce910da948 compat julia v1.0 2019-09-24 17:04:13 +05:30
Dhairya Gandhi
cf593a5744 revert to custom target 2019-09-24 16:43:48 +05:30