Commit Graph

42 Commits

janEbert
3dceef427f Fix binarycrossentropy on CuArrays 2019-11-08 16:48:11 +01:00
bors[bot]
645aa04464
Merge #898
898: Fix problem in crossentropy breaking GPU compilation r=MikeInnes a=kshyatt

Trying to run this simple example:
```julia
using Flux, CuArrays
using Flux: crossentropy
model = Chain(
        Dense(728, 128, σ),
        LSTM(128, 256),
        LSTM(256, 128),
        Dense(128, 10),
        softmax) |> gpu
data = [rand(728) for i in 1:100];
out  = [rand(10) for i in 1:100];
loss(x, y) = crossentropy(model(x), y);
Flux.train!(loss, params(model), zip(gpu.(data), gpu.(out)), ADAM())
```
Old version of `crossentropy`:
```
ERROR: GPU compilation of #23(CuArrays.CuKernelState, CUDAnative.CuDeviceArray{Float32,1,CUDAnative.AS.Global}, Base.Broadcast.Broadcasted{Nothing,Tuple{Base.OneTo{Int64}},typeof(*),Tuple{Base.Broadcast.Extruded{Array{Float32,1},Tuple{Bool},Tuple{Int64}},Base.Broadcast.Broadcasted{Base.Broadcast.ArrayStyle{CuArray},Nothing,typeof(conj),Tuple{Base.Broadcast.Extruded{CUDAnative.CuDeviceArray{Float32,1,CUDAnative.AS.Global},Tuple{Bool},Tuple{Int64}}}}}}) failed
KernelError: passing and using non-bitstype argument

Argument 4 to your kernel function is of type Base.Broadcast.Broadcasted{Nothing,Tuple{Base.OneTo{Int64}},typeof(*),Tuple{Base.Broadcast.Extruded{Array{Float32,1},Tuple{Bool},Tuple{Int64}},Base.Broadcast.Broadcasted{Base.Broadcast.ArrayStyle{CuArray},Nothing,typeof(conj),Tuple{Base.Broadcast.Extruded{CUDAnative.CuDeviceArray{Float32,1,CUDAnative.AS.Global},Tuple{Bool},Tuple{Int64}}}}}}.
That type is not isbits, and such arguments are only allowed when they are unused by the kernel.  .args is of type Tuple{Base.Broadcast.Extruded{Array{Float32,1},Tuple{Bool},Tuple{Int64}},Base.Broadcast.Broadcasted{Base.Broadcast.ArrayStyle{CuArray},Nothing,typeof(conj),Tuple{Base.Broadcast.Extruded{CUDAnative.CuDeviceArray{Float32,1,CUDAnative.AS.Global},Tuple{Bool},Tuple{Int64}}}}} which is not isbits.
    .1 is of type Base.Broadcast.Extruded{Array{Float32,1},Tuple{Bool},Tuple{Int64}} which is not isbits.
      .x is of type Array{Float32,1} which is not isbits.


Stacktrace:
 [1] check_invocation(::CUDAnative.CompilerJob, ::LLVM.Function) at /mnt/home/khyatt/.julia/dev/CUDAnative/src/compiler/validation.jl:70
 [2] macro expansion at /mnt/home/khyatt/.julia/dev/CUDAnative/src/compiler/driver.jl:187 [inlined]
 [3] macro expansion at /mnt/home/khyatt/.julia/packages/TimerOutputs/7zSea/src/TimerOutput.jl:216 [inlined]
 [4] #codegen#136(::Bool, ::Bool, ::Bool, ::Bool, ::Bool, ::typeof(CUDAnative.codegen), ::Symbol, ::CUDAnative.CompilerJob) at /mnt/home/khyatt/.julia/dev/CUDAnative/src/compiler/driver.jl:186
 [5] #codegen at ./none:0 [inlined]
 [6] #compile#135(::Bool, ::Bool, ::Bool, ::Bool, ::Bool, ::typeof(CUDAnative.compile), ::Symbol, ::CUDAnative.CompilerJob) at /mnt/home/khyatt/.julia/dev/CUDAnative/src/compiler/driver.jl:47
 [7] #compile#134 at ./none:0 [inlined]
 [8] #compile at ./none:0 [inlined] (repeats 2 times)
 [9] macro expansion at /mnt/home/khyatt/.julia/dev/CUDAnative/src/execution.jl:389 [inlined]
 [10] #cufunction#176(::Nothing, ::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}, ::typeof(CUDAnative.cufunction), ::GPUArrays.var"#23#24", ::Type{Tuple{CuArrays.CuKernelState,CUDAnative.CuDeviceArray{Float32,1,CUDAnative.AS.Global},Base.Broadcast.Broadcasted{Nothing,Tuple{Base.OneTo{Int64}},typeof(*),Tuple{Base.Broadcast.Extruded{Array{Float32,1},Tuple{Bool},Tuple{Int64}},Base.Broadcast.Broadcasted{Base.Broadcast.ArrayStyle{CuArray},Nothing,typeof(conj),Tuple{Base.Broadcast.Extruded{CUDAnative.CuDeviceArray{Float32,1,CUDAnative.AS.Global},Tuple{Bool},Tuple{Int64}}}}}}}}) at /mnt/home/khyatt/.julia/dev/CUDAnative/src/execution.jl:357
 [11] cufunction(::Function, ::Type) at /mnt/home/khyatt/.julia/dev/CUDAnative/src/execution.jl:357
 [12] macro expansion at /mnt/home/khyatt/.julia/dev/CUDAnative/src/execution.jl:174 [inlined]
 [13] macro expansion at ./gcutils.jl:91 [inlined]
 [14] macro expansion at /mnt/home/khyatt/.julia/dev/CUDAnative/src/execution.jl:171 [inlined]
 [15] _gpu_call(::CuArrays.CuArrayBackend, ::Function, ::CuArray{Float32,1}, ::Tuple{CuArray{Float32,1},Base.Broadcast.Broadcasted{Nothing,Tuple{Base.OneTo{Int64}},typeof(*),Tuple{Base.Broadcast.Extruded{Array{Float32,1},Tuple{Bool},Tuple{Int64}},Base.Broadcast.Broadcasted{Base.Broadcast.ArrayStyle{CuArray},Nothing,typeof(conj),Tuple{Base.Broadcast.Extruded{CuArray{Float32,1},Tuple{Bool},Tuple{Int64}}}}}}}, ::Tuple{Tuple{Int64},Tuple{Int64}}) at /mnt/home/khyatt/.julia/dev/CuArrays/src/gpuarray_interface.jl:60
 [16] gpu_call at /mnt/home/khyatt/.julia/dev/GPUArrays/src/abstract_gpu_interface.jl:151 [inlined]
 [17] gpu_call at /mnt/home/khyatt/.julia/dev/GPUArrays/src/abstract_gpu_interface.jl:128 [inlined]
 [18] copyto! at /mnt/home/khyatt/.julia/dev/GPUArrays/src/broadcast.jl:48 [inlined]
 [19] copyto! at ./broadcast.jl:863 [inlined]
 [20] copy at ./broadcast.jl:839 [inlined]
 [21] materialize at ./broadcast.jl:819 [inlined]
 [22] (::Zygote.var"#1310#1311"{CuArray{Float32,1},CuArray{Float32,1}})(::Array{Float32,1}) at /mnt/home/khyatt/.julia/dev/Zygote/src/lib/broadcast.jl:68
```
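The trace points at the root cause: argument 4 to the kernel contains a plain `Array{Float32,1}` (".x is of type Array{Float32,1} which is not isbits"), that is, a CPU array captured inside a broadcast over `CuArray`s, and non-isbits arguments cannot be passed to a CUDA kernel. A minimal sketch of that failure mode, assuming the same CuArrays-era API (variable names are illustrative):

```julia
using CuArrays

w = ones(Float32, 10)      # plain CPU Array, which is not isbits
ŷ = cu(ones(Float32, 10))  # CuArray living on the device

# Fusing the two into one broadcast asks CUDAnative to compile a kernel
# that captures `w`; since Array is not isbits, compilation fails with
# the same "passing and using non-bitstype argument" error as above.
w .* log.(ŷ)
```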
New version:
```
julia> Flux.train!(loss, params(model), zip(gpu.(data), gpu.(out)), ADAM())

julia> # everyone finished happily and went on with their lives
```
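The fix itself (commit e0c1c0e057 below) works by keeping the host-resident `weight` out of the fused broadcast in the common case. A sketch of the shape of that change, hedged as approximate rather than the exact diff: dispatch on the weight's type so the default, unweighted path broadcasts device arrays only, with the `1 // size(y, 2)` factor preserving the element type (compare c74aa67c5d below).

```julia
# Sketch: select an implementation by weight type, so the default
# (unweighted) path never captures a host array in the GPU broadcast.
crossentropy(ŷ, y; weight = nothing) = _crossentropy(ŷ, y, weight)

_crossentropy(ŷ, y, ::Nothing)              = -sum(y .* log.(ŷ)) * 1 // size(y, 2)
_crossentropy(ŷ, y, weight::Number)         = -sum(y .* log.(ŷ)) * weight * 1 // size(y, 2)
_crossentropy(ŷ, y, weight::AbstractVector) = -sum(y .* log.(ŷ) .* weight) * 1 // size(y, 2)
```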

Co-authored-by: Katharine Hyatt <khyatt@flatironinstitute.org>
2019-10-23 14:31:53 +00:00
Katharine Hyatt
e0c1c0e057 Fix problem in crossentropy breaking GPU compilation 2019-10-22 14:00:57 -04:00
Katharine Hyatt
b8b4bc48b9 Backticks and examples for normalise 2019-10-21 10:31:44 -04:00
Mike J Innes
c70276ddfe rm more deprecations 2019-05-02 18:57:52 -07:00
pshashk
b0a5844afb Remove dims=1 from normalise (#619)
* remove `dims=1`

* add dims arg

* fix test

* remove dims=1 only from deprecated version
2019-02-11 16:11:47 +00:00
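Taken together, the `normalise` commits in this stretch of the history (the `dims` kwarg, n-dimensional input, uncorrected standard deviation) converge on roughly the following definition; a sketch assuming the post-#619 keyword signature:

```julia
using Statistics

# Sketch: centre by the mean and scale by the uncorrected standard
# deviation along the requested dimensions.
normalise(x; dims = 1) =
    (x .- mean(x; dims = dims)) ./ std(x; dims = dims, corrected = false)
```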
pshashk
b074b2491a
fix docstring 2019-02-08 21:49:53 +03:00
pshashk
c3e04392d8
drop dims type restriction 2019-02-08 16:15:37 +03:00
pshashk
911c901294
dims kwarg 2019-02-08 16:00:32 +03:00
Moksh Jain
046f7b4eae fix std arguments in normalise 2019-02-05 18:36:04 +05:30
Moksh Jain
c6409d7686 add support for n-dimensional input to normalise layer 2019-02-05 17:09:22 +05:30
Kristoffer Carlsson
fd0f1c7a82
use uncorrected standard deviation in normalise
fixes https://github.com/FluxML/Flux.jl/issues/529
2019-01-30 17:42:19 +01:00
Mike J Innes
1cf37ab9eb rm some old deprecations 2019-01-25 09:54:32 +00:00
Kristoffer Carlsson
c74aa67c5d fix promotion by avoiding integer division in mse and crossentropy
oops

add tests
2019-01-15 14:15:05 +01:00
James Bradbury
e7783ace12 1.0 compat for normalise 2018-09-06 18:38:11 -07:00
Mike Innes
5a023a9ccc WIP 1.0 support
closes #353
2018-08-20 13:08:04 +01:00
Matthew Kelley
864d72eef5 Overload Base.eps() for TrackedReal 2018-06-26 23:55:43 -06:00
Matthew Kelley
0e95be3326 Call Flux.Tracker.data() on ŷ for bce 2018-06-26 14:48:51 -06:00
Matthew Kelley
ed032cdb1e Change epsilon value to eps(ŷ) 2018-06-26 12:29:06 -06:00
Matthew Kelley
e08fd7a6d2 Added epsilon term to binarycrossentropy 2018-06-26 11:43:16 -06:00
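The four commits above trace `binarycrossentropy` gaining a numerical-stability term; the resulting definition was roughly as follows (a sketch, with `ϵ` defaulting to `eps(ŷ)` per ed032cdb1e):

```julia
# Sketch: ϵ keeps both log arguments strictly positive, so ŷ values of
# exactly 0 or 1 no longer produce -Inf.
binarycrossentropy(ŷ, y; ϵ = eps(ŷ)) =
    -y * log(ŷ + ϵ) - (1 - y) * log(1 - ŷ + ϵ)
```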
Mike J Innes
8f73dc6e14 fix gpu cross entropy 2018-04-17 17:56:47 +01:00
Mike J Innes
8019f789f8 use normal log 2018-03-01 16:35:49 +00:00
Mike J Innes
ac57fc3c26 use @ fix in a few places 2018-03-01 16:31:20 +00:00
boathit
6e65789828 Register back! for logsigmoid and implement (logit)binarycrossentropy 2018-02-06 19:32:46 +08:00
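The logit variant introduced here evaluates binary cross entropy directly from the logit, which is more numerically stable than applying `σ` and then `log`. A sketch of the identity involved, assuming `logσ` from NNlib:

```julia
using NNlib: logσ

# Sketch: since log(σ(x)) = logσ(x) and log(1 - σ(x)) = logσ(x) - x,
# the loss -y*log(σ(x)) - (1-y)*log(1-σ(x)) simplifies to:
logitbinarycrossentropy(logŷ, y) = (1 - y) * logŷ - logσ(logŷ)
```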
Mike J Innes
e3a688e706 use kwarg 2017-12-13 15:27:15 +00:00
Elliot Saba
41446d547f Add weighted_crossentropy for imbalanced classification problems 2017-12-05 17:09:05 -08:00
Mike J Innes
dc1f08a709
Merge pull request #98 from FluxML/log
GPU-ready log function
2017-11-23 17:17:39 +00:00
Mike J Innes
b06884b912 LayerNorm tweaks 2017-11-21 16:32:36 +01:00
skariel
11d53781b2 adding layer normalization 2017-11-21 16:30:24 +01:00
Mike J Innes
e0657d93ec mv numeric.jl to nnlib 2017-11-09 15:06:29 +00:00
Mike J Innes
2cb94981a0 gpu-ready log 2017-11-09 15:04:01 +00:00
Mike J Innes
23674b2555 logitcrossentropy tweaks 2017-10-17 17:58:32 +01:00
pevnak
4aa7741ba9 logit cross entropy 2017-10-17 17:57:46 +01:00
Mike J Innes
6dff8ca8d3 rename crossentropy loss 2017-10-17 17:36:18 +01:00
Mike J Innes
949fd9ba97 loss function tweaks 2017-10-17 17:30:11 +01:00
Mike J Innes
f2052739c1 tweaks 2017-09-12 14:11:03 +01:00
Mike J Innes
9ce0439943 better mse 2017-08-24 11:40:51 +01:00
Mike J Innes
e4e9794f5e loss function gradients 2017-08-23 17:50:43 +01:00
Mike J Innes
ef681f16ea use nnlib for activations 2017-08-21 17:53:04 +01:00
Mike J Innes
18e69b33c9 forwarddiff does these 2017-08-19 22:05:50 +01:00
Mike J Innes
ad0e0ea5a7 explicitly broadcast sigmoid 2017-08-19 22:04:47 +01:00
Mike J Innes
4a9dc40e7c simplify organisation 2017-08-19 20:52:29 +01:00