Commit Graph

1713 Commits

Author SHA1 Message Date
cossio
1dbaf32810 DataLoader type inference tests 2020-06-16 13:32:27 +02:00
cossio
cb34bb848b simplify _getobs 2020-06-16 13:32:27 +02:00
cossio
75692161a7 Apply suggestions from code review
accept suggested changes

Co-authored-by: Carlo Lucibello <carlo.lucibello@gmail.com>
2020-06-16 13:32:27 +02:00
cossio
909a55ac10 news and docs 2020-06-16 13:32:27 +02:00
cossio
02ee6ba426 DataLoader with NamedTuple 2020-06-16 13:31:29 +02:00
Lyndon White
df84628c29 Require weight and bias to be AbstractArrays 2020-06-10 12:06:57 +01:00
bors[bot]
e1f80d4627
Merge #1213
1213: Fixing indentation in train! docstring r=CarloLucibello a=natema

One code block is not correctly displayed in the doc of [Flux.Optimise.train!](https://fluxml.ai/Flux.jl/stable/training/training/#Flux.Optimise.train!). Based on the previous code block, I guess it's an indentation problem.


Co-authored-by: natema <natema@users.noreply.github.com>
2020-06-08 18:29:46 +00:00
bors[bot]
a7bbd3d35b
Merge #1152
1152: extend dataloader r=CarloLucibello a=CarloLucibello

cf. the discussion in #1149. Currently the DataLoader interface supports

1. `for x in DataLoader(X)`
2. `for (x, y) in DataLoader(X, Y)`

This PR adds

3. `for (x,) in DataLoader((X,))`
4. `for (x, y) in DataLoader((X, Y))`

Edit:
the constructor in 2. is removed in this PR

Co-authored-by: CarloLucibello <carlo.lucibello@gmail.com>
2020-06-08 18:01:06 +00:00
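As a quick illustration of the tuple form added here (a minimal sketch; `X`, `Y` and the batch size are made up, and it assumes the post-#1152 `DataLoader` constructor):

```julia
using Flux: DataLoader

X = rand(Float32, 10, 100)   # 100 feature vectors of length 10
Y = rand(1:5, 100)           # 100 labels

# Single array: each iteration yields a batch of X
for x in DataLoader(X; batchsize = 16)
    @assert size(x, 1) == 10
end

# Tuple of arrays: each iteration yields a tuple of matching batches
for (x, y) in DataLoader((X, Y); batchsize = 16)
    @assert size(x, 2) == length(y)
end
```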
CarloLucibello
0cf46432cf cleanup 2020-06-08 19:59:34 +02:00
natema
70bbf18180
Fixing indentation in train! docstring
One code block is not correctly displayed in the doc of [Flux.Optimise.train!](https://fluxml.ai/Flux.jl/stable/training/training/#Flux.Optimise.train!). Based on the previous code block, I guess it's an indentation problem.
2020-06-07 15:44:04 +02:00
bors[bot]
d9b07475b0
Merge #1129
1129: Added dropgrad in huber_loss r=CarloLucibello a=HenriDeh

Workaround to prevent `iterate(::Nothing)` when working with CuArrays. See issue #1128.

Co-authored-by: HenriDeh <47037088+HenriDeh@users.noreply.github.com>
2020-06-06 17:21:19 +00:00
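The commit itself is not shown here, but the idea behind `dropgrad` can be sketched as follows (a hypothetical example, not the exact Flux code; `huber_like` is an illustrative name):

```julia
using Zygote
using Statistics: mean

# Keep the threshold mask out of differentiation, so Zygote never
# produces a `nothing` gradient that would then be iterated on the GPU.
function huber_like(ŷ, y; δ = 1f0)
    err = ŷ .- y
    mask = Zygote.dropgrad(abs.(err) .< δ)   # mask carries no gradient
    mean(mask .* err .^ 2 ./ 2 .+ .!mask .* δ .* (abs.(err) .- δ / 2))
end

gradient(ŷ -> huber_like(ŷ, zeros(Float32, 3)), Float32[0.1, 2.0, -3.0])
```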
bors[bot]
9ebbe8cb4c
Merge #1141
1141: Speedup matmul of CuMatrix and OneHotMatrix r=CarloLucibello a=AStupidBear

This solves #189.

```julia
julia> using Flux

julia> using Flux: CuArrays

julia> A = zeros(300, 10000) |> gpu;

julia> B = Flux.onehotbatch(rand(1:10000, 256), 1:10000) |> gpu;

julia> A * B; CuArrays.@time A * B;
┌ Warning: Performing scalar operations on GPU arrays: This is very slow, consider disallowing these operations with `allowscalar(false)`
└ @ GPUArrays ~/shared/.julia/packages/GPUArrays/OXvxB/src/host/indexing.jl:43
  0.002824 seconds (951 CPU allocations: 38.156 KiB) (2 GPU allocations: 301.000 KiB, 2.32% gc time of which 46.42% spent allocating)

julia> import Base: *

julia> A::AbstractMatrix * B::Flux.OneHotMatrix = @inbounds A[:, map(x->x.ix, B.data)]
* (generic function with 522 methods)

julia> A * B; CuArrays.@time A * B;
  0.000343 seconds (169 CPU allocations: 5.000 KiB) (2 GPU allocations: 301.000 KiB, 15.53% gc time of which 65.97% spent allocating)
```

Co-authored-by: Yao Lu <luyaocns@gmail.com>
2020-06-06 17:00:01 +00:00
CarloLucibello
a643cb6758 extend dataloader 2020-06-06 18:02:03 +02:00
natema
8f6aed5770
Fixing syntax in onehot docstring
`otherwise, it will error` -> `otherwise, it will raise an error`
2020-06-05 18:20:50 +02:00
Mike J Innes
089ec0832c improved restructure adjoint 2020-05-27 12:28:22 +01:00
bors[bot]
bd152ca099
Merge #1177
1177: Align ExpDecay implementation with documentation r=dhairyagandhi96 a=DrChainsaw

Fix for #1176 



Co-authored-by: DrChainsaw <Christian.kyril.skarby@gmail.com>
2020-05-21 14:33:20 +00:00
bors[bot]
87ba651add
Merge #1165
1165: Fix docstring of logitcrossentropy r=dhairyagandhi96 a=cossio

Since `y` is a logit, there is no log (see the diff).

Co-authored-by: cossio <cossio@users.noreply.github.com>
2020-05-19 11:07:15 +00:00
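Since the input is already a logit, `logitcrossentropy` applies no extra `log`. A quick sanity check of the relationship between the two losses (a sketch; array sizes are arbitrary):

```julia
using Flux
using Flux: crossentropy, logitcrossentropy, softmax, onehotbatch

ŷ = randn(Float32, 10, 8)             # raw logits, not probabilities
y = onehotbatch(rand(1:10, 8), 1:10)

# logitcrossentropy consumes logits directly; it matches crossentropy
# applied to the softmax of those logits.
logitcrossentropy(ŷ, y) ≈ crossentropy(softmax(ŷ), y)   # true
```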
bors[bot]
b6a5dd7152
Merge #1133
1133: add ClipValue and ClipNorm r=CarloLucibello a=AStupidBear



Co-authored-by: Yao Lu <luyaocns@gmail.com>
2020-05-15 17:15:07 +00:00
Yao Lu
007586858c fix export merge conflict 2020-05-14 17:13:35 +08:00
DrChainsaw
e8433d0abe Align ExpDecay implementation with documentation 2020-05-12 22:50:17 +02:00
Mike J Innes
f5a8900ffb xlogy broadcast adjoint 2020-05-12 17:29:35 +01:00
Mike J Innes
bd43201f37
fix logitcrossentropy doc string 2020-05-12 16:18:29 +01:00
bors[bot]
a84e08cf28
Merge #1174
1174: Functors r=MikeInnes a=MikeInnes

Just splits out the implementation to the [Functors](https://github.com/FluxML/Functors.jl) package, so the same traits can be used elsewhere (e.g. Optimisers.jl) without depending on all of Flux.

Co-authored-by: Mike J Innes <mike.j.innes@gmail.com>
2020-05-12 14:39:08 +00:00
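A minimal sketch of what the split-out package provides on its own (assumes only Functors.jl; `Affine` is an illustrative struct, not a Flux layer):

```julia
using Functors

struct Affine{W, B}
    weight::W
    bias::B
end
@functor Affine   # register the fields as traversable children

layer = Affine(randn(3, 3), zeros(3))

# fmap walks the registered fields and here scales every array leaf,
# without loading Flux at all.
scaled = fmap(x -> 2 .* x, layer)
```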
Yao Lu
5a9eb7411a cpu 2020-05-10 14:39:48 +08:00
Yao Lu
888f286c51 use @inbounds 2020-05-09 19:40:46 +08:00
Yao Lu
63cb70dd23 remove importing CuMatrix 2020-05-09 19:13:52 +08:00
Yao Lu
30648910c8 transfer onehot indices back to cpu 2020-05-09 19:10:46 +08:00
cossio
feb72d400a NaN 2020-05-07 12:44:32 +02:00
cossio
86d6555269 cufunc 2020-05-07 09:58:33 +02:00
cossio
8314200c51 generic 2020-05-05 19:23:05 +02:00
cossio
480473a81b xlogy 2020-05-05 18:33:50 +02:00
cossio
9e1fd883d5
Fix docstring of logitbinarycrossentropy and logitcrossentropy 2020-05-05 16:29:29 +02:00
Mike J Innes
8f877f2dbf quick fix 2020-05-01 14:22:46 +01:00
Dhairya Gandhi
29215fa5d7 comment on possible future deprecations 2020-04-29 16:17:44 +05:30
Dhairya Gandhi
534809ae78 move zeros to its own file 2020-04-29 16:15:35 +05:30
Dhairya Gandhi
5086c0f4f0 merge conflicts 2020-04-29 16:11:39 +05:30
Yao Lu
114f63a214 norm(Δ) 2020-04-26 17:28:07 +08:00
Yao Lu
eb6898ea19 speedup matmul of CuMatrix and OneHotMatrix 2020-04-25 23:22:46 +08:00
Yao Lu
7d6f711c6f Merge branch 'master' into clip 2020-04-25 22:18:58 +08:00
DrChainsaw
1544f84bb9 Fix merge conflicts 2020-04-24 21:56:26 +02:00
Yao Lu
58a72ec879 Merge branch 'master' of https://github.com/FluxML/Flux.jl into clip 2020-04-22 01:29:13 +08:00
Yao Lu
c4f5e83697 resolve conflict 2020-04-22 01:24:13 +08:00
Yao Lu
1dfec7f38b add test 2020-04-22 01:22:34 +08:00
Yao Lu
def19b058e simplify docstrings 2020-04-21 10:56:38 +08:00
Yao Lu
cc1dcd5590 rm requires 2020-04-20 20:02:29 +08:00
Yao Lu
68b84bba36 add LinearAlgebra 2020-04-20 19:54:44 +08:00
Yao Lu
ba0fca5a19 remove onehot 2020-04-20 19:45:15 +08:00
Yao Lu
b33c4b49be add ClipValue and ClipNorm 2020-04-20 19:41:10 +08:00
Yao Lu
427c55af92 speedup matmul of CuMatrix and OneHotMatrix 2020-04-20 19:11:57 +08:00
HenriDeh
ac94754281
Update stateless.jl 2020-04-18 13:23:11 +02:00