Commit Graph

598 Commits

Author SHA1 Message Date
Adarsh Kumar 659ba074d1 Updated test for msle 2020-02-06 01:21:51 +05:30
Adarsh Kumar 44a977b7a4 Added tests for new loss functions 2020-02-05 23:20:06 +05:30
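For context, a minimal sketch of the mean squared logarithmic error these tests cover (the exact definition and ϵ handling in Flux's exported `msle` may differ in detail):

```julia
using Statistics

# Mean squared logarithmic error; the ϵ term guards against log(0).
msle(ŷ, y; ϵ = eps(eltype(ŷ))) = mean((log.(ŷ .+ ϵ) .- log.(y .+ ϵ)) .^ 2)

msle([0.9, 1.5, 2.0], [1.0, 1.0, 2.0])  # penalises relative, not absolute, error
```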
Tim Besard e2c2ec5575 Don't invoke GPU crossentropy with integers.
Broadcasting log on integers does not work.
2020-01-31 08:22:54 +01:00
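A sketch of the pitfall this commit sidesteps, assuming the CuArrays-era GPU stack: `crossentropy` broadcasts `log` over its first argument, so integer GPU arrays must be promoted to a floating-point type first.

```julia
using Flux, CuArrays

ŷ_int = cu(rand(1:9, 5, 3))   # integer values on the GPU
y = cu(rand(Float32, 5, 3))

# log.(ŷ_int) fails for an integer CuArray, so promote before the loss:
loss = Flux.crossentropy(Float32.(ŷ_int) ./ 10f0, y)
```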
Tim Besard e66a7f130f Don't compare CPU with GPU arrays. 2020-01-31 08:22:21 +01:00
Chris Rackauckas 9803826a36 test restructure on the GPU
Requires https://github.com/FluxML/Zygote.jl/pull/474
2020-01-20 13:53:28 -05:00
Dhairya Gandhi 29ab410794 test gradients are allocated on the gpu 2020-01-17 15:52:26 +05:30
bors[bot] d1edd9b16d Merge #680
680: Added new loss functions. r=thebhatman a=thebhatman

I have added the KL divergence, Poisson, log-cosh, and hinge loss functions.

Co-authored-by: Manjunath Bhat <manjunathbhat9920@gmail.com>
Co-authored-by: thebhatman <manjunathbhat9920@gmail.com>
2020-01-13 15:46:25 +00:00
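Illustrative definitions of the four losses (sketches only; the versions Flux exports may differ in naming and normalisation):

```julia
using Statistics

kldiv(ŷ, y)   = sum(y .* log.(y ./ ŷ)) / size(y, 2)  # KL divergence per column
poisson(ŷ, y) = sum(ŷ .- y .* log.(ŷ)) / size(y, 2)  # Poisson loss
logcosh(ŷ, y) = mean(log.(cosh.(ŷ .- y)))            # smooth, L1-like for large errors
hinge(ŷ, y)   = mean(max.(0, 1 .- ŷ .* y))           # hinge, targets in {-1, 1}
```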
Manjunath Bhat 747e01ea02 Test to check for spurious promotions 2020-01-13 18:33:30 +05:30
Elliot Saba 0fdcc00923 Give `NNPACK` a bit of numerical leeway 2019-12-23 01:31:26 -08:00
Dhairya Gandhi b1e68813a8 cpu -> test_throws 2019-12-20 23:02:44 +05:30
Kyle Daruwalla 0cdd11c0dc Added tests for varying padding, stride, and dilation with outdims. 2019-12-07 14:05:50 -06:00
Kyle Daruwalla 6265b1fa39 Added tests for outdims 2019-12-05 22:54:25 -06:00
Dhairya Gandhi 9b6155c77d Merge branch 'master' into dg/gradtests 2019-12-05 18:17:47 +05:30
Dhairya Gandhi 76dc8ea9d4 formatting fixes 2019-12-05 18:14:04 +05:30
Dhairya Gandhi 717ad9328d add some grad tests on GPU 2019-12-05 18:12:23 +05:30
DrChainsaw 755536bf5e Merge remote-tracking branch 'upstream/master' into samepad 2019-12-04 23:45:03 +01:00
Dhairya Gandhi ec872bb579 test that bias has no grads with Zeros 2019-11-27 19:45:04 +05:30
bors[bot] 90a38a3201 Merge #937
937: Fix Glorot initialization, add He initialization r=MikeInnes a=Sleort

Should fix #442.
Adds He weight initialization as a bonus :-)

Co-authored-by: Troels Arnfred Bojesen <tr-ab@online.no>
2019-11-26 16:17:06 +00:00
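The gist of the fix, sketched for a weight matrix of size (fan_out, fan_in); the PR's real code also derives fan-in/fan-out for convolution kernels:

```julia
# Glorot scales by fan_in + fan_out; He scales by fan_in alone, which
# suits ReLU networks. Sketches, not the PR's exact code.
glorot_uniform(fan_out, fan_in) =
    (rand(Float32, fan_out, fan_in) .- 0.5f0) .* sqrt(24f0 / (fan_in + fan_out))
he_normal(fan_out, fan_in) =
    randn(Float32, fan_out, fan_in) .* sqrt(2f0 / fan_in)
```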
bors[bot] fb4a48f970 Merge #943
943: Fixes #900 r=MikeInnes a=dhairyagandhi96

Thoughts on the test?

cc @MikeInnes

Co-authored-by: Dhairya Gandhi <dhairya@juliacopmuting.com>
2019-11-26 15:09:27 +00:00
Dhairya Gandhi c031ae1a94 correct channel value 2019-11-24 13:31:31 +05:30
Dhairya Gandhi 5f21238d1a no grad dims helper 2019-11-24 13:25:02 +05:30
matsueushi a0314ce682 Fix logitbinarycrossentropy on CuArrays 2019-11-22 05:23:24 +00:00
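The numerically stable identity behind `logitbinarycrossentropy`, working directly on logits (a sketch; the commit itself concerned how this broadcasts over CuArrays):

```julia
using NNlib: logσ  # log-sigmoid; NNlib is a dependency of Flux

# Equivalent to binary crossentropy on σ(ŷ), but without exp/log overflow.
logitbce(ŷ, y) = (1 - y) * ŷ - logσ(ŷ)

logitbce.(randn(Float32, 4), Float32[0, 1, 0, 1])
```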
Troels Arnfred Bojesen af96a197c1 Fix Glorot initialization
Should fix #442
2019-11-20 13:20:42 +09:00
Troels Arnfred Bojesen 2b80573248 Fix Glorot initialization, add He initialization
Should fix #442.
Adds He weight initialization as a bonus :-)
2019-11-19 18:16:29 +09:00
Troels Arnfred Bojesen 4530ac65c7 Fix Glorot initialization, add He initialization
Should fix the issue reported at https://github.com/FluxML/Flux.jl/issues/442.
Adds He weight initialization as a bonus :-)
2019-11-19 16:50:40 +09:00
dsweber2 dea29532ef Merge branch 'master' into activations 2019-11-15 17:19:43 -08:00
bors[bot] 7eb6a0c98c Merge #932
932: Travis: test on 1.0 r=MikeInnes a=MikeInnes

Co-authored-by: Mike J Innes <mike.j.innes@gmail.com>
Co-authored-by: Mike Innes <mike.j.innes@gmail.com>
2019-11-15 16:21:30 +00:00
Mike Innes e24215ca98 guard test on 1.0 2019-11-15 15:59:42 +00:00
dsweber2 58c794702d simpler test 2019-11-14 14:05:53 -08:00
dsweber2 db92b0e3ce super simple test 2019-11-14 13:40:52 -08:00
DrChainsaw 453ecd1f24 Merge remote-tracking branch 'upstream/master' into samepad 2019-11-08 18:49:47 +01:00
janEbert a00d8d94ec Add test for CUDA binarycrossentropy 2019-11-08 17:28:54 +01:00
Tim Besard 33d276cdb7 Fix GPU-less tests. 2019-11-04 15:51:33 +01:00
Tim Besard 39ab740fb7 Check for CUDA availability at run time. 2019-11-02 11:18:06 +01:00
Katharine Hyatt 8913c9c741 Make the vector of weights test pass on GPU 2019-10-23 09:53:09 -04:00
Katharine Hyatt f7ce717aaa Add tests 2019-10-23 09:22:22 -04:00
DrChainsaw 411ce5dbd8 Add SamePad for pooling layers 2019-10-20 13:43:39 +02:00
DrChainsaw fc123d6279 Add SamePad for conv layers 2019-10-20 13:43:23 +02:00
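With these two commits, `pad = SamePad()` asks a layer to compute whatever padding keeps the spatial output size equal to the input size at stride 1 (at larger strides the output becomes `ceil(input / stride)`). A usage sketch against the API as it landed:

```julia
using Flux

c = Conv((3, 3), 1 => 8, relu, pad = SamePad())
p = MaxPool((2, 2), pad = SamePad())

size(c(rand(Float32, 28, 28, 1, 1)))     # (28, 28, 8, 1): size preserved at stride 1
size(p(c(rand(Float32, 28, 28, 1, 1))))  # (14, 14, 8, 1): ceil(28 / 2) per dimension
```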
Dhairya Gandhi 49ea43e711 ZeroType => Zeros 2019-10-08 20:02:04 +05:30
Dhairya Gandhi b596faaffa tests bias switch 2019-10-08 17:18:39 +05:30
Dhairya Gandhi 55ef7c1aba add weight and bias kwargs 2019-10-06 04:25:23 +05:30
Dhairya Gandhi dced8c04e5 use ZeroType 2019-10-01 21:25:07 +05:30
Manjunath Bhat 2b30319a55 Merge branch 'master' into patch-6 2019-09-30 21:05:02 +05:30
thebhatman 6e289ef939 Merge branch 'patch-6' of https://github.com/thebhatman/Flux.jl into patch-6 2019-09-30 20:55:44 +05:30
Mike Innes b90b02872f Merge branch 'master' into tb/cuarrays_dnn 2019-09-27 14:58:32 +01:00
Mike Innes 691a29cf32 cudnn bug is fixed 2019-09-27 14:15:58 +01:00
Dhairya Gandhi 5ea6a33f44 make bias optional 2019-09-27 11:48:12 +05:30
bors[bot] acb6a89245 Merge #865
865: Functor r=MikeInnes a=MikeInnes

This refactors our current `@treelike` infrastructure. It somewhat formalises what we're doing around the idea of a Flux model as a functor, i.e. something that can be mapped over.

This is much more flexible than what we had before, and avoids some issues. It allows layers to have state that isn't mappable; it allows for dispatch when walking the tree, which means layers like `BatchNorm` can have non-trainable parameters; and it also allows for zipped mapping like `fmap(+, xs, ys)`, which isn't implemented yet but will be useful for the new optimisers work.

The main downside is that the term `functor` has been previously used in the Julia community as a malapropism for "thing that behaves like a function"; but hopefully this can start to reduce that usage.

Co-authored-by: Mike Innes <mike.j.innes@gmail.com>
2019-09-24 16:36:10 +00:00
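A sketch of the functor idea described above, using `@functor` and `fmap` as they landed (the `Affine` layer here is purely illustrative):

```julia
using Flux

struct Affine{W, B}
    weight::W
    bias::B
end
Flux.@functor Affine             # declare the layer's contents mappable

(a::Affine)(x) = a.weight * x .+ a.bias

m = Affine(randn(Float32, 3, 2), zeros(Float32, 3))
doubled = fmap(x -> 2x, m)       # walk the tree, mapping over each array
```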
Dhairya Gandhi 822288d63d merge conflicts 2019-09-24 00:31:44 +05:30
Mike Innes b60df53ba1 pkg up 2019-09-19 18:33:33 +01:00
Mike Innes cabb81e30b internal rename 2019-09-19 15:53:31 +01:00
Mike Innes b951377426 fix normalisation layer params 2019-09-19 15:33:24 +01:00
Mike Innes f8d5d3b5fc broken normalisation layer params 2019-09-19 14:12:11 +01:00
Mike Innes c5e56b7e04 move setweights and copy_transpose 2019-09-17 17:22:35 +01:00
Mike Innes b348b20452 cudnn rnns + implicit gradients 2019-09-17 15:41:42 +01:00
Mike Innes fe57215b7e test fillarray gradients 2019-09-17 15:21:03 +01:00
Dhairya Gandhi b8d872d842 update to Flux 0.9+ 2019-09-11 21:11:02 +05:30
Mike Innes 250aef5a5a normalise test fixes 2019-09-10 16:19:55 +01:00
Mike Innes 877415be10 rm gradient checks 2019-09-10 15:35:52 +01:00
Mike Innes 221313c977 formatting changed on 1.1 2019-09-10 15:26:51 +01:00
Mike Innes c8d460ff84 doctests passing 2019-09-10 15:02:43 +01:00
Mike J Innes 67c38b3099 Merge branch 'master' into zygote 2019-09-06 15:18:58 +01:00
Mike J Innes 3c1ac84676 Merge pull request #842 from baggepinnen/patch-4
Add RADAM optimizer
2019-09-02 14:36:40 +01:00
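RADAM follows the constructor convention of Flux's other Adam-family optimisers (learning rate first, then the β pair):

```julia
using Flux

opt = RADAM(0.001, (0.9, 0.999))  # rectified Adam, as merged in #842
```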
Mike J Innes 61a8cfd6ee libcudnn check fix 2019-08-27 15:41:23 +01:00
Tim Besard 6ad3cdd138 Replace Requires with direct CuArrays dependency. 2019-08-27 09:33:15 +02:00
Mike Innes ee74f1a311 pkg up 2019-08-22 13:02:59 +01:00
Mike Innes 487000ac31 fix cuda code and tests 2019-08-19 16:56:48 +01:00
Mike Innes 2f7ad895aa test cleanups 2019-08-19 15:22:50 +01:00
Mike Innes 9590aa63e3 rm last uses of param/data 2019-08-19 15:14:42 +01:00
Fredrik Bagge Carlson 304b433daa Add RADAM to tests 2019-08-19 13:01:14 +08:00
thebhatman a128a7718d gradients test updated in cudnn 2019-07-16 17:27:35 +05:30
Manjunath Bhat 4ef5ec0005 brackets corrected 2019-07-12 21:03:57 +05:30
thebhatman 8d6028e27a tests with gradients 2019-07-12 20:47:43 +05:30
Mike Innes e2bf46b7fd gpu test fixes 2019-07-12 14:52:01 +01:00
Manjunath Bhat 2b379d0ec0 Allow scalar indexing, or the onehotbatch tests will fail 2019-07-12 17:56:47 +05:30
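Context for this commit, assuming the CuArrays-era API: the test suite disables scalar `getindex` on GPU arrays, and `onehotbatch` indexes element by element, so the tests opt back in:

```julia
using CuArrays

CuArrays.allowscalar(true)  # permit element-wise indexing of GPU arrays
```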
DrChainsaw 9b96a3d69b Change to array due to "type definition not allowed inside a local scope" 2019-07-09 01:15:55 +02:00
DrChainsaw 16d5f2bc24 Add x to seen in prefor to avoid infinite recursion if passed something self-referential 2019-07-08 23:11:35 +02:00
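The shape of that fix as a generic sketch (not Flux's internal `prefor`): record each node in a `seen` set before descending, so traversal of a self-referential structure terminates.

```julia
# Generic tree walk; `children` decides what to recurse into.
children(x) = ()
children(x::Union{Tuple, AbstractArray}) = x

function prefor(f, x; seen = Base.IdSet{Any}())
    x in seen && return
    push!(seen, x)               # the fix: remember x *before* recursing
    f(x)
    foreach(c -> prefor(f, c; seen = seen), children(x))
    return
end

v = Any[1, 2]; push!(v, v)       # a self-referential vector
prefor(println, v)               # terminates instead of overflowing the stack
```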
thebhatman 8292cfd81f Decay checking test added back 2019-07-03 00:30:16 +05:30
thebhatman 517219ba23 Renamed gradients test file 2019-07-02 16:13:42 +05:30
thebhatman 9f6793d63a Project.toml and Manifest updated 2019-07-02 12:16:24 +05:30
thebhatman 618f8a03c8 Hopefully the tests pass 2019-06-20 00:46:11 +05:30
thebhatman f1bf39977b nograd defined for sleep 2019-06-20 00:38:24 +05:30
thebhatman e6d5846e49 Temporary removal of Float16 test 2019-06-14 23:24:31 +05:30
thebhatman ce6a1bf84f Modifying tests in curnn.jl 2019-06-13 18:45:37 +05:30
thebhatman 80c680c598 Updated tests in cudnn.jl 2019-06-13 18:44:46 +05:30
thebhatman 25f74d1b4a Modified tests in cuda.jl 2019-06-13 18:44:17 +05:30
thebhatman 1ff4e3188e back on mse failing for Float16 2019-06-13 16:41:25 +05:30
thebhatman c7c0ee2cbc Resolving Merge Conflicts 2019-06-12 21:34:42 +05:30
thebhatman a56cfb73c3 BatchNorm test corrected 2019-06-11 20:34:48 +05:30
thebhatman f465665c73 Corrected test for asymmetric padding 2019-06-11 20:20:00 +05:30
thebhatman 94a2d1987d Updated tests of normalisation layers. 2019-06-11 20:05:07 +05:30
thebhatman a782524a0e Temporarily removed tests of cudnn and curnn. 2019-06-10 18:29:55 +05:30
thebhatman 0ddb5f0265 Tests for Optimisers supporting Zygote 2019-06-06 04:09:17 +05:30
Mike J Innes b98075817c Merge branch 'master' into DenseBlock 2019-06-05 14:27:47 +01:00
ayush-1506 98a027a505 typo 2019-05-14 02:56:12 -07:00
ayush-1506 bfc5bb0079 rebase 2019-05-14 02:53:48 -07:00
ayush-1506 0a2e288c3f another small test 2019-05-14 02:53:06 -07:00
ayush-1506 2161163a82 added crosscor 2019-05-14 02:52:28 -07:00
ayush-1506 7c28f7f883 Merge branch 'crosscor' of https://github.com/ayush-1506/Flux.jl into crosscor 2019-05-14 02:47:28 -07:00
Bruno Hebling Vieira c5fc2fb9a3 Added tests 2019-05-13 16:32:00 -03:00