Commit Graph

2081 Commits

Author SHA1 Message Date
Mike Innes
c9663c1e71 pkg up 2019-07-12 14:51:42 +01:00
Manjunath Bhat
2b379d0ec0 Allow scalar indexing, or onehotbatch tests will fail 2019-07-12 17:56:47 +05:30
Mike Innes
33c8d84a60 cuparam -> cuarray 2019-07-11 14:14:56 +01:00
Manjunath Bhat
11c9a8450c Remove active from GroupNorm 2019-07-11 18:40:48 +05:30
Mike Innes
c2cd7dab91 re-export gradient 2019-07-11 13:55:12 +01:00
thebhatman
cf5bc801d3 Check for nothing in update step 2019-07-08 19:22:23 +05:30
thebhatman
8d78b437ff Merge branch 'sf/zygote_updated' of https://github.com/thebhatman/Flux.jl 2019-07-08 18:47:17 +05:30
Mike J Innes
b3bba4c566 Merge pull request #801 from quatrejuin/master: Fix lack of x 2019-07-08 13:00:58 +01:00
thebhatman
812541f8d6 zeros replaced by fill to avoid nothing grad 2019-07-06 19:41:03 +05:30
thebhatman
8292cfd81f Decay checking test added back 2019-07-03 00:30:16 +05:30
Jason Wu
b24e05bb20 Fix lack of x 2019-07-02 13:15:54 -04:00
thebhatman
4e9f3deb7f Manifest updated with new Zygote version 2019-07-02 20:41:44 +05:30
thebhatman
3ee2a76f61 Removed .data from LSTMCell 2019-07-02 17:38:30 +05:30
thebhatman
517219ba23 Renamed gradients test file 2019-07-02 16:13:42 +05:30
thebhatman
9f6793d63a Project.toml and Manifest updated 2019-07-02 12:16:24 +05:30
Viral B. Shah
5689b39538 Create FUNDING.yml 2019-06-26 17:51:54 -04:00
Mike J Innes
e88440974b Merge pull request #796 from dhairyagandhi96/nadam: Pick beta from the state - NADAM 2019-06-19 22:18:56 +01:00
thebhatman
618f8a03c8 Hopefully the tests pass 2019-06-20 00:46:11 +05:30
thebhatman
f1bf39977b nograd defined for sleep 2019-06-20 00:38:24 +05:30
thebhatman
b194e7e3a8 Callback being called now 2019-06-20 00:37:54 +05:30
Dhairya Gandhi
dd9cdbef14 remove unnecessary call to beta 2019-06-16 19:09:50 +05:30
Dhairya Gandhi
67f18663d9 pick beta from state in NADAM 2019-06-16 19:06:59 +05:30
thebhatman
e6d5846e49 Temporary removal of Float16 test 2019-06-14 23:24:31 +05:30
thebhatman
7ab9d8ed3d Minor update 2019-06-13 18:59:03 +05:30
thebhatman
ce6a1bf84f Modifying tests in curnn.jl 2019-06-13 18:45:37 +05:30
thebhatman
80c680c598 Updated tests in cudnn.jl 2019-06-13 18:44:46 +05:30
thebhatman
25f74d1b4a Modified tests in cuda.jl 2019-06-13 18:44:17 +05:30
thebhatman
1ff4e3188e back on mse failing for Float16 2019-06-13 16:41:25 +05:30
thebhatman
ce11804dc1 CrossCor test passing, hopefully. 2019-06-13 01:21:58 +05:30
thebhatman
48ed93cdaa Silly error in Dropout corrected. 2019-06-12 23:16:15 +05:30
thebhatman
e9797408ec DepthwiseConv corrected again. 2019-06-12 23:01:51 +05:30
thebhatman
00a4f4c26d Correcting Dropout 2019-06-12 22:39:30 +05:30
thebhatman
bd7e3b1f41 Dropout with dims test passing. 2019-06-12 22:16:11 +05:30
thebhatman
c7c0ee2cbc Resolving Merge Conflicts 2019-06-12 21:34:42 +05:30
Dhairya Gandhi
b47238eb74 Merge pull request #793 from amellnik/typos: Two minor typos in docs 2019-06-12 11:31:06 +05:30
Alex Mellnik
e17999f19b Two minor typos 2019-06-11 22:09:59 -07:00
thebhatman
dfd2965e85 GroupNorm tests corrected 2019-06-11 22:32:54 +05:30
thebhatman
11073dcd25 GroupNorm made to use istraining() 2019-06-11 22:04:33 +05:30
thebhatman
a56cfb73c3 BatchNorm test corrected 2019-06-11 20:34:48 +05:30
thebhatman
f465665c73 Corrected test for asymmetric padding 2019-06-11 20:20:00 +05:30
thebhatman
94a2d1987d Updated tests of normalisation layers. 2019-06-11 20:05:07 +05:30
thebhatman
a782524a0e Temporarily removed tests of cudnn and curnn. 2019-06-10 18:29:55 +05:30
thebhatman
ef63f80644 No ops defined for param and data 2019-06-10 18:24:18 +05:30
thebhatman
0ddb5f0265 Tests for Optimisers supporting Zygote 2019-06-06 04:09:17 +05:30
bors[bot]
1902c0e7c5 Merge #446
446: Added the SkipConnection layer and constructor r=MikeInnes a=bhvieira

I added a DenseBlock constructor, which allows one to train DenseNets (you can train ResNets and MixNets with this as well; you need only change the connection, which is concatenation for DenseNets).

Disclaimer: I created the block for a 3D U-Net, so the assumption here is that whatever layer is inside the block, its output has the same spatial dimensions (i.e. all array dimensions excluding the channel and minibatch dimensions) as the input; otherwise the connection wouldn't match. I'm not sure this matches the topology of every DenseNet out there, but I suppose this is a good starting point.

No tests yet; I will add them as the PR evolves.

I'm open to suggestions! :)


Co-authored-by: Bruno Hebling Vieira <bruno.hebling.vieira@usp.br>
Co-authored-by: Mike J Innes <mike.j.innes@gmail.com>
2019-06-05 13:28:41 +00:00
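
For context, a minimal sketch of the skip-connection pattern this PR describes, assuming the SkipConnection(layer, connection) API that #446 introduces; the wrapped layer and input sizes here are illustrative, not taken from the PR:

    using Flux

    # SkipConnection wraps a layer and merges its output with the original
    # input via a binary `connection` function: connection(layer(x), x).
    # DenseNet-style blocks concatenate along the channel dimension;
    # a ResNet-style block would use `+` instead.
    dense_block = SkipConnection(
        Conv((3, 3), 4 => 4, relu; pad = 1),  # padding preserves spatial dims
        (mx, x) -> cat(mx, x; dims = 3)       # channel-wise concatenation
    )

    x = rand(Float32, 8, 8, 4, 1)   # H × W × C × N
    y = dense_block(x)              # size(y) == (8, 8, 8, 1): channels doubled

Because the connection combines the layer's output with the unchanged input, the wrapped layer must preserve the spatial dimensions, exactly the assumption the disclaimer above calls out.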
Mike J Innes
b98075817c Merge branch 'master' into DenseBlock 2019-06-05 14:27:47 +01:00
bors[bot]
8ee6af1bee Merge #762
762: CrossCor layer r=avik-pal a=ayush-1506

Same as #423 (which can't be edited, since I lost access to that GitHub account).

Co-authored-by: ayush-1506 <ayush.shridhar1506@gmail.com>
2019-05-14 10:36:22 +00:00
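
As context for this entry, a brief hedged sketch of the CrossCor layer added by #762, assuming its constructor mirrors Conv's (shapes are illustrative):

    using Flux

    # CrossCor has the same constructor shape as Conv but computes
    # cross-correlation, i.e. the kernel is not flipped before being
    # slid over the input.
    layer = CrossCor((3, 3), 1 => 8, relu)

    x = rand(Float32, 28, 28, 1, 4)   # H × W × C × N
    y = layer(x)                      # size(y) == (26, 26, 8, 4) with no padding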
ayush-1506
98a027a505 typo 2019-05-14 02:56:12 -07:00
ayush-1506
bfc5bb0079 rebase 2019-05-14 02:53:48 -07:00
ayush-1506
f263f0c8ed add to layer docs 2019-05-14 02:53:06 -07:00