Commit Graph

2222 Commits

Author SHA1 Message Date
thebhatman
4e9f3deb7f Manifest updated with new Zygote version 2019-07-02 20:41:44 +05:30
thebhatman
3ee2a76f61 Removed .data from LSTMCell 2019-07-02 17:38:30 +05:30
thebhatman
517219ba23 Renamed gradients test file 2019-07-02 16:13:42 +05:30
thebhatman
9f6793d63a Project.toml and Manifest updated 2019-07-02 12:16:24 +05:30
Viral B. Shah
5689b39538 Create FUNDING.yml 2019-06-26 17:51:54 -04:00
Mike J Innes
e88440974b Merge pull request #796 from dhairyagandhi96/nadam
Pick beta from the state - NADAM
2019-06-19 22:18:56 +01:00
thebhatman
618f8a03c8 Hopefully the tests pass 2019-06-20 00:46:11 +05:30
thebhatman
f1bf39977b nograd defined for sleep 2019-06-20 00:38:24 +05:30
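
For context on the `nograd` commit above: `sleep` has no meaningful derivative, so it is declared non-differentiable to keep Zygote from trying to trace it. A minimal sketch of what such a declaration looks like, using Zygote's `@nograd` macro from this era:

```julia
using Zygote

# `sleep` has no gradient; marking it non-differentiable lets it appear
# inside differentiated code without Zygote erroring while building the
# pullback.
Zygote.@nograd sleep

# The sleep call is simply ignored by the gradient.
gradient(x -> (sleep(0.01); 2x), 3.0)   # (2.0,)
```
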
thebhatman
b194e7e3a8 Callback being called now 2019-06-20 00:37:54 +05:30
Dhairya Gandhi
dd9cdbef14 remove unnecessary call to beta 2019-06-16 19:09:50 +05:30
Dhairya Gandhi
67f18663d9 pick beta from state in NADAM 2019-06-16 19:06:59 +05:30
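
The two NADAM commits above fix where the running powers of the decay rates come from: they must live in the per-parameter optimiser state and be advanced every step, not re-read from the constant `beta` field. A minimal sketch of the idea (illustrative names, not Flux's exact internals):

```julia
mutable struct NADAMSketch
  eta::Float64
  beta::Tuple{Float64,Float64}
  state::IdDict
end
NADAMSketch(η = 0.001, β = (0.9, 0.999)) = NADAMSketch(η, β, IdDict())

function update!(o::NADAMSketch, x, Δ)
  η, (β1, β2) = o.eta, o.beta
  ϵ = 1e-8
  # First moment, second moment, and the *current powers* of β1/β2 all
  # come from the per-parameter state.
  mt, vt, (β1p, β2p) = get!(o.state, x, (zero(x), zero(x), o.beta))
  @. mt = β1 * mt + (1 - β1) * Δ
  @. vt = β2 * vt + (1 - β2) * Δ^2
  @. Δ = η * (β1 * mt / (1 - β1 * β1p) + (1 - β1) * Δ / (1 - β1p)) /
         (sqrt(vt / (1 - β2p)) + ϵ)
  # Advance the powers for the next step -- the part the fix restores.
  o.state[x] = (mt, vt, (β1p * β1, β2p * β2))
  return Δ
end
```
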
thebhatman
e6d5846e49 Temporary removal of Float16 test 2019-06-14 23:24:31 +05:30
thebhatman
7ab9d8ed3d Minor update 2019-06-13 18:59:03 +05:30
thebhatman
ce6a1bf84f Modifying tests in curnn.jl 2019-06-13 18:45:37 +05:30
thebhatman
80c680c598 Updated tests in cudnn.jl 2019-06-13 18:44:46 +05:30
thebhatman
25f74d1b4a Modified tests in cuda.jl 2019-06-13 18:44:17 +05:30
thebhatman
1ff4e3188e `back` on `mse` failing for Float16 2019-06-13 16:41:25 +05:30
thebhatman
ce11804dc1 CrossCor test passing, hopefully. 2019-06-13 01:21:58 +05:30
thebhatman
48ed93cdaa Silly error in Dropout corrected. 2019-06-12 23:16:15 +05:30
thebhatman
e9797408ec DepthwiseConv corrected again. 2019-06-12 23:01:51 +05:30
thebhatman
00a4f4c26d Correcting Dropout 2019-06-12 22:39:30 +05:30
thebhatman
bd7e3b1f41 Dropout with dims test passing. 2019-06-12 22:16:11 +05:30
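
The Dropout commits above cover support for restricting the dropout mask to given dimensions. A short usage sketch, assuming the `dims` keyword and `trainmode!` as they exist in later Flux releases:

```julia
using Flux

# dims = 3 ties the mask across dims 1 and 2, so whole feature maps
# (channels) are dropped together: "spatial" dropout for WHCN arrays.
d = Dropout(0.5; dims = 3)
Flux.trainmode!(d)               # force the stochastic (training) behaviour

x = ones(Float32, 4, 4, 8, 2)
y = d(x)                         # each channel is all 0 or all 2.0 (scaled by 1/(1-p))
```
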
thebhatman
c7c0ee2cbc Resolving Merge Conflicts 2019-06-12 21:34:42 +05:30
Dhairya Gandhi
b47238eb74 Merge pull request #793 from amellnik/typos
Two minor typos in docs
2019-06-12 11:31:06 +05:30
Alex Mellnik
e17999f19b Two minor typos 2019-06-11 22:09:59 -07:00
thebhatman
dfd2965e85 GroupNorm tests corrected 2019-06-11 22:32:54 +05:30
thebhatman
11073dcd25 GroupNorm made to use istraining() 2019-06-11 22:04:33 +05:30
thebhatman
a56cfb73c3 BatchNorm test corrected 2019-06-11 20:34:48 +05:30
thebhatman
f465665c73 Corrected test for asymmetric padding 2019-06-11 20:20:00 +05:30
thebhatman
94a2d1987d Updated tests of normalisation layers. 2019-06-11 20:05:07 +05:30
thebhatman
a782524a0e Temporarily removed tests of cudnn and curnn. 2019-06-10 18:29:55 +05:30
thebhatman
ef63f80644 No ops defined for param and data 2019-06-10 18:24:18 +05:30
thebhatman
0ddb5f0265 Tests for Optimisers supporting Zygote 2019-06-06 04:09:17 +05:30
bors[bot]
1902c0e7c5 Merge #446
446: Added the SkipConnection layer and constructor r=MikeInnes a=bhvieira

I added a DenseBlock constructor, which allows one to train DenseNets (you can train ResNets and MixNets with this as well; you only need to change the connection, which is concatenation for DenseNets).

Disclaimer: I created the block for a 3D U-Net, so the assumption here is that whatever layer is inside the block produces output with the same spatial dimensions (i.e. all array dimensions excluding the channel and minibatch dimensions) as its input; otherwise the connection wouldn't match. I'm not sure this matches the topology of every DenseNet out there, but I suppose this is a good starting point.

No tests yet; I will add them as the PR evolves.

I'm open to suggestions! :)


Co-authored-by: Bruno Hebling Vieira <bruno.hebling.vieira@usp.br>
Co-authored-by: Mike J Innes <mike.j.innes@gmail.com>
2019-06-05 13:28:41 +00:00
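
For context, the layer merged here wraps a block together with a binary `connection` and applies `connection(layers(x), x)` on the forward pass. A brief usage sketch of the two styles the description mentions (concatenation for DenseNets, addition for ResNets):

```julia
using Flux

# DenseNet-style: concatenate output and input along the channel
# dimension (dim 3 for WHCN arrays).
dense_block = SkipConnection(Conv((3, 3), 16 => 16, relu; pad = 1),
                             (mx, x) -> cat(mx, x, dims = 3))

# ResNet-style: the same wrapper with addition as the connection.
res_block = SkipConnection(Conv((3, 3), 16 => 16, relu; pad = 1), +)

x = rand(Float32, 32, 32, 16, 4)   # width × height × channels × batch
size(dense_block(x))               # (32, 32, 32, 4): channels doubled by `cat`
size(res_block(x))                 # (32, 32, 16, 4)
```
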
Mike J Innes
b98075817c Merge branch 'master' into DenseBlock 2019-06-05 14:27:47 +01:00
Lyndon White
fe759ac43c Update docs/src/performance.md
Co-Authored-By: Kristoffer Carlsson <kristoffer.carlsson@chalmers.se>
2019-05-28 14:19:56 +01:00
bors[bot]
8ee6af1bee Merge #762
762: CrossCor layer r=avik-pal a=ayush-1506

Same as #423 (which couldn't be edited, since I lost access to that GitHub account).

Co-authored-by: ayush-1506 <ayush.shridhar1506@gmail.com>
2019-05-14 10:36:22 +00:00
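
The `CrossCor` layer merged here follows `Conv`'s constructor but performs true cross-correlation, i.e. it slides the kernel over the input without flipping it. A brief usage sketch:

```julia
using Flux

layer = CrossCor((3, 3), 1 => 16, relu)   # same interface as Conv

x = rand(Float32, 28, 28, 1, 8)           # WHCN input
size(layer(x))                            # (26, 26, 16, 8)
```
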
ayush-1506
98a027a505 typo 2019-05-14 02:56:12 -07:00
ayush-1506
bfc5bb0079 rebase 2019-05-14 02:53:48 -07:00
ayush-1506
f263f0c8ed add to layer docs 2019-05-14 02:53:06 -07:00
ayush-1506
0a2e288c3f another small test 2019-05-14 02:53:06 -07:00
ayush-1506
2161163a82 added crosscor 2019-05-14 02:52:28 -07:00
ayush-1506
451b80da3d add to layer docs 2019-05-14 02:50:18 -07:00
ayush-1506
7c28f7f883 Merge branch 'crosscor' of https://github.com/ayush-1506/Flux.jl into crosscor 2019-05-14 02:47:28 -07:00
Bruno Hebling Vieira
6b3cd825b9 Added SkipConnection to docs, tentatively under Other General Purpose Layers 2019-05-13 16:43:14 -03:00
Bruno Hebling Vieira
796a2957c9 Added news and removed type annotation from SkipConnection structure 2019-05-13 16:33:31 -03:00
Bruno Hebling Vieira
c5fc2fb9a3 Added tests 2019-05-13 16:32:00 -03:00
Bruno Hebling Vieira
e7d76b8423 Added the SkipConnection layer and constructor
Added missing export

Corrected channel placement

Dimension 4 cannot be assumed to always be the Channel dimension

Deprecation of `treelike`

Code now makes use of the `@treelike` macro instead of the deprecated `treelike` function (it worked on my end because I'm on Julia 0.7, while Julia 1.0 deprecated it)

Update basic.jl

Renaming to SkipConnection

* Update Flux.jl

* Update basic.jl

Updated `SkipConnection` with a `connection` field

I'm pretty sure I broke something now, but this PR should follow along these lines: `cat` needs special treatment (the user can declare their own `concatenate` connection, but I foresee it's going to be used often, so we can simply define special treatment for it)

Forgot to remove some rebasing text

Forgot to remove some more rebasing text

Removed local copy and default cat method from the function calls

Adjusted some more types for inference, could improve on this as well

Re-placed some left-over spaces
2019-05-13 16:32:00 -03:00
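
The structure this commit describes reduces to a very small definition: store the wrapped layers and the `connection`, and call the connection on the forward pass. A minimal sketch (illustrative, not Flux's exact source):

```julia
# The block stores any callable `layers` plus a binary `connection`.
# Nothing here assumes dimension 4 (or any dimension) is the channel
# dimension; the user's connection, e.g. (a, b) -> cat(a, b, dims = 3),
# decides that.
struct SkipConnection{T,F}
  layers::T
  connection::F
end

(s::SkipConnection)(x) = s.connection(s.layers(x), x)
```
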
Dhairya Gandhi
308b199bd0 Merge pull request #774 from zsz00/patch-1
typo of comvolutional in NEWS.md
2019-05-14 00:37:17 +05:30
zy
a27be0f9ec typo of comvolutional
comvolutional -> convolutional
2019-05-14 01:24:45 +08:00