Commit Graph

46 Commits

Author SHA1 Message Date
Shreyas
671aed963e Made a few fixes. Added tests 2019-03-28 00:51:50 +05:30
Shreyas
595f1cf6eb Made Requested Changes 2019-03-26 21:42:49 +05:30
Manjunath Bhat
c6e51f5cc2 Made lambda and alpha of eltype(x) 2019-03-07 23:42:38 +05:30
Manjunath Bhat
47c1324476 Merge branch 'master' into patch-3 2019-03-07 23:08:40 +05:30
Manjunath Bhat
1d310d4532 Removed {typeof(p)} 2019-03-07 21:55:26 +05:30
thebhatman
f4543b7adf Value of alpha updated and dot operations changed 2019-03-08 03:21:26 +05:30
David Pollack
7b9b64f1cb change IN to in 2019-03-07 09:46:44 +01:00
David Pollack
83b4b3a714 changes based on PR comments 2019-03-07 09:46:44 +01:00
David Pollack
c41f891005 changes based on the improved batchnorm in PR#633 2019-03-07 09:46:44 +01:00
David Pollack
129a708b6f instance normalization 2019-03-07 09:46:44 +01:00
thebhatman
8e5965ac41 Indentation fixed 2019-03-05 16:28:05 +05:30
thebhatman
d6608682fc Suggested changes made 2019-03-05 16:18:50 +05:30
Manjunath Bhat
29b853e0bb Made sure Gradients are not lost. 2019-03-04 22:17:19 +05:30
Manjunath Bhat
97f874abcf Added AlphaDropout which is used in SNNs. 2019-03-04 01:05:46 +05:30
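For context on the AlphaDropout commits above: alpha dropout (from the Self-Normalizing Neural Networks paper) sets dropped activations to a fixed negative saturation value and then applies an affine correction so the output keeps roughly zero mean and unit variance, which is what SELU networks rely on. Below is a minimal sketch, not Flux's actual implementation; the name `alphadropout_sketch` and the argument `p` (drop probability) are illustrative, and the SELU constants are cast to `eltype(x)` in the spirit of the "Made lambda and alpha of eltype(x)" commit.

```julia
# Illustrative alpha-dropout sketch (not Flux's implementation).
# `p` is the drop probability; `q = 1 - p` is the keep probability.
function alphadropout_sketch(x, p)
    T = eltype(x)
    λ, α = T(1.0507009873554805), T(1.6732632423543772)  # SELU constants
    α1 = -λ * α                      # value dropped activations are set to
    q  = T(1 - p)                    # keep probability
    mask = rand(T, size(x)...) .< q  # Bernoulli(q) keep mask
    y = x .* mask .+ α1 .* (1 .- mask)
    a = (q + α1^2 * q * (1 - q))^(-T(0.5))  # affine correction restores
    b = -a * (1 - q) * α1                   # approx. zero mean / unit variance
    return a .* y .+ b
end
```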
Sklan
7463f09591 Update normalise.jl 2019-02-21 23:56:19 +05:30
Sklan
6044421c5c Update normalise.jl 2019-02-20 13:47:31 +05:30
KristofferC
9914c531f6 work around extreme slowdown due to julia performance bug 2019-02-06 16:19:29 +01:00
Avik Pal
bb72c528e1 Change initializers to Float32 2019-01-24 18:43:39 +05:30
Avik Pal
02efc264e7 Fix unintentional change to spaces 2018-11-08 19:12:38 +05:30
Avik Pal
7d06f654f0 Fix tests 2018-09-11 16:58:05 +05:30
Avik Pal
c4f87ff15c Minor fixes: 2018-09-11 16:21:55 +05:30
Avik Pal
7e83852862 Fixes 2018-09-11 15:58:17 +05:30
Avik Pal
8bea60d980 Merge branch 'master' into cudnn_batchnorm 2018-09-11 15:34:25 +05:30
Avik Pal
d3c78a80be Fix layers errors 2018-08-11 17:20:27 +05:30
Avik Pal
4bd13c448f Add updates for julia0.7 2018-08-11 15:23:40 +05:30
Avik Pal
3b448ce1ac Merge branch 'master' into cudnn_batchnorm 2018-08-11 15:02:55 +05:30
Mike J Innes
7103a0ed7d tweaks 2018-08-03 15:19:10 +01:00
pevnak
926411a449 removed most errors; the only one in the Fallbacks test persists 2018-08-03 15:14:25 +01:00
pevnak
3510c837a8 zeros replaced by zero 2018-08-03 15:14:25 +01:00
Mike J Innes
70718e7a64 update treelike 2018-08-03 13:02:47 +01:00
Mike J Innes
88a265154c deprecations 2018-08-03 12:54:31 +01:00
Avik Pal
7dd5ec16c9 Fix 2018-07-17 11:22:12 +05:30
Avik Pal
646db81f94 Pull BatchNorm CPU updates 2018-07-17 09:24:38 +05:30
CarloLucibello
071dcdda87 update docs 2018-07-16 07:32:13 +02:00
CarloLucibello
185e9148b6 fix cpu batchnorm 2018-07-16 07:11:33 +02:00
Avik Pal
681d8c4dfc Remove cache 2018-06-28 12:11:32 +05:30
Avik Pal
8f43258ab7 Get the batchnorm working without cache 2018-06-28 12:04:25 +05:30
Avik Pal
4916c8e6da Add treelike for now 2018-06-27 14:54:49 +05:30
Avik Pal
3339ad5181 Integrate cudnn BatchNorm with Flux 2018-06-20 15:50:30 +05:30
Avik Pal
714ca23aba Change default value of epsilon to prevent CuDNN BatchNorm warnings 2018-06-20 12:11:22 +05:30
James Bradbury
af12f006f2 Use broadcast for dropout 2018-05-20 04:04:33 -07:00
Should be fast enough on GPU now that it's not going to be an optimization target again for a while. Hopefully isn't meaningfully slower on CPU?
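The "Use broadcast for dropout" change above refers to building the dropout mask with plain elementwise broadcasting, so the same definition works on CPU arrays and GPU arrays without a hand-written kernel. A minimal sketch under that assumption; `dropout_broadcast`, `x`, and the drop probability `p` are illustrative names, not Flux's API.

```julia
# Broadcast-only dropout sketch (illustrative, not Flux's implementation).
# `p` is the drop probability; kept activations are rescaled by 1/(1 - p)
# so the expected value of the output matches the input.
function dropout_broadcast(x, p)
    keep = rand(eltype(x), size(x)...) .> p  # elementwise keep mask
    return x .* keep ./ (1 - p)              # broadcasted mask and rescale
end
```

Because every operation is a broadcast, array backends such as CuArrays can fuse and run it directly, which is the point the commit description makes about GPU performance.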
Mike J Innes
baff20514d gpu broadcast fix 2018-04-17 18:05:58 +01:00
Mike Innes
d12fb98f2a nicer batchnorm shape error 2018-04-15 20:29:25 +01:00
Mike J Innes
5fd240f525 interface tweaks 2018-04-15 20:04:42 +01:00
Mike J Innes
73a0be3e04 Merge branch 'master' into pull-request/07b0f95d 2018-04-15 17:10:29 +01:00
Mike J Innes
cb3ae8df6a rename normalise.jl 2018-04-15 15:45:46 +01:00