Commit Graph

2478 Commits

Author SHA1 Message Date
Mike J Innes
00cfe24d66 fix cat 2018-08-03 13:02:47 +01:00
Mike J Innes
adc216f182 fix broadcasting 2018-08-03 12:56:32 +01:00
Mike J Innes
e486c50610 fix data 2018-08-03 12:56:31 +01:00
Mike J Innes
fb8a220659 fix matmul 2018-08-03 12:56:31 +01:00
Mike J Innes
7057ca739e fix std usage 2018-08-03 12:56:27 +01:00
Mike J Innes
88a265154c deprecations 2018-08-03 12:54:31 +01:00
Mike J Innes
b18b51656c requires update 2018-08-03 12:54:24 +01:00
Mike J Innes
a49e2eae41 deprecated Void 2018-08-03 12:53:52 +01:00
Mike J Innes
1fd49c2a90 fix array show 2018-08-03 12:53:52 +01:00
Yueh-Hua Tu
5b37319289 Add Maxpool and Meanpool 2018-08-01 00:10:53 +08:00
Mike J Innes
a8ccc79f61 perf hacks 2018-07-30 20:08:44 +01:00
Avik Pal
2cc0f112f1 Updates 2018-07-27 20:12:49 +05:30
Avik Pal
7b2982493a Merge branch 'master' of https://github.com/FluxML/Flux.jl 2018-07-21 09:58:24 +05:30
Mike J Innes
c565317d9e Merge pull request #328 from jjerphan/patch-1 2018-07-19 12:35:58 +01:00
    Very Little typo.
Julien Jerphanion
34d0c39e72 Ditto. 2018-07-19 00:14:02 +02:00
Julien Jerphanion
ee630a8566 Very Little typo. 2018-07-18 23:20:43 +02:00
Avik Pal
b4626c20be Merge branch 'master' of https://github.com/FluxML/Flux.jl 2018-07-17 18:50:23 +05:30
Avik Pal
7dd5ec16c9 Fix 2018-07-17 11:22:12 +05:30
Avik Pal
531ecccd38 Error statement 2018-07-17 10:14:23 +05:30
Avik Pal
4035641f00 Remove imports 2018-07-17 10:06:26 +05:30
Avik Pal
8874d9cccd Fix GPU test 2018-07-17 09:53:39 +05:30
Avik Pal
da7fe93b31 Fix test 2018-07-17 09:47:45 +05:30
Avik Pal
0bb3eaa1f6 Update CUDNN Batchnorm with new Flux AD 2018-07-17 09:40:20 +05:30
Avik Pal
646db81f94 Pull BatchNorm CPU updates 2018-07-17 09:24:38 +05:30
CarloLucibello
071dcdda87 update docs 2018-07-16 07:32:13 +02:00
CarloLucibello
185e9148b6 fix cpu batchnorm 2018-07-16 07:11:33 +02:00
Avik Pal
f57db22abe Remove unnecessary file 2018-07-13 14:27:04 +05:30
Avik Pal
2664a16556 Update as per new AD 2018-07-13 14:12:46 +05:30
Avik Pal
0aabf9d86b Merge branch 'master' into depthwiseconv 2018-07-13 14:04:19 +05:30
Mike J Innes
a0fd91b866 Merge pull request #307 from jarvist/master 2018-07-11 19:12:58 +01:00
    Add ADAMW "Fixing Weight Decay Regularization in Adam"
Mike J Innes
6d8e6c0440 Merge pull request #313 from FluxML/ad-overhaul 2018-07-11 15:33:02 +01:00
    AD Overhaul
Mike J Innes
dda51a0140 update docs 2018-07-11 15:31:22 +01:00
Mike Innes
10a169bb77 update cudnn rnn 2018-07-10 18:16:37 +01:00
Mike J Innes
70b5efeb4e basic nested AD 2018-07-10 09:03:09 +01:00
Mike J Innes
80af9a3830 broadcast efficiency 2018-07-09 23:40:07 +01:00
Mike J Innes
e763c342ee shave some memory 2018-07-09 19:44:14 +01:00
Mike J Innes
1430053b69 checkpoints 2018-07-09 17:52:34 +01:00
Mike J Innes
7778d17884 functional API 2018-07-09 16:57:44 +01:00
Mike J Innes
5e319c7395 fix gradient definitions 2018-07-09 13:39:10 +01:00
Mike J Innes
41b9412439 new grad api 2018-07-09 13:36:46 +01:00
Avik Pal
84f977c804 Remove comment 2018-07-09 13:35:30 +05:30
Avik Pal
b239fc684e Update tests 2018-07-04 18:57:43 +05:30
Avik Pal
c38d4edef7 Merge branch 'master' of https://github.com/FluxML/Flux.jl 2018-07-04 07:31:45 +05:30
Jarvist Moore Frost
344a750770 Merge branch 'master' of github.com:jarvist/Flux.jl into HEAD 2018-07-03 11:15:43 +01:00
Jarvist Moore Frost
aee4a83c55 Add ADAMW weight-decay. 2018-07-03 11:11:32 +01:00
    See http://www.fast.ai/2018/07/02/adam-weight-decay/ and the original
    paper https://arxiv.org/abs/1711.05101.pdf for context.

    I don't know what I'm doing, and this is quite possibly wrong - but on
    a simple Char-RNN I have lying around on my harddisk, this seems to
    improve the rate of learning consistently for different hyperparameters
    vs. standard ADAM with the same decay constant.
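For context on what this commit introduces: below is a minimal standalone sketch of the decoupled weight-decay rule from the paper linked in the message (Loshchilov & Hutter, "Fixing Weight Decay Regularization in Adam"). The names `ADAMWState` and `adamw_step!` are hypothetical illustrations, not the Flux.jl optimiser API added by this commit; the hyperparameter defaults are the usual Adam choices.

```julia
# Sketch of ADAMW's decoupled weight decay: the decay term is applied
# directly to the parameters after the adaptive Adam step, instead of
# being folded into the gradient as plain L2 regularisation.
# Hypothetical standalone code, not the Flux.jl implementation.

mutable struct ADAMWState
    mt::Vector{Float64}   # first-moment (mean) estimate
    vt::Vector{Float64}   # second-moment (uncentered variance) estimate
    t::Int                # step counter for bias correction
end

ADAMWState(n::Int) = ADAMWState(zeros(n), zeros(n), 0)

function adamw_step!(θ::Vector{Float64}, g::Vector{Float64}, s::ADAMWState;
                     η = 1e-3, β1 = 0.9, β2 = 0.999, ϵ = 1e-8, wd = 1e-2)
    s.t += 1
    @. s.mt = β1 * s.mt + (1 - β1) * g        # update moment estimates
    @. s.vt = β2 * s.vt + (1 - β2) * g^2
    m̂ = s.mt ./ (1 - β1^s.t)                  # bias-corrected moments
    v̂ = s.vt ./ (1 - β2^s.t)
    # Adam step plus *decoupled* weight decay on the parameters themselves:
    @. θ -= η * (m̂ / (sqrt(v̂) + ϵ) + wd * θ)
    return θ
end

# Usage on a toy quadratic loss sum(abs2, θ), whose gradient is 2θ:
θ = randn(10)
s = ADAMWState(length(θ))
for _ in 1:100
    adamw_step!(θ, 2θ, s)
end
```

The design point the commit message alludes to: with standard Adam, an L2 penalty added to the gradient gets rescaled by the adaptive denominator, so the effective decay varies per parameter; decoupling it, as above, keeps the decay rate uniform.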
Mike J Innes
ce88273880 gradient hook 2018-07-02 13:19:13 +01:00
Mike Innes
5d8b63dc65 avoid implementation details in docs 2018-06-29 13:53:50 +01:00
Avik Pal
e3b10691d2 make cache optional param 2018-06-28 15:27:59 +05:30
Avik Pal
bcf094451c Fix typo 2018-06-28 14:45:35 +05:30
Avik Pal
d0b79e71e2 fix load error 2018-06-28 14:27:50 +05:30