Commit Graph

62 Commits

Author SHA1 Message Date
Mike J Innes
4f73e434a4
Merge pull request #935 from baggepinnen/patch-4
Fix AMSGrad on GPU
2019-11-19 12:58:37 +00:00
Fredrik Bagge Carlson
2da22f31f0
Avoid unnecessary conversion
This initialization works on both CPU and GPU
2019-11-19 16:31:04 +08:00
Fredrik Bagge Carlson
df7ffb0ef8
Fix AMSGrad on GPU
The previous initialization created a CPU array. Now, the same type of array as `x` is created.
2019-11-19 16:27:44 +08:00
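The pattern behind this fix, sketched below in plain Julia: `zeros(size(x))` always allocates a CPU `Array{Float64}`, whereas `zero(x)` yields an array of the same type (and hence the same device) as `x`. This is an illustration of the idea, not the actual Flux code.

```julia
# Illustrative only: initialising optimiser state from the parameter array
# itself keeps the state on the same device as the parameters.
x = rand(Float32, 3)        # stands in for a parameter array (CPU or GPU)

bad  = zeros(size(x))       # always a CPU Array{Float64}, whatever x is
good = zero(x)              # same array type and eltype as x

@assert typeof(good) == typeof(x)   # also holds when x is a CuArray
```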
Mike Innes
7ead2d6c7b typo 2019-10-22 13:36:39 +01:00
Dhairya Gandhi
4477dd8d54 reviews 2019-10-10 20:27:11 +05:30
Dhairya Gandhi
f19066ee29 more docstrings 2019-10-10 16:48:12 +05:30
Dhairya Gandhi
fe52689cfe in depth docstrings 2019-10-09 16:16:11 +05:30
Dhairya Gandhi
b503741651 expanded docstrings 2019-10-04 14:46:03 +05:30
Dhairya Gandhi
8013c728b1 clearer optimiser docstrings 2019-09-28 16:09:00 +05:30
Dhairya Gandhi
0175485a80 fixup 2019-09-27 22:08:25 +05:30
Dhairya Gandhi
8bb0db7d0c opt docstrings 2019-09-27 22:04:53 +05:30
Mike J Innes
67c38b3099 Merge branch 'master' into zygote 2019-09-06 15:18:58 +01:00
Fredrik Bagge Carlson
ebbad0d135
Add RADAM optimizer 2019-08-19 12:22:32 +08:00
Christopher Rackauckas
ed12d4e7c0
Momentum doesn't need params 2019-07-31 17:56:51 -04:00
thebhatman
8d78b437ff Merge branch 'sf/zygote_updated' of https://github.com/thebhatman/Flux.jl 2019-07-08 18:47:17 +05:30
Dhairya Gandhi
dd9cdbef14 remove unnecessary call to beta 2019-06-16 19:09:50 +05:30
Dhairya Gandhi
67f18663d9 pick beta from state in NADAM 2019-06-16 19:06:59 +05:30
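A minimal sketch of the bookkeeping these two NADAM commits touch: the running decay products β₁ᵗ and β₂ᵗ live in the per-parameter optimiser state and are advanced once per step instead of being recomputed. Names are illustrative, not Flux's actual internals.

```julia
# Illustrative NADAM state handling; a sketch, not the Flux implementation.
β = (0.9, 0.999)
state = Dict{Symbol,NTuple{2,Float64}}()
x = :some_param                       # stand-in for a parameter array

β1p, β2p = get!(state, x, β)          # read decay products β1^t, β2^t
# ... β1p and β2p feed the bias-correction terms of the update ...
state[x] = (β1p * β[1], β2p * β[2])   # advance the products for next step
```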
Mike J Innes
c313be8e95 rm data/param 2019-05-02 18:52:09 -07:00
Dhairya Gandhi
221670a2b1
Merge pull request #733 from thebhatman/expdecay-fix
Fixed ExpDecay
2019-05-01 18:58:37 +05:30
Hossein Pourbozorg
7f06b15f67 use https instead of http for web links 2019-04-25 11:04:03 +00:00
thebhatman
31a50ab16a Fixed ExpDecay 2019-04-11 17:28:06 +05:30
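The commit message doesn't spell out the bug, but the schedule ExpDecay implements can be sketched: every `s` steps the learning rate is multiplied by `decay`, floored at `clip`. The function below illustrates those semantics under assumed defaults; it is not Flux's exact code.

```julia
# Illustrative exponential learning-rate decay; a sketch, not Flux's ExpDecay.
eta_at(t; η0 = 0.01, decay = 0.1, s = 1000, clip = 1e-4) =
    max(η0 * decay^fld(t, s), clip)

eta_at(0)      # 0.01
eta_at(1000)   # 0.001
eta_at(5000)   # 1e-4 (clipped)
```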
Mike Innes
4cf43c0c41 simpler/nicer training loop 2019-02-28 14:58:42 +00:00
Mike J Innes
0f2975d905 update -> apply 2019-01-28 13:59:23 +00:00
Mike J Innes
f6397e7358
Merge pull request #517 from FluxML/fix_adamw
Fix decay argument in ADAMW
2019-01-18 10:06:23 +00:00
Mike J Innes
f0d5624ed2
Merge pull request #493 from dhairyagandhi96/master
[WIP] New Optimiser Docs
2019-01-10 11:10:38 +00:00
Dhairya Gandhi
e48268ff06 fix argument name in ADAMW 2018-12-12 16:47:42 +05:30
Dhairya Gandhi
1ea8c5a293 [WIP] add docstrings and doc improvements 2018-11-12 19:17:10 +05:30
Joel Mason
29832aca92 Move some epsilons about 2018-11-02 22:59:04 +11:00
Mike J Innes
bffaceee02 tweaks 2018-10-31 14:58:55 +00:00
Dhairya Gandhi
bebf4eb95f fixed ExpDecay update! rule 2018-10-29 23:12:24 +05:30
Dhairya Gandhi
32ce2d78b8 fixed ExpDecay test 2018-10-27 19:53:06 +05:30
Dhairya Gandhi
815e8c206d decay fixes 2018-10-27 19:26:42 +05:30
Dhairya Gandhi
1f0f2a5ac2 fixed DescentWeightDecay parameters 2018-10-11 10:21:29 +05:30
Dhairya Gandhi
d8394298bb fix merge conflicts 2018-10-11 10:15:59 +05:30
Dhairya Gandhi
fe8c147f72 fixed weight decay definition 2018-10-11 10:07:16 +05:30
Mike Innes
bfe85e65f1 compose tweaks 2018-10-05 13:52:26 +01:00
Mike Innes
0f2019eba5 compose tweaks 2018-10-05 12:57:03 +01:00
Mike Innes
9bc9771a8d tweaks 2018-10-05 12:43:03 +01:00
Dhairya Gandhi
b661db3797 added deprecations and compose 2018-10-01 05:30:53 +05:30
Dhairya Gandhi
6665189ff1 added remaining optimizers and tests 2018-09-16 17:34:51 +05:30
Dhairya Gandhi
63bc71698b updated tests 2018-09-14 20:32:56 +05:30
Dhairya Gandhi
d933f2079b pulled tracker from upstream 2018-09-11 18:30:24 +05:30
Mike J Innes
a2d2d068aa initial sketch 2018-08-28 17:55:59 +05:30
pevnak
3510c837a8 zeros replaced by zero 2018-08-03 15:14:25 +01:00
Jarvist Moore Frost
344a750770 Merge branch 'master' of github.com:jarvist/Flux.jl into HEAD 2018-07-03 11:15:43 +01:00
Jarvist Moore Frost
aee4a83c55 Add ADAMW weight-decay.
See http://www.fast.ai/2018/07/02/adam-weight-decay/ and the original
paper https://arxiv.org/abs/1711.05101 for context.

I don't know what I'm doing, and this is quite possibly wrong, but on a
simple Char-RNN I have lying around on my hard disk this seems to
improve the rate of learning consistently across different
hyperparameters vs. standard ADAM with the same decay constant.
2018-07-03 11:11:32 +01:00
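The commit message carries the key idea; the sketch below makes it concrete. In Adam with plain L2 regularisation, the decay term λ·w is added to the gradient and then rescaled by the adaptive denominator; in ADAMW the decay is applied to the weights directly. All names and defaults here are illustrative, not Flux's API.

```julia
# Hedged sketch of decoupled weight decay (arXiv:1711.05101); illustrative only.
function adamw_step!(w, g, m, v, t; η = 1e-3, β1 = 0.9, β2 = 0.999,
                     ϵ = 1e-8, λ = 1e-2)
    @. m = β1 * m + (1 - β1) * g              # first moment, as in Adam
    @. v = β2 * v + (1 - β2) * g^2            # second moment, as in Adam
    m̂ = m ./ (1 - β1^t)                       # bias corrections
    v̂ = v ./ (1 - β2^t)
    @. w -= η * m̂ / (√v̂ + ϵ) + η * λ * w      # decay acts on w directly,
    return w                                  # bypassing the adaptive scaling
end

w, g = randn(4), randn(4)
m, v = zero(w), zero(w)
adamw_step!(w, g, m, v, 1)
```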
Tejan Karmali
4a24b69976
Merge branch 'master' into nadam-opt 2018-06-08 16:54:41 +05:30
Sujeet Akula
8c042bd522
element wise max() 2018-04-26 21:12:31 +10:00
Sujeet Akula
b6508e2416
add adamax 2018-04-26 17:37:24 +10:00
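These two commits concern AdaMax's infinity-norm accumulator, which must be updated with an element-wise maximum (a broadcast `max.`, not scalar `max`). The step below is an illustrative rendering of the AdaMax rule, not Flux's implementation.

```julia
# Illustrative AdaMax step; a sketch, not Flux's actual code.
function adamax_step!(w, g, m, u, t; η = 2e-3, β1 = 0.9, β2 = 0.999, ϵ = 1e-8)
    @. m = β1 * m + (1 - β1) * g     # first moment, as in Adam
    @. u = max(β2 * u, abs(g))       # infinity-norm accumulator: an
                                     # element-wise max via @. broadcasting
    @. w -= (η / (1 - β1^t)) * m / (u + ϵ)
    return w
end
```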
tejank10
65847bb745 moved epsilon into sqrt 2018-04-04 15:25:20 +05:30
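Spelled out, the change named in this last commit moves the ϵ guard from outside the square root to inside it, so a zero second moment is padded before the root is taken. A sketch of the two forms, with illustrative values:

```julia
v, ϵ = 0.0, 1e-8
outside = sqrt(v) + ϵ   # ϵ added after taking the root
inside  = sqrt(v + ϵ)   # ϵ added under the root, as the commit describes
```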