Dhairya Gandhi
d933f2079b
pulled tracker from upstream
2018-09-11 18:30:24 +05:30
Mike J Innes
a2d2d068aa
initial sketch
2018-08-28 17:55:59 +05:30
Mike Innes
2ca189bc96
newlines
2018-08-28 10:54:50 +01:00
Dhairya Gandhi
2f1a9847fa
deprecate :stop from optimizers; housekeeping
2018-08-22 21:25:26 +05:30
Dhairya Gandhi
a7ad620f01
exporting stop
2018-08-22 00:33:30 +05:30
Dhairya Gandhi
ed044e2df7
changes as requested
2018-08-21 23:22:20 +05:30
Dhairya Gandhi
394b4167ce
moving stop to Optimise
2018-08-20 13:43:08 +05:30
Avik Pal
5db7a3a3ad
Fix Optimizers
2018-08-11 18:23:47 +05:30
pevnak
3510c837a8
zeros replaced by zero
2018-08-03 15:14:25 +01:00
Jarvist Moore Frost
344a750770
Merge branch 'master' of github.com:jarvist/Flux.jl into HEAD
2018-07-03 11:15:43 +01:00
Jarvist Moore Frost
aee4a83c55
Add ADAMW weight-decay.
See http://www.fast.ai/2018/07/02/adam-weight-decay/ and the original
paper https://arxiv.org/abs/1711.05101.pdf for context.
I don't know what I'm doing, and this is quite possibly wrong, but on
a simple Char-RNN I have lying around on my hard disk, this seems to
improve the rate of learning consistently across different hyperparameters
versus standard ADAM with the same decay constant.
(A hedged sketch of the decoupled update follows this entry.)
2018-07-03 11:11:32 +01:00
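For context on what this commit refers to: the "decoupled" weight decay of Loshchilov & Hutter (arXiv:1711.05101) applies the decay directly to the weights instead of adding an L2 term to the gradient. The sketch below is a minimal, self-contained Julia illustration of that update rule under those assumptions; it is not Flux's actual ADAMW implementation, and the names (`AdamWState`, `adamw_step!`, the default `lr` and `wd` values) are illustrative only.

```julia
# Illustrative sketch of one AdamW-style step (decoupled weight decay),
# following arXiv:1711.05101. Not Flux's implementation; names are made up.

mutable struct AdamWState
    m::Vector{Float64}   # first-moment (mean) estimate
    v::Vector{Float64}   # second-moment (uncentered variance) estimate
    t::Int               # timestep, used for bias correction
end

AdamWState(n::Int) = AdamWState(zeros(n), zeros(n), 0)

function adamw_step!(w, g, s::AdamWState;
                     lr=1e-3, beta1=0.9, beta2=0.999, epsilon=1e-8, wd=1e-2)
    s.t += 1
    @. s.m = beta1 * s.m + (1 - beta1) * g        # update first moment
    @. s.v = beta2 * s.v + (1 - beta2) * g^2      # update second moment
    mhat = s.m ./ (1 - beta1^s.t)                 # bias-corrected moments
    vhat = s.v ./ (1 - beta2^s.t)
    # Adam step plus weight decay applied directly to the weights,
    # not folded into the gradient: this is the "decoupled" part.
    @. w -= lr * (mhat / (sqrt(vhat) + epsilon) + wd * w)
    return w
end

# Toy usage: one step on a quadratic loss, purely for demonstration.
w  = randn(10)
g  = 2 .* w                     # gradient of sum(abs2, w)
st = AdamWState(length(w))
adamw_step!(w, g, st)
```

The design point the commit message alludes to is exactly the last update line: with plain ADAM plus L2 regularization, the decay term would pass through the adaptive `sqrt(vhat)` scaling, whereas here it is applied at a fixed rate `lr * wd` regardless of the gradient history.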
Tejan Karmali
4a24b69976
Merge branch 'master' into nadam-opt
2018-06-08 16:54:41 +05:30
CarloLucibello
e186b958dd
more exports
2018-05-01 12:13:14 +01:00
Sujeet Akula
5e5f255f81
export typo
2018-04-26 17:42:04 +10:00
Sujeet Akula
4586bda5ab
export/test adamax
2018-04-26 17:40:11 +10:00
tejank10
ea9b5471fa
NADAM optimizer
2018-04-03 01:27:22 +05:30
baggepinnen
36001d085a
Implement AMSGrad optimiser
2017-12-04 09:17:05 +01:00
Mike J Innes
5b6a5667ed
tracked array restructure
2017-10-18 22:54:58 +01:00
GenaBitu
ef6d10886d
Exposed all optimisers
2017-10-06 14:20:09 +01:00
Mike J Innes
4bafa2b374
generic tree functions
2017-09-27 21:11:21 +01:00
Mike J Innes
cca4d25a10
efficient traversal
2017-09-06 23:09:32 -04:00
Mike J Innes
387686eb41
optimisers rework
2017-09-01 17:06:51 -04:00
Mike J Innes
b95dae1868
opt refactor
2017-08-31 14:55:23 -04:00
Mike J Innes
12dc6b66c5
whoops
2017-08-24 22:23:05 +01:00
Mike J Innes
e7f26370d7
training tweaks
2017-08-24 16:10:04 +01:00
Mike J Innes
1526b13691
basic training loop
2017-08-24 11:42:29 +01:00
Mike J Innes
bafecfede1
sgd
2017-08-22 22:25:18 +01:00
Mike J Innes
0ce8c0cee4
param collection
2017-08-22 17:13:03 +01:00