Commit Graph

38 Commits

Date                        SHA1        Author                 Message
2017-12-08 18:20:53 +00:00  24a6569589  Mike J Innes           Merge branch 'master' into amsgrad
2017-12-04 09:34:27 +01:00  41febee9c1  baggepinnen            Export and indent
2017-12-04 09:17:05 +01:00  36001d085a  baggepinnen            Implement AMSGrad optimiser
2017-11-24 12:12:20 +01:00  13b934c250  CarloLucibello         improve optimizers
2017-11-21 17:16:35 +01:00  9f5c4dd3e9  Mike J Innes           Merge pull request #104 from baggepinnen/patch-1
    Allow array of optimisers to train!
2017-11-21 15:25:09 +01:00  979949d01a  Mike J Innes           style
2017-11-14 17:32:16 +01:00  8991ce028c  Fredrik Bagge Carlson  Fix bug in rmsprop and adadelta
    `@. p.Δ = η * p.Δ / √acc` parses correctly, while `@. p.Δ /= √acc*η` parses as `@. p.Δ /= (√acc*η)`, so the step size was effectively `1/η`.
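The precedence pitfall fixed in 8991ce028c is not specific to Julia's `@.` macro: in most languages, `x /= a*b` divides by the whole product on the right-hand side. A minimal Python sketch of the arithmetic, with made-up values standing in for the optimiser state (`eta` the learning rate, `acc` the accumulated squared gradient):

```python
import math

# Hypothetical values standing in for the optimiser state.
eta, acc, delta = 0.1, 4.0, 1.0

# Buggy form: `/=` divides by the ENTIRE right-hand side,
# i.e. delta / (sqrt(acc) * eta), so the step scales with 1/eta.
buggy = delta
buggy /= math.sqrt(acc) * eta

# Intended update: eta * delta / sqrt(acc).
fixed = eta * delta / math.sqrt(acc)

print(buggy)  # ≈ 5.0  (100x too large for eta = 0.1)
print(fixed)  # ≈ 0.05
```

With `eta = 0.1`, the buggy form multiplies the step by 10 instead of dividing by 10, which is exactly the "step size de facto `1/η`" behaviour the commit describes.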
2017-11-04 13:27:32 +01:00  97244e0a68  Fredrik Bagge Carlson  Allow array of optimisers to train!
    This allows an array of optimisers to be sent to `train!`.
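The idea behind 97244e0a68 — a training step that accepts a collection of optimisers and applies each in turn — can be sketched generically. The names below are hypothetical illustrations, not Flux's actual API:

```python
# Hypothetical update rules: each maps (parameter, gradient) -> parameter.
def sgd(p, g, lr=0.1):
    # Plain gradient step with learning rate lr.
    return p - lr * g

def weight_decay(p, g, wd=0.01):
    # Shrinks the parameter, ignoring the gradient.
    return p * (1 - wd)

def apply_all(optimisers, p, g):
    # Composing an array of optimisers into one update step.
    for opt in optimisers:
        p = opt(p, g)
    return p

p = apply_all([sgd, weight_decay], p=1.0, g=0.5)
print(p)  # ≈ 0.9405: sgd gives 0.95, then decay gives 0.95 * 0.99
```

Composing update rules this way lets callers mix, say, a gradient step with a regulariser without defining a new combined optimiser type.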
2017-10-19 14:31:34 +01:00  99a7697d13  Mike J Innes           adam eta default arg
2017-10-18 22:54:58 +01:00  5b6a5667ed  Mike J Innes           tracked array restructure
2017-10-18 17:07:49 +01:00  07ad7cfa40  Mike J Innes           learning rate as default arg
2017-10-18 12:09:48 +01:00  7426faf37d  Mike J Innes           optimiser docs
2017-10-17 21:04:18 +01:00  041079237e  CarloLucibello         add docsting to train!
2017-10-17 21:04:18 +01:00  6d3a2a2210  CarloLucibello         change argument name for better clarity
2017-10-06 14:20:09 +01:00  ef6d10886d  GenaBitu               Exposed all optimisers
2017-10-05 12:36:18 +01:00  bfcc1ac25d  pevnak                 exposing optimisers
2017-10-02 20:50:18 +01:00  5fd1b7d9a2  Mike J Innes           remove gc hack
2017-09-27 23:14:58 +01:00  7c8dba0b85  Mike J Innes           gc in training loop
2017-09-27 21:16:23 +01:00  120a6db2bb  Mike J Innes           Merge branch 'master' of github.com:MikeInnes/Flux.jl
2017-09-27 21:11:21 +01:00  4bafa2b374  Mike J Innes           generic tree functions
2017-09-27 18:33:23 +01:00  94e38c05b8  Mike J Innes           more informative
2017-09-12 14:11:03 +01:00  f2052739c1  Mike J Innes           tweaks
2017-09-11 13:11:55 +01:00  3f83be7bb7  Mike J Innes           more flexible training loop
2017-09-07 00:29:55 -04:00  085d3aa9b4  Mike J Innes           handle epoch elsewhere
2017-09-07 00:27:16 -04:00  aeaa138b6d  Mike J Innes           cb convenience
2017-09-06 23:09:32 -04:00  cca4d25a10  Mike J Innes           efficient traversal
2017-09-03 17:10:04 -04:00  47ba702747  Mike J Innes           tweak optimiser interface
2017-09-03 02:44:32 -04:00  e57ae77bbb  Mike J Innes           juno progress
2017-09-01 23:59:44 -04:00  fe2b35facc  Mike J Innes           add callbacks back
2017-09-01 23:41:44 -04:00  bf098d551c  Mike J Innes           fuck
2017-09-01 17:06:51 -04:00  387686eb41  Mike J Innes           optimisers rework
2017-08-31 14:55:23 -04:00  b95dae1868  Mike J Innes           opt refactor
2017-08-29 17:00:24 -04:00  97ecb26003  ylxdzsw                wip optimisers
2017-08-24 22:23:05 +01:00  12dc6b66c5  Mike J Innes           whoops
2017-08-24 16:10:04 +01:00  e7f26370d7  Mike J Innes           training tweaks
2017-08-24 11:42:29 +01:00  1526b13691  Mike J Innes           basic training loop
2017-08-22 22:25:18 +01:00  bafecfede1  Mike J Innes           sgd
2017-08-22 17:13:03 +01:00  0ce8c0cee4  Mike J Innes           param collection