Commit Graph

1324 Commits

Author SHA1 Message Date
Avik Pal
2664a16556 Update as per new AD 2018-07-13 14:12:46 +05:30
Avik Pal
0aabf9d86b Merge branch 'master' into depthwiseconv 2018-07-13 14:04:19 +05:30
Mike J Innes
a0fd91b866 Merge pull request #307 from jarvist/master: Add ADAMW "Fixing Weight Decay Regularization in Adam" 2018-07-11 19:12:58 +01:00
Mike J Innes
6d8e6c0440 Merge pull request #313 from FluxML/ad-overhaul: AD Overhaul 2018-07-11 15:33:02 +01:00
Mike J Innes
dda51a0140 update docs 2018-07-11 15:31:22 +01:00
Mike Innes
10a169bb77 update cudnn rnn 2018-07-10 18:16:37 +01:00
Mike J Innes
70b5efeb4e basic nested AD 2018-07-10 09:03:09 +01:00
Mike J Innes
80af9a3830 broadcast efficiency 2018-07-09 23:40:07 +01:00
Mike J Innes
e763c342ee shave some memory 2018-07-09 19:44:14 +01:00
Mike J Innes
1430053b69 checkpoints 2018-07-09 17:52:34 +01:00
Mike J Innes
7778d17884 functional API 2018-07-09 16:57:44 +01:00
Mike J Innes
5e319c7395 fix gradient definitions 2018-07-09 13:39:10 +01:00
Mike J Innes
41b9412439 new grad api 2018-07-09 13:36:46 +01:00
Avik Pal
84f977c804 Remove comment 2018-07-09 13:35:30 +05:30
Jarvist Moore Frost
344a750770 Merge branch 'master' of github.com:jarvist/Flux.jl into HEAD 2018-07-03 11:15:43 +01:00
Jarvist Moore Frost
aee4a83c55 Add ADAMW weight decay.
    See http://www.fast.ai/2018/07/02/adam-weight-decay/ and the original
    paper https://arxiv.org/abs/1711.05101.pdf for context.
    I don't know what I'm doing, and this is quite possibly wrong, but on
    a simple Char-RNN I have lying around on my hard disk it seems to
    improve the rate of learning consistently across different
    hyperparameters vs. standard ADAM with the same decay constant.
2018-07-03 11:11:32 +01:00
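The ADAMW commit above decouples weight decay from Adam's gradient moments, per "Fixing Weight Decay Regularization in Adam" (arXiv:1711.05101). As a rough illustration of the idea only, here is a minimal single-step sketch in Python, not Flux.jl's Julia implementation; the function name `adamw_step` and its default hyperparameters are hypothetical.

```python
import math

def adamw_step(theta, grad, m, v, t,
               lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8, decay=0.01):
    """One ADAMW update on a scalar parameter (illustrative sketch)."""
    # Adam's first- and second-moment exponential moving averages.
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad ** 2
    # Bias correction for the zero-initialised moments (t counts from 1).
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    # Decoupled weight decay: decay * theta is applied directly to the
    # parameter and never enters m or v -- the difference from plain L2
    # regularisation folded into the gradient.
    theta = theta - lr * (m_hat / (math.sqrt(v_hat) + eps) + decay * theta)
    return theta, m, v
```

The key design point is that the decay term bypasses the moment estimates, so the effective decay is not rescaled by Adam's adaptive step size.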
Mike J Innes
ce88273880 gradient hook 2018-07-02 13:19:13 +01:00
Mike Innes
5d8b63dc65 avoid implementation details in docs 2018-06-29 13:53:50 +01:00
Mike J Innes
d76e790818 Merge pull request #306 from maetshju/pull-request/e08fd7a6: Add epsilon term to binarycrossentropy 2018-06-27 08:52:08 +01:00
Matthew Kelley
864d72eef5 Overload Base.eps() for TrackedReal 2018-06-26 23:55:43 -06:00
Matthew Kelley
0e95be3326 Call Flux.Tracker.data() on ŷ for bce 2018-06-26 14:48:51 -06:00
Matthew Kelley
ed032cdb1e Change epsilon value to eps(ŷ) 2018-06-26 12:29:06 -06:00
Matthew Kelley
e08fd7a6d2 Added epsilon term to binarycrossentropy 2018-06-26 11:43:16 -06:00
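The binarycrossentropy commits above add an epsilon term so the loss never evaluates log(0), which would give -Inf and NaN gradients when a prediction saturates at 0 or 1. A minimal Python sketch of the idea, using a clamp; Flux's actual Julia change may differ in detail (e.g. adding eps inside the logs rather than clamping), and the `eps` default here is an assumption.

```python
import math

def binarycrossentropy(y_hat, y, eps=1e-7):
    """Binary cross-entropy with an epsilon guard (illustrative sketch)."""
    # Clamp the prediction away from 0 and 1 so log() stays finite.
    y_hat = min(max(y_hat, eps), 1 - eps)
    return -(y * math.log(y_hat) + (1 - y) * math.log(1 - y_hat))
```

Without the clamp, `binarycrossentropy(0.0, 1.0)` would be infinite; with it, the loss is large but finite, so backpropagation stays well-defined.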
Mike J Innes
bed6d2311e clearer docs 2018-06-26 16:07:58 +01:00
Mike J Innes
88c16e62dd fixes #284 2018-06-26 15:09:26 +01:00
Mike J Innes
836e3872b6 style 2018-06-26 15:09:21 +01:00
Mike J Innes
2723c9ee04 Merge pull request #257 from staticfloat/sf/back_inf_nan: Check for `Inf` and `NaN` within `back!(::TrackedReal)` 2018-06-26 14:42:33 +01:00
Mike J Innes
d6a75e1289 add activations docs 2018-06-26 14:35:03 +01:00
Mike J Innes
0a04e3ba61 Chain activations 2018-06-26 14:30:46 +01:00
Mike J Innes
4d7548b7a3 Merge commit '1490d87d8387a078a29a336cb37fd7269240179e' 2018-06-26 14:25:36 +01:00
Mike J Innes
1490d87d83 tweaks 2018-06-26 14:25:24 +01:00
Kade
aa8f79f10c Mention CUDAnative.jl's install instructions 2018-06-26 14:22:50 +01:00
Mike J Innes
134ac1586b Merge pull request #237 from tejank10/scalar_pad_stride: Scalar pad and stride 2018-06-26 14:18:12 +01:00
Mike J Innes
7726a5b605 inferrable 2018-06-26 14:12:57 +01:00
Mike J Innes
3b575930ca Merge branch 'master' into scalar_pad_stride 2018-06-26 14:05:07 +01:00
Mike Innes
7e3cf45ee4 better error 2018-06-25 11:36:52 +01:00
Mike J Innes
aea1e73cde scalar gradients 2018-06-21 13:12:42 +01:00
Mike J Innes
ac1448f677 Update README.md 2018-06-13 11:13:48 +01:00
Avik Pal
85158d632b Comment out the test 2018-06-11 16:00:20 +05:30
Avik Pal
5c6b066bd9 Merge branch 'depthwiseconv' of https://github.com/avik-pal/Flux.jl into depthwiseconv 2018-06-11 15:41:21 +05:30
Avik Pal
65f2c33991 Merge pull request #2 from FluxML/master: rebase 2018-06-11 15:40:57 +05:30
Avik Pal
4a639687de Typo 2018-06-09 18:59:54 +05:30
Avik Pal
6b294736f9 Add Depthwise Convolution in Docs 2018-06-09 14:19:47 +05:30
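The depthwiseconv branch merged repeatedly in this log implements depthwise convolution, where each input channel is convolved with its own kernel and there is no summation across channels (unlike a standard convolution). As a sketch of the operation itself, here is a naive pure-Python version, not Flux.jl's implementation; the helper name `depthwise_conv2d` and the nested-list layout are hypothetical.

```python
def depthwise_conv2d(x, w):
    """Naive depthwise convolution, valid padding, stride 1.

    x: [C][H][W] input planes, w: [C][kh][kw] one kernel per channel.
    Returns [C][H-kh+1][W-kw+1]; channel c of the output depends only
    on channel c of the input (no cross-channel mixing).
    """
    out = []
    for c in range(len(x)):
        kh, kw = len(w[c]), len(w[c][0])
        H, W = len(x[c]), len(x[c][0])
        plane = [[sum(x[c][i + di][j + dj] * w[c][di][dj]
                      for di in range(kh) for dj in range(kw))
                  for j in range(W - kw + 1)]
                 for i in range(H - kh + 1)]
        out.append(plane)
    return out
```

A standard convolution would additionally sum these per-channel planes (and typically use many such filter banks); the depthwise variant keeps them separate, which is what makes it cheap.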
Mike J Innes
8f7ee76752 Merge pull request #290 from tejank10/patch-1: Default value of dilation 2018-06-09 08:55:35 +01:00
Avik Pal
b59da95786 Merge branch 'depthwiseconv' of https://github.com/avik-pal/Flux.jl into depthwiseconv 2018-06-09 13:11:42 +05:30
Avik Pal
5d7ee884b8 Fix error while backpropagation 2018-06-09 13:04:49 +05:30
Avik Pal
7f3d11cae0 Merge branch 'master' into depthwiseconv 2018-06-09 11:06:07 +05:30
Avik Pal
1d93fb8e59 Add new constructor and fix a typo in display 2018-06-09 11:02:15 +05:30
Tejan Karmali
d20771d6be Default value of dilation: dilation should be 1 by default 2018-06-09 02:29:46 +05:30
Mike J Innes
9345607c38 Merge pull request #224 from tejank10/nadam-opt: NADAM optimizer 2018-06-08 12:28:07 +01:00