Avik Pal
4035641f00
Remove imports
2018-07-17 10:06:26 +05:30
Avik Pal
0bb3eaa1f6
Update CUDNN Batchnorm with new Flux AD
2018-07-17 09:40:20 +05:30
Avik Pal
646db81f94
Pull BatchNorm CPU updates
2018-07-17 09:24:38 +05:30
CarloLucibello
071dcdda87
update docs
2018-07-16 07:32:13 +02:00
CarloLucibello
185e9148b6
fix cpu batchnorm
2018-07-16 07:11:33 +02:00
Mike J Innes
a0fd91b866
Merge pull request #307 from jarvist/master
Add ADAMW "Fixing Weight Decay Regularization in Adam"
2018-07-11 19:12:58 +01:00
Mike J Innes
dda51a0140
update docs
2018-07-11 15:31:22 +01:00
Mike Innes
10a169bb77
update cudnn rnn
2018-07-10 18:16:37 +01:00
Mike J Innes
70b5efeb4e
basic nested AD
2018-07-10 09:03:09 +01:00
Mike J Innes
80af9a3830
broadcast efficiency
2018-07-09 23:40:07 +01:00
Mike J Innes
e763c342ee
shave some memory
2018-07-09 19:44:14 +01:00
Mike J Innes
1430053b69
checkpoints
2018-07-09 17:52:34 +01:00
Mike J Innes
7778d17884
functional API
2018-07-09 16:57:44 +01:00
Mike J Innes
5e319c7395
fix gradient definitions
2018-07-09 13:39:10 +01:00
Mike J Innes
41b9412439
new grad api
2018-07-09 13:36:46 +01:00
Jarvist Moore Frost
344a750770
Merge branch 'master' of github.com:jarvist/Flux.jl into HEAD
2018-07-03 11:15:43 +01:00
Jarvist Moore Frost
aee4a83c55
Add ADAMW weight-decay.
See http://www.fast.ai/2018/07/02/adam-weight-decay/ and the original
paper https://arxiv.org/abs/1711.05101.pdf for context.
I don't know what I'm doing, and this is quite possibly wrong - but on
a simple Char-RNN I have lying around on my hard disk, this seems to
improve the rate of learning consistently for different hyperparameters
vs. standard ADAM with the same decay constant.
2018-07-03 11:11:32 +01:00
Mike J Innes
ce88273880
gradient hook
2018-07-02 13:19:13 +01:00
Mike Innes
5d8b63dc65
avoid implementation details in docs
2018-06-29 13:53:50 +01:00
Avik Pal
e3b10691d2
make cache optional param
2018-06-28 15:27:59 +05:30
Avik Pal
bcf094451c
Fix typo
2018-06-28 14:45:35 +05:30
Avik Pal
d0b79e71e2
fix load error
2018-06-28 14:27:50 +05:30
Avik Pal
7ac9e191cb
Revert 1 change
2018-06-28 14:25:22 +05:30
Avik Pal
5ccde88ce6
Minor fix for 5D support
2018-06-28 14:21:17 +05:30
Avik Pal
681d8c4dfc
Remove cache
2018-06-28 12:11:32 +05:30
Avik Pal
8f43258ab7
Get the batchnorm working without cache
2018-06-28 12:04:25 +05:30
Avik Pal
4916c8e6da
Add treelike for now
2018-06-27 14:54:49 +05:30
Matthew Kelley
864d72eef5
Overload Base.eps() for TrackedReal
2018-06-26 23:55:43 -06:00
Matthew Kelley
0e95be3326
Call Flux.Tracker.data() on ŷ for bce
2018-06-26 14:48:51 -06:00
Matthew Kelley
ed032cdb1e
Change epsilon value to eps(ŷ)
2018-06-26 12:29:06 -06:00
Matthew Kelley
e08fd7a6d2
Added epsilon term to binarycrossentropy
2018-06-26 11:43:16 -06:00
Mike J Innes
88c16e62dd
fixes #284
2018-06-26 15:09:26 +01:00
Mike J Innes
836e3872b6
style
2018-06-26 15:09:21 +01:00
Mike J Innes
2723c9ee04
Merge pull request #257 from staticfloat/sf/back_inf_nan
Check for `Inf` and `NaN` within `back!(::TrackedReal)`
2018-06-26 14:42:33 +01:00
Mike J Innes
0a04e3ba61
Chain activations
2018-06-26 14:30:46 +01:00
Mike J Innes
7726a5b605
inferrable
2018-06-26 14:12:57 +01:00
Mike J Innes
3b575930ca
Merge branch 'master' into scalar_pad_stride
2018-06-26 14:05:07 +01:00
Mike Innes
7e3cf45ee4
better error
2018-06-25 11:36:52 +01:00
Avik Pal
24ba1c4e6c
Make changes as per the review
2018-06-23 11:02:41 +05:30
Mike J Innes
aea1e73cde
scalar gradients
2018-06-21 13:12:42 +01:00
Avik Pal
91850a8baf
Add missing path to curnn.jl
2018-06-20 18:46:42 +05:30
Avik Pal
deb4950261
Make cuDNN take only 4D arrays
2018-06-20 15:54:38 +05:30
Avik Pal
3339ad5181
Integrate cudnn BatchNorm with Flux
2018-06-20 15:50:30 +05:30
Avik Pal
714ca23aba
Change default value of epsilon to prevent CuDNN BatchNorm warnings
2018-06-20 12:11:22 +05:30
Avik Pal
185f34d9fe
Add working backward pass
2018-06-20 12:09:54 +05:30
Avik Pal
bc47d02b3f
Remove unnecessary imports
2018-06-17 12:40:01 +05:30
Avik Pal
af5ab7f9ef
Fix Tensor Descriptor Bug
2018-06-17 12:28:02 +05:30
Avik Pal
c6dcf079ce
Update file structure and make function calls correct
2018-06-17 11:47:49 +05:30
Avik Pal
24d13ac326
Fix missing parenthesis
2018-06-12 21:32:56 +05:30
Avik Pal
f12e367cab
Adding untested backward pass code
2018-06-12 18:26:09 +05:30