Mike J Innes
70718e7a64
update treelike
2018-08-03 13:02:47 +01:00
Mike J Innes
d782b33701
syntax
2018-08-03 13:02:47 +01:00
Mike J Innes
85fd77d70a
linalg deprecations
2018-08-03 13:02:47 +01:00
Mike J Innes
89872c5a8b
val deprecations
2018-08-03 13:02:47 +01:00
Mike J Innes
474f578517
ObjectIdDict -> IdDict
2018-08-03 13:02:47 +01:00
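The `ObjectIdDict -> IdDict` commit tracks Julia 0.7's rename; the replacement `IdDict` compares keys by identity (`===`) rather than `isequal`. A minimal sketch of the behavioral difference (variable names are illustrative, not from the commit):

```julia
# IdDict keys compare by identity (===), not isequal, so two equal but
# distinct arrays occupy two separate slots.
d = IdDict{Any,Any}()
a, b = [1, 2], [1, 2]   # isequal(a, b) holds, but a !== b
d[a] = :first
d[b] = :second
length(d)               # 2 — a plain Dict would have overwritten the entry
```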
Mike J Innes
aa209ee137
no longer needed
2018-08-03 13:02:47 +01:00
Mike J Innes
00cfe24d66
fix cat
2018-08-03 13:02:47 +01:00
Mike J Innes
adc216f182
fix broadcasting
2018-08-03 12:56:32 +01:00
Mike J Innes
e486c50610
fix data
2018-08-03 12:56:31 +01:00
Mike J Innes
fb8a220659
fix matmul
2018-08-03 12:56:31 +01:00
Mike J Innes
7057ca739e
fix std usage
2018-08-03 12:56:27 +01:00
Mike J Innes
88a265154c
deprecations
2018-08-03 12:54:31 +01:00
Mike J Innes
b18b51656c
requires update
2018-08-03 12:54:24 +01:00
Mike J Innes
a49e2eae41
deprecated Void
2018-08-03 12:53:52 +01:00
Mike J Innes
1fd49c2a90
fix array show
2018-08-03 12:53:52 +01:00
Mike J Innes
a8ccc79f61
perf hacks
2018-07-30 20:08:44 +01:00
Mike J Innes
a0fd91b866
Merge pull request #307 from jarvist/master
Add ADAMW "Fixing Weight Decay Regularization in Adam"
2018-07-11 19:12:58 +01:00
Mike J Innes
dda51a0140
update docs
2018-07-11 15:31:22 +01:00
Mike Innes
10a169bb77
update cudnn rnn
2018-07-10 18:16:37 +01:00
Mike J Innes
70b5efeb4e
basic nested AD
2018-07-10 09:03:09 +01:00
Mike J Innes
80af9a3830
broadcast efficiency
2018-07-09 23:40:07 +01:00
Mike J Innes
e763c342ee
shave some memory
2018-07-09 19:44:14 +01:00
Mike J Innes
1430053b69
checkpoints
2018-07-09 17:52:34 +01:00
Mike J Innes
7778d17884
functional API
2018-07-09 16:57:44 +01:00
Mike J Innes
5e319c7395
fix gradient definitions
2018-07-09 13:39:10 +01:00
Mike J Innes
41b9412439
new grad api
2018-07-09 13:36:46 +01:00
Jarvist Moore Frost
344a750770
Merge branch 'master' of github.com:jarvist/Flux.jl into HEAD
2018-07-03 11:15:43 +01:00
Jarvist Moore Frost
aee4a83c55
Add ADAMW weight-decay.
See http://www.fast.ai/2018/07/02/adam-weight-decay/ and the original
paper https://arxiv.org/abs/1711.05101.pdf for context.
I don't know what I'm doing, and this is quite possibly wrong, but on
a simple Char-RNN I have lying around on my hard disk, this seems to
improve the rate of learning consistently across different hyperparameters
vs. standard ADAM with the same decay constant.
2018-07-03 11:11:32 +01:00
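For context on the ADAMW commits above: decoupled weight decay applies the decay term directly to the weights, instead of folding it into the gradient before the Adam moment updates. A minimal sketch of one update step, assuming conventional hyperparameter names (`η`, `β1`, `β2`, `ϵ`, `wd` are illustrative, not Flux's API):

```julia
# One AdamW step (arXiv:1711.05101), sketched in plain Julia.
# w: weights, g: gradient, m/v: first/second moment buffers, t: step count.
function adamw_step!(w, g, m, v, t; η=0.001, β1=0.9, β2=0.999, ϵ=1e-8, wd=0.01)
    @. m = β1 * m + (1 - β1) * g       # first-moment estimate
    @. v = β2 * v + (1 - β2) * g^2     # second-moment estimate
    m̂ = m ./ (1 - β1^t)                # bias correction
    v̂ = v ./ (1 - β2^t)
    # The decay is applied to the weights directly, not via the gradient:
    @. w -= η * (m̂ / (√v̂ + ϵ) + wd * w)
    return w
end
```

This is only the update rule; Flux's optimiser wraps the same arithmetic in its own state-handling machinery.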
Mike J Innes
ce88273880
gradient hook
2018-07-02 13:19:13 +01:00
Mike Innes
5d8b63dc65
avoid implementation details in docs
2018-06-29 13:53:50 +01:00
Matthew Kelley
864d72eef5
Overload Base.eps() for TrackedReal
2018-06-26 23:55:43 -06:00
Matthew Kelley
0e95be3326
Call Flux.Tracker.data() on ŷ for bce
2018-06-26 14:48:51 -06:00
Matthew Kelley
ed032cdb1e
Change epsilon value to eps(ŷ)
2018-06-26 12:29:06 -06:00
Matthew Kelley
e08fd7a6d2
Added epsilon term to binarycrossentropy
2018-06-26 11:43:16 -06:00
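The epsilon series above guards `log` against a zero argument in `binarycrossentropy`, with the final form using `eps(ŷ)` so the floor scales with the precision of the prediction. A sketch of the idea, not necessarily the exact code merged:

```julia
# Numerically-stable binary cross-entropy: eps(ŷ) keeps log away from zero.
binarycrossentropy(ŷ, y) = -y * log(ŷ + eps(ŷ)) - (1 - y) * log(1 - ŷ + eps(ŷ))

binarycrossentropy(0.0, 1.0)   # finite, where the naive form gives Inf
```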
Mike J Innes
88c16e62dd
fixes #284
2018-06-26 15:09:26 +01:00
Mike J Innes
836e3872b6
style
2018-06-26 15:09:21 +01:00
Mike J Innes
2723c9ee04
Merge pull request #257 from staticfloat/sf/back_inf_nan
Check for `Inf` and `NaN` within `back!(::TrackedReal)`
2018-06-26 14:42:33 +01:00
Mike J Innes
0a04e3ba61
Chain activations
2018-06-26 14:30:46 +01:00
Mike J Innes
7726a5b605
inferrable
2018-06-26 14:12:57 +01:00
Mike J Innes
3b575930ca
Merge branch 'master' into scalar_pad_stride
2018-06-26 14:05:07 +01:00
Mike Innes
7e3cf45ee4
better error
2018-06-25 11:36:52 +01:00
Mike J Innes
aea1e73cde
scalar gradients
2018-06-21 13:12:42 +01:00
Tejan Karmali
d20771d6be
Default value of dilation
dilation should be 1 by default
2018-06-09 02:29:46 +05:30
Tejan Karmali
4a24b69976
Merge branch 'master' into nadam-opt
2018-06-08 16:54:41 +05:30
Mike J Innes
4915b0c8dd
Merge pull request #268 from staticfloat/patch-2
Add `dilation` kwarg to `Conv`
2018-06-07 13:49:02 +01:00
Mike J Innes
af8f3348eb
Merge pull request #270 from staticfloat/sf/tracked_repeat
Add `TrackedArray` support for `repeat(x; inner, outer)`
2018-06-06 17:34:58 +01:00
Mike Innes
2370bdbe91
see #205
2018-06-06 17:01:28 +01:00
staticfloat@gmail.com
f390a39d77
Add TrackedArray support for repeat(x; inner, outer)
2018-05-22 17:41:05 -07:00
Elliot Saba
e6efca4bf4
Add dilation kwarg to Conv
Now that we have dilated convolution support in `NNlib`, this enables support in Flux's `Conv` layer.
2018-05-21 13:44:13 -07:00
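What the new `dilation` kwarg means: a dilated convolution samples its input with gaps of `d - 1` elements between kernel taps, widening the receptive field without extra parameters. A self-contained 1-D sketch of the arithmetic (`dilated_conv1d` is illustrative; Flux's `Conv` dispatches to `NNlib` rather than using code like this):

```julia
# Dilated 1-D cross-correlation: tap k reads x[i + (k-1)*d].
function dilated_conv1d(x::Vector, w::Vector; d::Int=1)
    span = (length(w) - 1) * d   # effective kernel extent minus one
    [sum(w[k] * x[i + (k-1)*d] for k in eachindex(w))
     for i in 1:length(x) - span]
end

dilated_conv1d([1.0, 2.0, 3.0, 4.0, 5.0], [1.0, 1.0]; d=2)  # → [4.0, 6.0, 8.0]
```

With `d=1` this reduces to an ordinary convolution, which is why `dilation=1` is the natural default.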
James Bradbury
af12f006f2
Use broadcast for dropout
Should be fast enough on GPU now that it's not going to be an optimization target again for a while. Hopefully it isn't meaningfully slower on CPU?
2018-05-20 04:04:33 -07:00
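The broadcast-based dropout the commit describes builds a keep-mask in one shot and multiplies, rather than mutating elements in a loop. A minimal sketch of the pattern (`dropout_sketch` and its signature are illustrative, not Flux's API):

```julia
# Inverted dropout via broadcasting: one fused elementwise kernel,
# which is what makes it cheap on the GPU. `p` is the drop probability.
function dropout_sketch(x, p)
    mask = rand(eltype(x), size(x)) .> p   # true where the unit is kept
    x .* mask ./ (1 - p)                   # rescale so E[output] == E[x]
end
```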