Commit Graph

980 Commits

Author SHA1 Message Date
Mike J Innes
0f7a1ec022 test params funct 2018-02-08 16:13:20 +00:00
Mike J Innes
961de2ba44 Merge pull request #161 from FluxML/curnn
WIP: CUDNN RNNs
2018-02-08 13:06:52 +00:00
Iblis Lin
f7fdfbe3a9 fix params 2018-02-08 12:56:10 +00:00
Mike J Innes
fcbdc49d6b fix reserve usage 2018-02-08 10:27:26 +00:00
Mike J Innes
bc452fcd81 rewrite tests 2018-02-08 02:37:55 +00:00
Mike J Innes
d592f4e327 batch support 2018-02-08 01:45:48 +00:00
Mike J Innes
b8f148b012 hook up backward passes 2018-02-08 00:49:39 +00:00
Mike J Innes
a1d1930097 Merge branch 'master' into curnn 2018-02-07 23:23:02 +00:00
Mike J Innes
4511936a87 fixes #116 2018-02-07 23:21:04 +00:00
Mike J Innes
0ac924e8e1 fixups 2018-02-07 22:52:46 +00:00
Mike J Innes
39f7f8fdf3 tracked tuples 2018-02-07 22:21:42 +00:00
Mike J Innes
79e4e25fea separate number type 2018-02-07 20:39:36 +00:00
Mike J Innes
282889970d separate tracking infrastructure from array wrapper 2018-02-07 17:43:25 +00:00
Mike J Innes
30b3437c56 backward passes 2018-02-06 18:56:17 +00:00
Mike J Innes
f866fbe575 nullable c refactor 2018-02-06 15:01:48 +00:00
Mike J Innes
07e1b1e0a9 avoid val 2018-02-06 12:44:18 +00:00
boathit
7e37a96c6f Register back! for logsigmoid and implement (logit)binarycrossentropy 2018-02-06 19:36:16 +08:00
boathit
6e65789828 Register back! for logsigmoid and implement (logit)binarycrossentropy 2018-02-06 19:32:46 +08:00
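The two commits above add a stable binary cross-entropy on logits. As a minimal illustrative sketch (not the Flux definitions), the idea is to fold `log ∘ σ` into a single stable `logsigmoid` and compute the loss directly from the logit:

```julia
# Illustrative sketch: a stable log(σ(x)) and a binary cross-entropy that
# works on the logit z rather than on σ(z).
logsigmoid(x) = min(x, zero(x)) - log1p(exp(-abs(x)))          # stable log(σ(x))
binarycrossentropy(ŷ, y) = -y*log(ŷ) - (1 - y)*log(1 - ŷ)      # expects ŷ = σ(z) ∈ (0, 1)
logitbinarycrossentropy(z, y) = (1 - y)*z - logsigmoid(z)      # same loss, computed from z

z, y = 40.0, 1.0
logitbinarycrossentropy(z, y)              # ≈ 4.25e-18, finite
binarycrossentropy(1 / (1 + exp(-z)), y)   # NaN: σ(z) rounds to 1.0 and 0 * log(0) appears
```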
Mike J Innes
a4bf5936b0 diagm 2018-02-05 18:29:35 +00:00
Mike J Innes
2fec75005d Merge pull request #123 from GenaBitu/cat-fix
Added vcat for multiple TrackedVectors
2018-02-05 18:10:48 +00:00
Mike J Innes
47cebab26e test multiple inputs/dims 2018-02-05 18:09:54 +00:00
Mike J Innes
2a2475a9c2 get tracker graph 2018-02-05 17:40:07 +00:00
Mike J Innes
14086b8c2d train forward pass 2018-02-02 17:48:08 +00:00
Mike J Innes
9a6fcf057b hook up interface 2018-02-02 16:42:18 +00:00
Mike J Innes
b1c5786012 Merge branch 'master' into curnn 2018-02-02 15:56:44 +00:00
Mike J Innes
49e1e78f67 make data/value available 2018-02-02 15:56:04 +00:00
Mike J Innes
0f1e7b5578 update rnn structure 2018-02-01 20:57:39 +00:00
Mike J Innes
106502a75d typo 2018-01-31 21:57:04 +00:00
Mike J Innes
af3ccf85ff coagulate gates 2018-01-31 16:56:27 +00:00
Mike J Innes
4bfb603da6 gru forward 2018-01-31 13:46:55 +00:00
Mike J Innes
b1bb05403c basic forward pass 2018-01-30 18:18:37 +00:00
Mike J Innes
0b886507dc param offsets 2018-01-30 14:43:39 +00:00
Mike J Innes
af0c5523ff rnnTrainingReserveSize 2018-01-30 14:43:39 +00:00
Mike J Innes
3fb83d642d rnnWorkspaceSize 2018-01-30 14:43:39 +00:00
Mike J Innes
6b4e114d5d rnnParamSize 2018-01-30 14:43:39 +00:00
Mike J Innes
ee6c3e18a9 basic RNNDesc 2018-01-30 14:43:39 +00:00
Mike J Innes
842bf03051 typo 2018-01-30 14:43:05 +00:00
Mike J Innes
0c9549c469 rm lazy 2018-01-24 13:28:52 +00:00
Mike J Innes
5118ef9163 remove batching work for now 2018-01-24 13:12:38 +00:00
boathit
374d7a5f1e Registering backward function for logsoftmax 2018-01-21 15:20:59 +08:00
Mike J Innes
72eabde373 load data 2018-01-17 16:39:55 +00:00
Mike J Innes
bd57359535 docstrings 2018-01-17 16:12:12 +00:00
Mike J Innes
8cca7accf2 mnist 2018-01-17 15:55:37 +00:00
Mike J Innes
4207fb98f2 basic GPU tests 2018-01-16 17:58:14 +00:00
GenaBitu
bc8a32bc56 Merge branch 'master' into cat-fix 2018-01-16 11:01:31 +01:00
Mike J Innes
1beb30e19a closes #118 2018-01-15 17:00:47 +00:00
Mike J Innes
8f8589a7f4 fix initialisation 2018-01-10 14:11:52 +00:00
Mike J Innes
b44237468e Merge branch 'master' into gru 2018-01-10 13:59:33 +00:00
Mike J Innes
805cb9178f fixes #146 2018-01-10 12:48:50 +00:00
Mehul Tikekar
2fef799109 fix typo in conv.jl (fixes #133) 2018-01-08 16:46:58 -05:00
Mike J Innes
468f641f66 use Adapt 2018-01-08 16:34:22 +00:00
Mike J Innes
98b362729d pool padding 2017-12-18 18:18:14 +00:00
Mike J Innes
e3577d759c conv docs 2017-12-18 18:05:48 +00:00
Mike J Innes
269d8f36b9 conv padding 2017-12-18 18:05:38 +00:00
Mike J Innes
51f93d9f0e conv polish 2017-12-15 16:24:45 +00:00
Mike J Innes
386eafc443 reshape 2017-12-15 16:18:16 +00:00
Mike J Innes
73ae25289d remove old util 2017-12-15 16:18:01 +00:00
Mike J Innes
6890a61587 todo 2017-12-15 16:17:45 +00:00
Mike J Innes
9b833a4345 more onehot indexing 2017-12-15 16:17:39 +00:00
Mike J Innes
9d0dd9fb7e layer wip 2017-12-15 13:22:57 +00:00
Mike J Innes
0bf22dfb8e pool gradients 2017-12-15 02:29:14 +00:00
Mike J Innes
d949b31aa5 conv gradient 2017-12-15 02:24:32 +00:00
Mike J Innes
5b97d2ba04 closes #127 2017-12-13 18:24:56 +00:00
Mike J Innes
95d1287455 Merge branch 'master' into jacobian 2017-12-13 17:06:23 +00:00
Mike J Innes
27d896943e Merge pull request #120 from staticfloat/sf/dense_initialization
Better default initialization for Dense layers
2017-12-13 16:18:02 +00:00
Mike J Innes
e3a688e706 use kwarg 2017-12-13 15:27:15 +00:00
Mike J Innes
128725cefd Merge branch 'master' into sf/weighted_crossentropy 2017-12-13 15:14:47 +00:00
Mike J Innes
29787eba45 fixes #114 2017-12-12 17:23:15 +00:00
Mike J Innes
b7b6c975bc fixes #110 2017-12-12 17:07:39 +00:00
Mike J Innes
403cc26327 Merge branch 'master' into gru 2017-12-12 16:54:00 +00:00
Mike J Innes
86097e76fd tweak batchnorm example 2017-12-08 19:34:34 +00:00
Mike J Innes
6f997e798a Merge branch 'master' into batchnorm 2017-12-08 19:31:50 +00:00
Mike J Innes
1d916c81b5 Merge branch 'master' into HEAD 2017-12-08 18:31:55 +00:00
Mike J Innes
24a6569589 Merge branch 'master' into amsgrad 2017-12-08 18:20:53 +00:00
Mike J Innes
f82dbf4798 Merge branch 'master' into HEAD 2017-12-08 17:00:31 +00:00
Mike J Innes
951c21366a fix regex 2017-12-08 16:42:30 +00:00
GenaBitu
7e51418679 Added back for multi-parameter vcat 2017-12-08 16:10:09 +01:00
baggepinnen
385dee9d16 Add jacobian function 2017-12-08 14:46:12 +01:00
GenaBitu
41f3eedc39 Proper multi-variable vcat 2017-12-07 17:50:18 +01:00
Elliot Saba
41446d547f Add weighted_crossentropy for imbalanced classification problems 2017-12-05 17:09:05 -08:00
Elliot Saba
c59b820bed Add glorot (Xavier) initialization
Set default `Dense` and `RNN` inits to `glorot_uniform()` for `W`, `zeros` for `b`.
2017-12-05 14:24:48 -08:00
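Saba's commit above switches the `Dense` and `RNN` weight defaults to Glorot (Xavier) initialisation. A minimal sketch of the standard Glorot-uniform rule (illustrative, not necessarily the exact code merged here):

```julia
# Glorot/Xavier uniform: draw W from U(-√(6/(fan_in+fan_out)), +√(6/(fan_in+fan_out)))
# so that activation variance stays roughly constant across layers.
glorot_uniform(fan_out, fan_in) =
    (rand(fan_out, fan_in) .- 0.5) .* sqrt(24 / (fan_in + fan_out))

W = glorot_uniform(5, 10)   # e.g. the weight matrix of a 10 → 5 Dense layer
b = zeros(5)                # biases default to zeros, per the commit message
```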
GenaBitu
62b3600eca Merge branch 'master' into cat-fix 2017-12-05 11:13:29 +01:00
baggepinnen
41febee9c1 Export and indent 2017-12-04 09:34:27 +01:00
baggepinnen
36001d085a Implement AMSGrad optimiser 2017-12-04 09:17:05 +01:00
Mike J Innes
cab235a578 gpu compat 2017-11-30 13:51:31 +00:00
Mike J Innes
19039f4881 export sigmoid 2017-11-30 13:37:38 +00:00
Mike J Innes
2d33f19346 onehot unk arg 2017-11-29 16:45:50 +00:00
baggepinnen
fa718c7475 Implement Gated Recurrent Unit 2017-11-24 14:33:06 +01:00
CarloLucibello
13b934c250 improve optimizers 2017-11-24 12:12:20 +01:00
Mike J Innes
dc1f08a709 Merge pull request #98 from FluxML/log
GPU-ready log function
2017-11-23 17:17:39 +00:00
Mike J Innes
9f5c4dd3e9 Merge pull request #104 from baggepinnen/patch-1
Allow array of optimisers to train!
2017-11-21 17:16:35 +01:00
Mike J Innes
351d3d4771 std derivative 2017-11-21 17:04:04 +01:00
Mike J Innes
b06884b912 LayerNorm tweaks 2017-11-21 16:32:36 +01:00
skariel
11d53781b2 adding layer normalization 2017-11-21 16:30:24 +01:00
Mike J Innes
979949d01a style 2017-11-21 15:25:09 +01:00
Fredrik Bagge Carlson
8991ce028c Fix bug in rmsprop and adadelta
`@. p.Δ = η * p.Δ / √acc` parses correctly while `@. p.Δ /= √acc*η` seems to parse like `@. p.Δ /= (√acc*η)`, hence the step size was de facto interpreted as `1/η`
2017-11-14 17:32:16 +01:00
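Bagge Carlson's fix above hinges on how `@.` parses in-place division. A standalone sketch with plain arrays (not the Flux optimiser code) of the difference:

```julia
# With `/=`, the whole right-hand side is the divisor, so η ends up dividing the
# step; writing the update out explicitly puts η back in the numerator.
η   = 0.1
acc = [4.0, 9.0]
Δ   = [1.0, 1.0]

buggy = copy(Δ); @. buggy /= √acc * η          # Δ ./ (√acc .* η)  → step scaled by 1/η
fixed = copy(Δ); @. fixed = η * fixed / √acc   # η .* Δ ./ √acc    → intended scaling

buggy   # [5.0, 3.333…]
fixed   # [0.05, 0.0333…]
```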
Mike J Innes
e0657d93ec mv numeric.jl to nnlib 2017-11-09 15:06:29 +00:00
Mike J Innes
2cb94981a0 gpu-ready log 2017-11-09 15:04:01 +00:00
Mike J Innes
e5d99d784e fixes #79 2017-11-09 14:53:26 +00:00
Mike J Innes
ccdc046546 fixes #79 2017-11-09 14:52:28 +00:00
Mike J Innes
752a9e2808 tree utilities 2017-11-08 22:19:01 +00:00
Mike J Innes
6eb2ec154b sentiment treebank loader 2017-11-08 22:19:01 +00:00
Mike J Innes
8777362eee exports 2017-11-08 22:19:01 +00:00
Mike J Innes
8b05317895 basic tree 2017-11-08 22:19:01 +00:00
Mike J Innes
7e9468d8f8 treebank skeleton 2017-11-08 22:19:01 +00:00
Mike J Innes
fcd091e8f0 Ac_mul_B derivatives 2017-11-08 22:18:45 +00:00
Mike J Innes
d4229c4815 useful params method 2017-11-08 22:18:45 +00:00
Mike J Innes
d6423eefe5 matrix-vector fast path 2017-11-08 22:18:45 +00:00
Fredrik Bagge Carlson
97244e0a68 Allow array of optimisers to train!
This allows an array of optimisers to be sent to `train!`
2017-11-04 13:27:32 +01:00
Mike J Innes
efa51f02e7 basic batch type 2017-11-02 11:49:42 +00:00
Mike J Innes
21ea93ffcd rename treelike 2017-11-02 11:47:34 +00:00
Iblis Lin
6c7613e02b batchnorm: leverage TrackedArray mean 2017-11-02 14:20:34 +08:00
Iblis Lin
88bd8a8fbd batchnorm: make CuArrays happy 2017-11-02 14:02:41 +08:00
Iblis Lin
477da75428 batchnorm: fix mapchildren 2017-11-02 13:32:12 +08:00
Iblis Lin
5253841acc batchnorm: update docs 2017-11-02 13:32:12 +08:00
Iblis Lin
b3356cc6bb batchnorm: batch σ correct coefficient 2017-11-02 13:32:12 +08:00
Iblis Lin
e0201be770 batchnorm: parameterize momentum and epsilon 2017-11-02 13:32:12 +08:00
Iblis Lin
669273b008 layer: implement BatchNorm layer
See [Batch Normalization: Accelerating Deep Network Training by Reducing
Internal Covariate Shift](https://arxiv.org/pdf/1502.03167.pdf)
2017-11-02 13:32:12 +08:00
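Lin's BatchNorm commits above implement the transform from the cited paper. A rough, self-contained sketch of that transform (illustrative only; the Flux layer additionally tracks running statistics and a momentum parameter):

```julia
using Statistics  # for mean

# Normalise each feature over the batch dimension, then apply the learned
# scale γ and shift β; ϵ guards against division by zero (cf. the
# "parameterize momentum and epsilon" commit above).
function batchnorm(x::AbstractMatrix, γ, β; ϵ = 1e-8)
    μ  = mean(x, dims = 2)                 # per-feature batch mean
    σ² = mean((x .- μ) .^ 2, dims = 2)     # per-feature batch variance
    x̂  = (x .- μ) ./ sqrt.(σ² .+ ϵ)        # normalised activations
    return γ .* x̂ .+ β
end

x = randn(3, 8)                  # 3 features, batch of 8
y = batchnorm(x, ones(3), zeros(3))
```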
Mike J Innes
e7a510da9a add cmudict dataset 2017-11-01 16:01:55 +00:00
Mike J Innes
0f8ba87dc6 treelike tuples 2017-10-31 16:37:41 +00:00
Mike J Innes
e943a39ee7 combine special cases 2017-10-31 16:37:33 +00:00
Iblis Lin
3d8b7250ae add scalar mean 2017-10-31 10:42:32 +00:00
Iblis Lin
c43bda019b TrackedArray: implement mean
```julia
julia> p
Tracked 2×3 Array{Float64,2}:
 1.0  3.0  5.0
 2.0  4.0  6.0
```

Before
```julia
julia> @benchmark Flux.Tracker.back!(sum($p, 2) ./ size($p, 2), ones(2, 1))
BenchmarkTools.Trial:
  memory estimate:  3.44 KiB
  allocs estimate:  75
  --------------
  minimum time:     20.438 μs (0.00% GC)
  median time:      21.239 μs (0.00% GC)
  mean time:        22.354 μs (1.68% GC)
  maximum time:     3.811 ms (98.51% GC)
  --------------
  samples:          10000
  evals/sample:     1
```

After
```julia
julia> @benchmark Flux.Tracker.back!(mean($p, 2), ones(2, 1))
BenchmarkTools.Trial:
  memory estimate:  1008 bytes
  allocs estimate:  21
  --------------
  minimum time:     5.973 μs (0.00% GC)
  median time:      6.310 μs (0.00% GC)
  mean time:        6.630 μs (1.96% GC)
  maximum time:     680.709 μs (97.28% GC)
  --------------
  samples:          10000
  evals/sample:     6
```
2017-10-30 16:21:02 +08:00
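The speed-up in the benchmark above comes from giving `mean` its own backward rule instead of differentiating through `sum(...) ./ size(...)`. A sketch of that rule (illustrative, not the Tracker internals):

```julia
# For y = mean(x, 2) with x of size (m, n), every element of x contributes
# 1/n to the corresponding row of y, so the pullback just spreads the incoming
# sensitivity Δ (size m×1) across the columns, scaled by 1/n.
function mean_pullback(Δ::AbstractMatrix, x::AbstractMatrix)
    n = size(x, 2)
    return repeat(Δ, 1, n) ./ n    # ∂L/∂x, same size as x
end

x = [1.0 3.0 5.0; 2.0 4.0 6.0]
mean_pullback(ones(2, 1), x)       # every entry is 1/3
```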
Mike J Innes
4c1b1eb18c Merge pull request #92 from CarloLucibello/drop
add Dropout layer
2017-10-26 12:07:28 +01:00
Mike J Innes
84efbbcc84 tracker predicate tweaks 2017-10-26 12:06:29 +01:00
Mike J Innes
cf6b930f63 reorganise 2017-10-26 11:46:12 +01:00
Mike J Innes
0df300299f clearer error message, fixes #93 2017-10-26 11:15:14 +01:00
GenaBitu
df06c3351d Merge branch 'master' into cat-fix 2017-10-26 00:52:29 +02:00
CarloLucibello
711ea09d99 address comments 2017-10-25 02:35:27 +02:00
CarloLucibello
536ab3861d setmode! -> testmode! 2017-10-23 16:23:29 +02:00
CarloLucibello
00a9e5f01f construct TrackedScalar with params(1) 2017-10-23 10:49:45 +01:00
CarloLucibello
86c7c9246e add == and < for tracked arrays 2017-10-23 11:41:08 +02:00
CarloLucibello
2e1ed4c3fc add dropout 2017-10-23 10:12:53 +02:00
Mike J Innes
2a66545ef8 rnn state reset 2017-10-19 17:21:08 +01:00
Mike J Innes
99a7697d13 adam eta default arg 2017-10-19 14:31:34 +01:00
Mike J Innes
e5c8f6d835 only export known good optimisers 2017-10-19 11:26:11 +01:00
Mike J Innes
5b6a5667ed tracked array restructure 2017-10-18 22:54:58 +01:00
Mike J Innes
c8d4844da4 chunk util 2017-10-18 17:07:58 +01:00
Mike J Innes
07ad7cfa40 learning rate as default arg 2017-10-18 17:07:49 +01:00
Mike J Innes
e82428bb83 batching docs 2017-10-18 16:40:14 +01:00
Mike J Innes
b817ce632c syntax highlighting 2017-10-18 15:44:06 +01:00
Mike J Innes
fd249b773e rnn docs 2017-10-18 15:30:05 +01:00
Mike J Innes
190f48a709 nnlib docs 2017-10-18 14:40:58 +01:00
Mike J Innes
12944ae125 nnlib exports 2017-10-18 12:56:58 +01:00
Mike J Innes
0fbc8dff61 typo 2017-10-18 12:48:58 +01:00
Mike J Innes
d6dd27dae5 dense layer example 2017-10-18 12:47:45 +01:00
Mike J Innes
7426faf37d optimiser docs 2017-10-18 12:09:48 +01:00
CarloLucibello
041079237e add docsting to train! 2017-10-17 21:04:18 +01:00
CarloLucibello
6d3a2a2210 change argument name for better clarity 2017-10-17 21:04:18 +01:00
Mike J Innes
23674b2555 logitcrossentropy tweaks 2017-10-17 17:58:32 +01:00
pevnak
4aa7741ba9 logit cross entropy 2017-10-17 17:57:46 +01:00
Mike J Innes
6dff8ca8d3 rename crossentropy loss 2017-10-17 17:36:18 +01:00
Mike J Innes
1800c8f523 deprecate mapparams 2017-10-17 17:35:30 +01:00
Mike J Innes
949fd9ba97 loss function tweaks 2017-10-17 17:30:11 +01:00
Mike J Innes
c764b74eba rename and fix mapleaves 2017-10-17 01:08:15 +01:00
Mike J Innes
7aa0b43ceb onehot sanity check 2017-10-17 00:07:58 +01:00
Mike J Innes
e02e320008 more general fmap 2017-10-17 00:07:15 +01:00
Mike J Innes
64e242e96c export param 2017-10-16 08:53:46 +01:00
Mike J Innes
d3db051ca0 flip 2017-10-16 08:53:39 +01:00
Mike J Innes
9a155abecd batch and batchseq apis 2017-10-15 23:44:40 +01:00
Mike J Innes
646720cd05 fix 2017-10-15 23:44:16 +01:00
Mike J Innes
c6556a29e6 order-stable params 2017-10-10 12:16:32 +01:00
GenaBitu
ef6d10886d Exposed all optimisers 2017-10-06 14:20:09 +01:00
GenaBitu
2084df96ae Merge branch 'master' into cat-fix 2017-10-06 15:00:26 +02:00
pevnak
bfcc1ac25d exposing optimisers 2017-10-05 12:36:18 +01:00
Mike J Innes
1abc4febe6 more general adaptors 2017-10-04 18:55:56 +01:00
Dave Kleinschmidt
2b95aff158 actually use init argument in LSTMCell 2017-10-03 19:26:42 +01:00
Mike J Innes
5fd1b7d9a2 remove gc hack 2017-10-02 20:50:18 +01:00
Mike J Innes
1b91e6b38d store onehotmatrix height 2017-10-02 20:50:11 +01:00
Mike J Innes
7c8dba0b85 gc in training loop 2017-09-27 23:14:58 +01:00
Mike J Innes
a32ae4914c onehotmatrix cuda support 2017-09-27 22:51:00 +01:00
Mike J Innes
a60a754d68 beginnings of gpu support 2017-09-27 21:58:34 +01:00
Mike J Innes
120a6db2bb Merge branch 'master' of github.com:MikeInnes/Flux.jl 2017-09-27 21:16:23 +01:00
Mike J Innes
4bafa2b374 generic tree functions 2017-09-27 21:11:21 +01:00
Mike J Innes
2ec8401d2c remove compiler 2017-09-27 20:48:39 +01:00
Mike J Innes
94e38c05b8 more informative 2017-09-27 18:33:23 +01:00
GenaBitu
136f9bbf74 Hack which doesn't break backprop 2017-09-22 11:47:04 +02:00
GenaBitu
a5fe5b6e65 Added multi-variable vcat for TrackedVector 2017-09-22 11:22:21 +02:00
Mike J Innes
f2052739c1 tweaks 2017-09-12 14:11:03 +01:00
Mike J Innes
a3fe89e348 rnn tanh by default 2017-09-12 13:12:25 +01:00
Mike J Innes
6728295355 Merge pull request #63 from JobJob/rnncell-args
Enables passing an activation function to RNN/RNNCell
2017-09-12 13:10:43 +01:00
Mike J Innes
28bbef81b9 f 2017-09-12 13:06:32 +01:00
Mike J Innes
972ecab9f9 rm Over Seq 2017-09-12 13:03:16 +01:00
Joel Mason
00439555d1 Enables passing an activation function to RNN/RNNCell
Also, fixes it not using the `init` function provided
2017-09-12 20:54:56 +10:00
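Mason's commit above lets the activation be passed to RNN/RNNCell instead of being hard-coded. A minimal, self-contained sketch of the idea (a hypothetical `SimpleRNNCell`, not the Flux code):

```julia
relu(x) = max(zero(x), x)   # stand-in activation, just for the example

# The cell stores σ and applies it in the forward pass; `init` is actually
# used to build the weights (the second fix mentioned in the commit).
struct SimpleRNNCell{F,A,V}
    σ::F; Wi::A; Wh::A; b::V
end

SimpleRNNCell(in::Int, out::Int, σ = tanh; init = randn) =
    SimpleRNNCell(σ, init(out, in), init(out, out), zeros(out))

(c::SimpleRNNCell)(h, x) = (h′ = c.σ.(c.Wi*x .+ c.Wh*h .+ c.b); (h′, h′))

cell = SimpleRNNCell(10, 5, relu)        # activation chosen by the caller
h, y = cell(zeros(5), rand(10))
```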
Mike J Innes
b9652f1812 typo, fixes #61 2017-09-12 10:45:07 +01:00
Mike J Innes
5f24d61ba3 important 2017-09-11 14:10:12 +01:00
Mike J Innes
7041ab9960 rm chainseq 2017-09-11 14:02:43 +01:00
Mike J Innes
c80fb999ff one hot docs 2017-09-11 13:40:11 +01:00
Mike J Innes
3f83be7bb7 more flexible training loop 2017-09-11 13:11:55 +01:00
Mike J Innes
33a5d26e57 chain utility note 2017-09-09 20:02:48 -04:00
Mike J Innes
fedee95b14 docs updates 2017-09-09 19:58:32 -04:00
Mike J Innes
a36d6d2af3 layer docs 2017-09-08 17:52:41 -04:00
Mike J Innes
f55b8cd20e track -> param 2017-09-07 15:13:04 -04:00
Mike J Innes
085d3aa9b4 handle epoch elsewhere 2017-09-07 00:29:55 -04:00
Mike J Innes
aeaa138b6d cb convenience 2017-09-07 00:27:16 -04:00
Mike J Innes
e837bb0745 rnn stuff 2017-09-07 00:27:04 -04:00
Mike J Innes
a93c440c1e style 2017-09-06 23:09:39 -04:00
Mike J Innes
cca4d25a10 efficient traversal 2017-09-06 23:09:32 -04:00
Mike J Innes
7cfc42d166 grad refactor 2017-09-06 21:21:35 -04:00
Mike J Innes
3ef72a9d7b utils updates 2017-09-06 18:59:43 -04:00
Mike J Innes
4083c34547 seq stuff 2017-09-06 18:59:36 -04:00
Mike J Innes
1855a37319 onehot 2017-09-06 18:58:55 -04:00
Mike J Innes
d7e3f7d6e1 fix stack/squeeze usage 2017-09-06 16:02:38 -04:00
Mike J Innes
1946c46e29 basic seq functionality 2017-09-06 14:03:25 -04:00
Mike J Innes
2c8b7bc64b remove these for now 2017-09-06 14:03:12 -04:00
Mike J Innes
4c12d18033 yet another vcat method 2017-09-05 19:25:42 -04:00
Mike J Innes
c95e9376a5 constructors 2017-09-05 19:25:34 -04:00
Mike J Innes
b023da1b7d lstm initialisation 2017-09-05 02:42:32 -04:00
Mike J Innes
61de692b50 lstm nonzero hidden state 2017-09-05 02:37:48 -04:00
Mike J Innes
ec02f1fabd batching in rnns 2017-09-05 02:29:31 -04:00
Mike J Innes
830d7fa611 vcat fix 2017-09-05 02:28:11 -04:00
Mike J Innes
363caeddc6 repmat forward 2017-09-05 02:12:53 -04:00
Mike J Innes
a322c07fc8 vcat back 2017-09-05 02:11:28 -04:00
Mike J Innes
788d7d35f0 better numeric grads 2017-09-03 17:10:35 -04:00
Mike J Innes
8f4ccdd5ba scalar getindex backprop 2017-09-03 17:10:23 -04:00
Mike J Innes
47ba702747 tweak optimiser interface 2017-09-03 17:10:04 -04:00
Mike J Innes
d4211b1f23 sgd export 2017-09-03 17:09:53 -04:00
Mike J Innes
f33a8edd25 meh 2017-09-03 02:45:46 -04:00
Mike J Innes
e57ae77bbb juno progress 2017-09-03 02:44:32 -04:00
Mike J Innes
bd5822fd71 cleaner lstm 2017-09-03 02:24:47 -04:00
Mike J Innes
cf58748680 nicer trackedarray type printing 2017-09-03 02:12:54 -04:00
Mike J Innes
9642ae8cd6 basic recurrence 2017-09-03 02:12:44 -04:00
Mike J Innes
f6771b98cd clearer name for dense 2017-09-02 16:50:11 -04:00
Mike J Innes
fe2b35facc add callbacks back 2017-09-01 23:59:44 -04:00
Mike J Innes
bf098d551c fuck 2017-09-01 23:41:44 -04:00
Mike J Innes
107d9daa8f add some non-differentiable functions 2017-09-01 23:33:05 -04:00
Mike J Innes
387686eb41 optimisers rework 2017-09-01 17:06:51 -04:00
Mike J Innes
892a779ed1 tracked transpose 2017-09-01 11:42:18 -04:00
Mike J Innes
b95dae1868 opt refactor 2017-08-31 14:55:23 -04:00
Mike J Innes
7cd13789dd fix removed import 2017-08-29 17:14:01 -04:00
ylxdzsw
97ecb26003 wip optimisers 2017-08-29 17:00:24 -04:00
Mike J Innes
7bba38274b Merge branch 'master' of github.com:MikeInnes/Flux.jl 2017-08-28 01:41:11 +01:00
Mike J Innes
0b89e1374c gpu-friendly 2017-08-28 01:40:59 +01:00
Mike J Innes
73166c52a0 cleaner broadcasting fix 2017-08-27 09:49:42 +01:00
Mike J Innes
12dc6b66c5 whoops 2017-08-24 22:23:05 +01:00
Mike J Innes
52f5f4a4c0 initial cuarrays integration 2017-08-24 17:00:48 +01:00
Mike J Innes
e7f26370d7 training tweaks 2017-08-24 16:10:04 +01:00
Mike J Innes
1526b13691 basic training loop 2017-08-24 11:42:29 +01:00
Mike J Innes
9ce0439943 better mse 2017-08-24 11:40:51 +01:00
Mike J Innes
23690e0083 not useful enough 2017-08-24 11:40:19 +01:00
Mike J Innes
d162e028bb utility method 2017-08-23 17:50:49 +01:00
Mike J Innes
e4e9794f5e loss function gradients 2017-08-23 17:50:43 +01:00
Mike J Innes
60c3090981 broadcasting fix 2017-08-23 17:21:02 +01:00
Mike J Innes
23c5a1b163 softmax gradient 2017-08-23 02:03:17 +01:00
Mike J Innes
5eee653a64 gradient checks 2017-08-23 01:43:45 +01:00
Mike J Innes
56ed6f5680 de-broadcasting 2017-08-23 00:25:19 +01:00
Mike J Innes
bafecfede1 sgd 2017-08-22 22:25:18 +01:00
Mike J Innes
f2dd7b0e90 fix include case 2017-08-22 17:18:27 +01:00
Mike J Innes
0ce8c0cee4 param collection 2017-08-22 17:13:03 +01:00
Mike J Innes
1179269355 remove old params 2017-08-22 15:21:08 +01:00