Commit Graph

58 Commits

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| Robert Luciani | 252e34e173 | 1.0+ updates - indices to axes, Vector init with undef | 2018-10-02 21:39:00 +02:00 |
| pevnak | 3510c837a8 | zeros replaced by zero | 2018-08-03 15:14:25 +01:00 |
| pevnak | e98538673a | updated sum to be compliant with latest beta. Removed some depwarns | 2018-08-03 15:14:25 +01:00 |
| Mike J Innes | 88a265154c | deprecations | 2018-08-03 12:54:31 +01:00 |
| Mike Innes | 5153cde847 | move epochs | 2018-03-05 22:56:22 +00:00 |
| Mike J Innes | 7606b1a399 | single-batch convolution | 2018-02-28 14:25:32 +00:00 |
| Mike J Innes | e5791bc5f6 | frequencies utility | 2018-02-17 11:19:51 +00:00 |
| Mike J Innes | 5e861101f3 | epochs util | 2018-02-16 11:17:57 +00:00 |
| Mike J Innes | 066cb45a38 | remove old accuracy fn | 2018-02-13 11:12:21 +00:00 |
| Mike J Innes | 73ae25289d | remove old util | 2017-12-15 16:18:01 +00:00 |
| Mike J Innes | 5b97d2ba04 | closes #127 | 2017-12-13 18:24:56 +00:00 |
| Mike J Innes | 95d1287455 | Merge branch 'master' into jacobian | 2017-12-13 17:06:23 +00:00 |
| baggepinnen | 385dee9d16 | Add jacobian function | 2017-12-08 14:46:12 +01:00 |
| Elliot Saba | c59b820bed | Add glorot (Xavier) initialization. Set default `Dense` and `RNN` inits to `glorot_uniform()` for `W`, `zeros` for `b`. | 2017-12-05 14:24:48 -08:00 |
| Mike J Innes | c8d4844da4 | chunk util | 2017-10-18 17:07:58 +01:00 |
| Mike J Innes | e82428bb83 | batching docs | 2017-10-18 16:40:14 +01:00 |
| Mike J Innes | 9a155abecd | batch and batchseq apis | 2017-10-15 23:44:40 +01:00 |
| Mike J Innes | c80fb999ff | one hot docs | 2017-09-11 13:40:11 +01:00 |
| Mike J Innes | 1855a37319 | onehot | 2017-09-06 18:58:55 -04:00 |
| Mike J Innes | 2c8b7bc64b | remove these for now | 2017-09-06 14:03:12 -04:00 |
| Mike J Innes | f33a8edd25 | meh | 2017-09-03 02:45:46 -04:00 |
| Mike J Innes | e7f26370d7 | training tweaks | 2017-08-24 16:10:04 +01:00 |
| Mike J Innes | 4a9dc40e7c | simplify organisation | 2017-08-19 20:52:29 +01:00 |
| Mike J Innes | bd6bffde48 | silo the compiler | 2017-08-19 20:04:21 +01:00 |
| Mike J Innes | e79a1657d4 | remove batching and training | 2017-08-18 01:19:06 +01:00 |
| Mike J Innes | ddcd576a74 | give up and use AbstractArray | 2017-06-05 16:09:06 +01:00 |
| Mike J Innes | 4685d2e672 | strip down non-obvious exports | 2017-06-05 15:47:26 +01:00 |
| Mike J Innes | 215e997540 | broadcastto | 2017-06-02 15:02:30 +01:00 |
| Mike J Innes | b54281bdea | hadamard product | 2017-06-01 19:27:46 +01:00 |
| Tony Kelman | 41ea071f3a | Use a package-local squeeze function instead of extending Base | 2017-05-22 04:08:46 -04:00 |
| Tony Kelman | 5cbb47a13d | Don't extend base functions on base types; better broadcast syntax | 2017-05-22 04:05:57 -04:00 |
| Mike J Innes | c025cddc73 | runmodel no longer needed | 2017-05-04 10:32:53 +01:00 |
| Mike J Innes | 357f989de5 | pull out tuple utils | 2017-05-01 16:57:51 +01:00 |
| Mike J Innes | 6778d00dbf | this is no longer specific to training | 2017-05-01 13:46:23 +01:00 |
| Mike J Innes | 38852964f6 | organise training and utils | 2017-05-01 12:41:54 +01:00 |
| Mike J Innes | ef4ec5be4b | customisable loss | 2017-04-28 17:14:21 +01:00 |
| Mike J Innes | 63b328142a | print epochs again | 2017-04-27 17:43:38 +01:00 |
| Mike J Innes | edfb0211e6 | better for nested batches | 2017-04-19 17:18:40 +01:00 |
| Mike J Innes | 42688f8aa8 | update training process, mnist example | 2017-04-19 14:23:48 +01:00 |
| Mike J Innes | 5357b1e9f9 | remove fake batching vestiges | 2017-04-19 13:19:18 +01:00 |
| Mike J Innes | f3a9934858 | update mnist example | 2017-02-02 10:09:41 +05:30 |
| Mike J Innes | a71c79e920 | convert parameters also | 2017-01-27 16:02:52 +05:30 |
| Mike J Innes | 62fd13bded | consistently use delta for gradients | 2016-12-15 21:37:39 +00:00 |
| Mike J Innes | 4517e41226 | sampling + tweaks | 2016-10-30 16:07:29 +00:00 |
| Mike J Innes | 73ff5b4201 | batched training for char-rnn | 2016-10-29 23:36:39 +01:00 |
| Mike J Innes | d9abb8f0ce | chunks util | 2016-10-28 21:47:57 +01:00 |
| Mike J Innes | d442dd8c5b | use Float32 here | 2016-10-25 16:23:04 +01:00 |
| Mike J Innes | bf04b70ad1 | Float32 by default | 2016-10-04 22:36:56 +01:00 |
| Mike J Innes | a2aade718d | get basic training working | 2016-09-29 20:50:43 +01:00 |
| Mike J Innes | 62ede8cd80 | use Juno progress bar | 2016-09-06 18:37:39 +01:00 |