e04dcbd460 | reorganise recurrent stuff | Mike J Innes | 2017-05-04 10:45:44 +01:00
36f4103d7d | cost is really a layer | Mike J Innes | 2017-05-01 12:30:28 +01:00
14afe54143 | fixes for recurrent networks | Mike J Innes | 2017-04-19 17:17:37 +01:00
90edefe072 | use broadcasting plus | Mike J Innes | 2017-03-21 01:18:00 +00:00
eadf96605c | fix recurrent layer batching | Mike J Innes | 2017-03-20 23:11:08 +00:00
6756ce7528 | relu inferrence | Mike J Innes | 2017-03-20 19:57:11 +00:00
d73e962da9 | replace old shape inference system | Mike J Innes | 2017-03-17 16:34:54 +00:00
2adc3cd18e | new struct syntax | Mike J Innes | 2017-03-14 17:56:03 +00:00
6751657dfa | typealias / abstract deps | Mike J Innes | 2017-03-14 16:51:31 +00:00
c4d815b5fc | move todo | Mike J Innes | 2017-03-09 00:12:49 +00:00
6a3bed1e61 | this is pretty useless | Mike J Innes | 2017-03-08 15:38:55 +00:00
06f2ee2284 | iterate over chain | Mike J Innes | 2017-03-07 14:37:37 +00:00
2a57150bce | AvgPool shim | Mike J Innes | 2017-03-06 17:21:35 +00:00
4d4979b401 | better alternative to basemodel | Mike J Innes | 2017-02-28 16:42:48 +00:00
5f1f2ebaa2 | model storage notes | Mike J Innes | 2017-02-28 16:41:33 +00:00
3fdffea37d | fix | Mike J Innes | 2017-02-20 21:50:01 +00:00
d7fe525f4d | fix the build | Mike J Innes | 2017-01-16 01:21:45 +01:00
c82716b535 | move activations | Mike J Innes | 2016-12-15 23:24:10 +00:00
6114b70f76 | use regular + | Mike J Innes | 2016-12-15 22:57:36 +00:00
62fd13bded | consistently use delta for gradients | Mike J Innes | 2016-12-15 21:37:39 +00:00
4b64bf11a5 | fix lstm | Mike J Innes | 2016-12-15 20:53:08 +00:00
f31b539566 | make these fit with julia semantics | Mike J Innes | 2016-11-15 16:40:17 +00:00
9062792495 | rename affine file | Mike J Innes | 2016-11-15 00:09:53 +00:00
bdd05157e2 | dense -> affine | Mike J Innes | 2016-11-14 22:16:00 +00:00
e4a6ca5f9e | remove custom show | Mike J Innes | 2016-11-13 15:35:20 +00:00
ad6e6b4116 | update recurrent usage | Mike J Innes | 2016-11-08 18:08:13 +00:00
d7d95feab8 | actually get GRU working | Mike J Innes | 2016-11-02 00:36:13 +00:00
85415d4244 | throw GRU together | Mike J Innes | 2016-11-01 14:42:41 +00:00
7cd94b4a5d | well, that was easy 😎 | Mike J Innes | 2016-10-31 11:01:19 +00:00
508364407e | simplify recurrent layer | Mike J Innes | 2016-10-30 16:07:18 +00:00
81d9743836 | export recurrent | Mike J Innes | 2016-10-30 11:41:52 +00:00
89c4a6df31 | this is no longer test code | Mike J Innes | 2016-10-29 00:13:32 +01:00
eb78f67a93 | refactor input model | Mike J Innes | 2016-10-25 23:10:35 +01:00
b115d8ce3f | model -> net | Mike J Innes | 2016-10-12 16:28:16 +01:00
a56af5d16e | reshape layer | Mike J Innes | 2016-10-10 23:48:16 +01:00
438dc9d40a | fix conv2d shape inference | Mike J Innes | 2016-10-10 23:20:40 +01:00
bf04b70ad1 | Float32 by default | Mike J Innes | 2016-10-04 22:36:56 +01:00
cc1ca4c3c2 | Conv2D tweaks | Mike J Innes | 2016-10-04 22:23:26 +01:00
2609d47ce9 | work more nicely with TF batching | Mike J Innes | 2016-10-04 21:10:50 +01:00
8961b4c10f | basic convnet example working | Mike J Innes | 2016-09-06 18:11:15 +01:00
6503496c39 | improve printing | Mike J Innes | 2016-09-06 18:11:14 +01:00
d58fefb972 | tweak note | Mike J Innes | 2016-09-06 18:11:14 +01:00
19b5e8bd21 | loop lifting | Mike J Innes | 2016-09-06 18:11:14 +01:00
c92cff5dce | a bunch of stuff | Mike J Innes | 2016-09-06 18:11:05 +01:00
afac5d8bfe | better default init | Mike J Innes | 2016-09-06 18:10:21 +01:00
fd67383494 | don't print reams of data | Mike J Innes | 2016-09-06 18:10:21 +01:00
2635283bf1 | small reorg | Mike J Innes | 2016-09-06 18:10:20 +01:00
b8565a4cc3 | update api | Mike J Innes | 2016-09-06 18:10:20 +01:00
6808a92793 | anonymous models | Mike J Innes | 2016-09-06 18:10:20 +01:00
545d4480ed | tweaks | Mike J Innes | 2016-09-06 18:10:20 +01:00