Mike J Innes | 5f27e30e68 | basic line node handling | 2016-12-20 15:44:00 +00:00
Mike J Innes | 42ce2fadf1 | don't do + twice | 2016-12-15 23:08:56 +00:00
Mike J Innes | 6114b70f76 | use regular + | 2016-12-15 22:57:36 +00:00
Mike J Innes | a330b394bd | move batchone util | 2016-12-15 21:37:07 +00:00
Mike J Innes | 03840d043c | fix ops | 2016-12-15 20:53:15 +00:00
Mike J Innes | c6fb9c1f0c | fix model call | 2016-12-15 18:35:11 +00:00
Mike J Innes | 1b22d55401 | fix param interpretation | 2016-12-13 15:46:34 +00:00
Mike J Innes | 2aa8dfc208 | tweak constants approach | 2016-11-17 11:28:24 +00:00
Mike J Innes | 1424b75e78 | extra ops | 2016-11-17 11:28:15 +00:00
Mike J Innes | a6169ec2d0 | stateless model support | 2016-11-15 23:54:08 +00:00
Mike J Innes | 2d90d04789 | explicit hidden state batching | 2016-11-15 23:44:11 +00:00
Mike J Innes | 3c068744d2 | get rid of Group | 2016-11-15 21:09:58 +00:00
Mike J Innes | c654fe403a | move towards abstract interpreter model | 2016-11-13 20:27:20 +00:00
Mike J Innes | 6ac4dd8429 | move op | 2016-11-13 18:16:58 +00:00
Mike J Innes | c3d32c395c | new progress integration | 2016-11-04 22:28:12 +00:00
Mike J Innes | d7d95feab8 | actually get GRU working | 2016-11-02 00:36:13 +00:00
Mike J Innes | 53ebb5051a | Flow -> DataFlow | 2016-10-31 12:38:18 +00:00
Mike J Innes | a6fe1f3810 | use new session and store params back in the model | 2016-10-30 15:08:50 +00:00
Mike J Innes | b443425c6d | cross entropy loss, loss checks | 2016-10-30 14:12:03 +00:00
Mike J Innes | 3b70ea6a42 | split out makesession | 2016-10-30 12:29:00 +00:00
Mike J Innes | e433ffce8f | split out makesession logic | 2016-10-30 12:10:44 +00:00
Mike J Innes | a99bb03830 | gradients are slow | 2016-10-30 10:55:07 +00:00
Mike J Innes | ec1950b466 | .* in tf | 2016-10-30 10:54:55 +00:00
Mike J Innes | a1b1d87767 | update states references | 2016-10-30 01:58:39 +01:00
Mike J Innes | 1761e43bc4 | handle state in training | 2016-10-30 00:24:29 +01:00
Mike J Innes | 605e3a9363 | don't rebatch batches | 2016-10-30 00:20:15 +01:00
Mike J Innes | 73ff5b4201 | batched training for char-rnn | 2016-10-29 23:36:39 +01:00
Mike J Innes | 4de16171db | basic sequence model training | 2016-10-29 00:10:27 +01:00
Mike J Innes | d9ed5676c2 | handle state on julia side | 2016-10-28 21:17:48 +01:00
Mike J Innes | e450a585b7 | handling of multiple outputs | 2016-10-28 20:50:27 +01:00
Mike J Innes | 1c6eaece5d | rename seqmodel | 2016-10-28 19:11:38 +01:00
Mike J Innes | 102e09d14b | tf recurrent models | 2016-10-28 17:14:57 +01:00
Mike J Innes | 217e28653a | tf cycle conversion error | 2016-10-28 17:12:19 +01:00
Mike J Innes | c5a64391a1 | use batching api | 2016-10-28 17:00:31 +01:00
Mike J Innes | 2852dddf0f | put this back | 2016-10-28 16:26:06 +01:00
Mike J Innes | 8140c2312d | todone | 2016-10-28 16:25:59 +01:00
Mike J Innes | d6eacf3375 | better handling for reused params | 2016-10-28 16:06:56 +01:00
Mike J Innes | 740d868ef9 | tf.model refactor | 2016-10-28 15:13:58 +01:00
Mike J Innes | 27aa2bf8d4 | graph support | 2016-10-28 15:13:43 +01:00
Mike J Innes | 0ad569596b | tf reorg | 2016-10-26 14:25:10 +01:00
Mike J Innes | eb78f67a93 | refactor input model | 2016-10-25 23:10:35 +01:00
Mike J Innes | 91a62a04bc | update for flow exports | 2016-10-25 21:32:51 +01:00
Mike J Innes | dea85df8b7 | use param object rather than named input | 2016-10-25 17:57:20 +01:00
Mike J Innes | a06145a145 | use new batching approach in TensorFlow | 2016-10-25 16:21:17 +01:00
Mike J Innes | c9f9665e4e | move batching logic | 2016-10-12 17:07:22 +01:00
Mike J Innes | a56af5d16e | reshape layer | 2016-10-10 23:48:16 +01:00
Mike J Innes | 4961bf72af | updates | 2016-10-10 23:04:26 +01:00
Mike J Innes | 45d30312b6 | tf flatten | 2016-10-04 22:50:20 +01:00
Mike J Innes | bf04b70ad1 | Float32 by default | 2016-10-04 22:36:56 +01:00
Mike J Innes | 9e9c57d49b | more TF support | 2016-10-04 22:23:37 +01:00