Mike J Innes | 73ae25289d | remove old util | 2017-12-15 16:18:01 +00:00
Mike J Innes | 6890a61587 | todo | 2017-12-15 16:17:45 +00:00
Mike J Innes | 9b833a4345 | more onehot indexing | 2017-12-15 16:17:39 +00:00
Mike J Innes | 9d0dd9fb7e | layer wip | 2017-12-15 13:22:57 +00:00
Mike J Innes | 0bf22dfb8e | pool gradients | 2017-12-15 02:29:14 +00:00
Mike J Innes | d949b31aa5 | conv gradient | 2017-12-15 02:24:32 +00:00
Mike J Innes | 5b97d2ba04 | closes #127 | 2017-12-13 18:24:56 +00:00
Mike J Innes | 23096824d5 | import jacobian | 2017-12-13 17:29:32 +00:00
Mike J Innes | 9c7c9d2342 | Merge pull request #124 from baggepinnen/jacobian: Add jacobian function | 2017-12-13 17:07:57 +00:00
Mike J Innes | 95d1287455 | Merge branch 'master' into jacobian | 2017-12-13 17:06:23 +00:00
Mike J Innes | 27d896943e | Merge pull request #120 from staticfloat/sf/dense_initialization: Better default initialization for Dense layers | 2017-12-13 16:18:02 +00:00
Mike J Innes | 9926202408 | Merge pull request #122 from staticfloat/sf/weighted_crossentropy: Add `weighted_crossentropy` for imbalanced classification problems | 2017-12-13 15:29:54 +00:00
Mike J Innes | e3a688e706 | use kwarg | 2017-12-13 15:27:15 +00:00
Mike J Innes | 128725cefd | Merge branch 'master' into sf/weighted_crossentropy | 2017-12-13 15:14:47 +00:00
Mike J Innes | 29787eba45 | fixes #114 | 2017-12-12 17:23:15 +00:00
Mike J Innes | b7b6c975bc | fixes #110 | 2017-12-12 17:07:39 +00:00
Mike J Innes | 403cc26327 | Merge branch 'master' into gru | 2017-12-12 16:54:00 +00:00
Mike J Innes | 86097e76fd | tweak batchnorm example | 2017-12-08 19:34:34 +00:00
Mike J Innes | de69d23901 | Merge pull request #84 from iblis17/norm-layer: layer: implement BatchNorm layer | 2017-12-08 19:32:55 +00:00
Mike J Innes | 6f997e798a | Merge branch 'master' into batchnorm | 2017-12-08 19:31:50 +00:00
Mike J Innes | 1d916c81b5 | Merge branch 'master' into HEAD | 2017-12-08 18:31:55 +00:00
Mike J Innes | e01c706e71 | Merge pull request #119 from baggepinnen/amsgrad: Amsgrad | 2017-12-08 18:24:54 +00:00
Mike J Innes | 55bbe50f32 | regression test | 2017-12-08 18:24:07 +00:00
Mike J Innes | 24a6569589 | Merge branch 'master' into amsgrad | 2017-12-08 18:20:53 +00:00
Mike J Innes | 9c61cf61ef | Merge pull request #94 from CarloLucibello/dropout: improve optimizers | 2017-12-08 17:14:54 +00:00
Mike J Innes | 69cc5642b4 | regression testing | 2017-12-08 17:10:29 +00:00
Mike J Innes | f82dbf4798 | Merge branch 'master' into HEAD | 2017-12-08 17:00:31 +00:00
Mike J Innes | 951c21366a | fix regex | 2017-12-08 16:42:30 +00:00
GenaBitu | 7e51418679 | Added back for multi-parameter vcat | 2017-12-08 16:10:09 +01:00
baggepinnen | 385dee9d16 | Add jacobian function | 2017-12-08 14:46:12 +01:00
GenaBitu | 41f3eedc39 | Proper multi-variable vcat | 2017-12-07 17:50:18 +01:00
Elliot Saba | 41446d547f | Add weighted_crossentropy for imbalanced classification problems | 2017-12-05 17:09:05 -08:00
Elliot Saba | c59b820bed | Add glorot (Xavier) initialization: Set default `Dense` and `RNN` inits to `glorot_uniform()` for `W`, `zeros` for `b`. | 2017-12-05 14:24:48 -08:00
GenaBitu | 62b3600eca | Merge branch 'master' into cat-fix | 2017-12-05 11:13:29 +01:00
baggepinnen | 41febee9c1 | Export and indent | 2017-12-04 09:34:27 +01:00
baggepinnen | 36001d085a | Implement AMSGrad optimiser | 2017-12-04 09:17:05 +01:00
Mike J Innes | cab235a578 | gpu compat | 2017-11-30 13:51:31 +00:00
Mike J Innes | 19039f4881 | export sigmoid | 2017-11-30 13:37:38 +00:00
Mike J Innes | 2d33f19346 | onehot unk arg | 2017-11-29 16:45:50 +00:00
baggepinnen | fa718c7475 | Implement Gated Recurrent Unit | 2017-11-24 14:33:06 +01:00
CarloLucibello | 13b934c250 | improve optimizers | 2017-11-24 12:12:20 +01:00
Mike J Innes | dc1f08a709 | Merge pull request #98 from FluxML/log: GPU-ready log function | 2017-11-23 17:17:39 +00:00
Mike J Innes | 9f5c4dd3e9 | Merge pull request #104 from baggepinnen/patch-1: Allow array of optimisers to train! | 2017-11-21 17:16:35 +01:00
Mike J Innes | feb35783e6 | Merge pull request #95 from FluxML/layernorm: Layer Normalisation | 2017-11-21 17:12:49 +01:00
Mike J Innes | 351d3d4771 | std derivative | 2017-11-21 17:04:04 +01:00
Mike J Innes | b06884b912 | LayerNorm tweaks | 2017-11-21 16:32:36 +01:00
skariel | 11d53781b2 | adding layer normalization | 2017-11-21 16:30:24 +01:00
Mike J Innes | 979949d01a | style | 2017-11-21 15:25:09 +01:00
Mike J Innes | 785fbcf68e | Merge pull request #107 from baggepinnen/patch-2: Fix bug in rmsprop and adadelta | 2017-11-21 15:24:11 +01:00
Mike J Innes | e51268caf5 | mention treelike | 2017-11-21 12:59:39 +01:00