Commit Graph

995 Commits

Author SHA1 Message Date
Mike J Innes
23096824d5 import jacobian 2017-12-13 17:29:32 +00:00
Mike J Innes
9c7c9d2342
Merge pull request #124 from baggepinnen/jacobian
Add jacobian function
2017-12-13 17:07:57 +00:00
Mike J Innes
95d1287455 Merge branch 'master' into jacobian 2017-12-13 17:06:23 +00:00
Mike J Innes
27d896943e
Merge pull request #120 from staticfloat/sf/dense_initialization
Better default initialization for Dense layers
2017-12-13 16:18:02 +00:00
Mike J Innes
9926202408
Merge pull request #122 from staticfloat/sf/weighted_crossentropy
Add `weighted_crossentropy` for imbalanced classification problems
2017-12-13 15:29:54 +00:00
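For context, a minimal sketch of what a class-weighted cross-entropy can look like (an illustration only; the exact helper added in #122 may differ, and the weight later became a keyword argument, per the "use kwarg" commit below):

```julia
# Sketch: cross-entropy with a per-class weight vector, so that rare
# classes contribute more to the loss. ŷ and y are class × batch matrices
# (predicted probabilities / one-hot targets); w has one entry per class,
# e.g. inverse class frequencies.
weighted_crossentropy(ŷ, y, w) = -sum(y .* log.(ŷ) .* w) / size(y, 2)
```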
Mike J Innes
e3a688e706 use kwarg 2017-12-13 15:27:15 +00:00
Mike J Innes
128725cefd Merge branch 'master' into sf/weighted_crossentropy 2017-12-13 15:14:47 +00:00
Mike J Innes
29787eba45 fixes #114 2017-12-12 17:23:15 +00:00
Mike J Innes
b7b6c975bc fixes #110 2017-12-12 17:07:39 +00:00
Mike J Innes
86097e76fd tweak batchnorm example 2017-12-08 19:34:34 +00:00
Mike J Innes
de69d23901
Merge pull request #84 from iblis17/norm-layer
layer: implement BatchNorm layer
2017-12-08 19:32:55 +00:00
Mike J Innes
6f997e798a Merge branch 'master' into batchnorm 2017-12-08 19:31:50 +00:00
Mike J Innes
e01c706e71
Merge pull request #119 from baggepinnen/amsgrad
Amsgrad
2017-12-08 18:24:54 +00:00
Mike J Innes
55bbe50f32 regression test 2017-12-08 18:24:07 +00:00
Mike J Innes
24a6569589 Merge branch 'master' into amsgrad 2017-12-08 18:20:53 +00:00
Mike J Innes
9c61cf61ef
Merge pull request #94 from CarloLucibello/dropout
improve optimizers
2017-12-08 17:14:54 +00:00
Mike J Innes
69cc5642b4 regression testing 2017-12-08 17:10:29 +00:00
Mike J Innes
f82dbf4798 Merge branch 'master' into HEAD 2017-12-08 17:00:31 +00:00
Mike J Innes
951c21366a fix regex 2017-12-08 16:42:30 +00:00
baggepinnen
385dee9d16 Add jacobian function 2017-12-08 14:46:12 +01:00
Elliot Saba
41446d547f Add weighted_crossentropy for imbalanced classification problems 2017-12-05 17:09:05 -08:00
Elliot Saba
c59b820bed Add glorot (Xavier) initialization
Set default `Dense` and `RNN` inits to `glorot_uniform()` for `W`, `zeros` for `b`.
2017-12-05 14:24:48 -08:00
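A minimal sketch of Glorot (Xavier) uniform initialization as described above (illustrative; the helper in #120 may differ in detail): sample `W` uniformly from ±√(6/(fan_in + fan_out)) and start `b` at zero.

```julia
# Uniform on (-√(6/n), √(6/n)) with n = fan_in + fan_out = sum(dims).
glorot_uniform(dims...) = (rand(dims...) .- 0.5) .* sqrt(24.0 / sum(dims))

W = glorot_uniform(10, 100)   # weights of a 100 → 10 Dense layer
b = zeros(10)                 # biases start at zero
```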
baggepinnen
41febee9c1 Export and indent 2017-12-04 09:34:27 +01:00
baggepinnen
36001d085a Implement AMSGrad optimiser 2017-12-04 09:17:05 +01:00
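A hedged sketch of the AMSGrad update rule the commit above implements (the Reddi et al. formulation, without bias correction; not necessarily the exact code in #119): it is ADAM, except the denominator uses the running maximum of the second-moment estimate.

```julia
function amsgrad!(θ, g, m, v, v̂; η = 0.001, β1 = 0.9, β2 = 0.999, ϵ = 1e-8)
    @. m  = β1 * m + (1 - β1) * g      # first-moment estimate
    @. v  = β2 * v + (1 - β2) * g^2    # second-moment estimate
    @. v̂  = max(v̂, v)                  # keep the running maximum
    @. θ -= η * m / (√v̂ + ϵ)           # update uses v̂, not v
    return θ
end
```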
Mike J Innes
cab235a578 gpu compat 2017-11-30 13:51:31 +00:00
Mike J Innes
19039f4881 export sigmoid 2017-11-30 13:37:38 +00:00
Mike J Innes
2d33f19346 onehot unk arg 2017-11-29 16:45:50 +00:00
CarloLucibello
13b934c250 improve optimizers 2017-11-24 12:12:20 +01:00
Mike J Innes
dc1f08a709
Merge pull request #98 from FluxML/log
GPU-ready log function
2017-11-23 17:17:39 +00:00
Mike J Innes
9f5c4dd3e9
Merge pull request #104 from baggepinnen/patch-1
Allow array of optimisers to train!
2017-11-21 17:16:35 +01:00
Mike J Innes
feb35783e6
Merge pull request #95 from FluxML/layernorm
Layer Normalisation
2017-11-21 17:12:49 +01:00
Mike J Innes
351d3d4771 std derivative 2017-11-21 17:04:04 +01:00
Mike J Innes
b06884b912 LayerNorm tweaks 2017-11-21 16:32:36 +01:00
skariel
11d53781b2 adding layer normalization 2017-11-21 16:30:24 +01:00
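A minimal sketch of the normalisation step behind PR #95 (illustrative only; the actual layer also learns a diagonal scale and shift): normalise each column of a features × batch matrix to zero mean and unit variance.

```julia
using Statistics: mean, std

layernorm(x; ϵ = 1e-5) = (x .- mean(x, dims = 1)) ./ (std(x, dims = 1) .+ ϵ)
```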
Mike J Innes
979949d01a style 2017-11-21 15:25:09 +01:00
Mike J Innes
785fbcf68e
Merge pull request #107 from baggepinnen/patch-2
Fix bug in rmsprop and adadelta
2017-11-21 15:24:11 +01:00
Mike J Innes
e51268caf5 mention treelike 2017-11-21 12:59:39 +01:00
Mike J Innes
187fddc11c doc fixes 2017-11-21 12:29:02 +01:00
Fredrik Bagge Carlson
8991ce028c
Fix bug in rmsprop and adadelta
`@. p.Δ = η * p.Δ / √acc` gives the intended step, while `@. p.Δ /= √acc*η` divides by the whole right-hand side, i.e. `@. p.Δ = p.Δ / (√acc*η)`, so the step size was effectively `1/η` rather than `η`
2017-11-14 17:32:16 +01:00
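A small standalone demonstration of the point above (a sketch, not code from the patch): `/=` divides by the whole right-hand side, so `η` ends up in the denominator.

```julia
η, acc = 0.1, fill(4.0, 3)
Δ = ones(3)

intended = @. η * Δ / √acc     # 0.05 each: step scaled by η

buggy = copy(Δ)
@. buggy /= √acc * η           # Δ = Δ / (√acc * η) → 5.0 each: scaled by 1/η
```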
Mike J Innes
e0657d93ec mv numeric.jl to nnlib 2017-11-09 15:06:29 +00:00
Mike J Innes
2cb94981a0 gpu-ready log 2017-11-09 15:04:01 +00:00
Mike J Innes
e5d99d784e fixes #79 2017-11-09 14:53:26 +00:00
Mike J Innes
bdf02e42ae test tweaks 2017-11-08 22:18:45 +00:00
Mike J Innes
fcd091e8f0 Ac_mul_B derivatives 2017-11-08 22:18:45 +00:00
Mike J Innes
d4229c4815 useful params method 2017-11-08 22:18:45 +00:00
Mike J Innes
d6423eefe5 matrix-vector fast path 2017-11-08 22:18:45 +00:00
Fredrik Bagge Carlson
97244e0a68
Allow array of optimisers to train!
This allows an array of optimisers to be sent to `train!`
2017-11-04 13:27:32 +01:00
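A hedged sketch of how such support can work (assuming the 2017-era callable-optimiser API, where an optimiser is a zero-argument update function; not necessarily the exact patch in #104): a vector of optimisers is simply wrapped into one callable that runs each in turn.

```julia
# If `train!` calls `opt()` after every gradient step, accepting a vector
# only needs a wrapper that invokes each optimiser callback in order.
runall(opt) = opt
runall(opts::AbstractVector) = () -> foreach(o -> o(), opts)

# Usage sketch: combine per-model optimisers and hand them to train! together.
# opt = runall([SGD(params(encoder), 0.01), ADAM(params(decoder))])
```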
Mike J Innes
efa51f02e7 basic batch type 2017-11-02 11:49:42 +00:00
Mike J Innes
21ea93ffcd rename treelike 2017-11-02 11:47:34 +00:00
Iblis Lin
6c7613e02b batchnorm: leverage TrackedArray mean 2017-11-02 14:20:34 +08:00