Commit Graph

1569 Commits

Author SHA1 Message Date
Mike J Innes
43c5f90d93
Merge pull request #379 from dhairyagandhi96/master
New optimisers interface
2018-10-31 16:38:40 +00:00
Mike J Innes
46049b9f44 tweak update rule 2018-10-31 16:08:18 +00:00
Mike J Innes
554c4c7c7a return Params from params 2018-10-31 15:50:08 +00:00
Mike J Innes
4a54d30cbf correct SGD deprecation 2018-10-31 15:30:30 +00:00
Mike J Innes
bffaceee02 tweaks 2018-10-31 14:58:55 +00:00
Mike J Innes
70283e1971
Merge pull request #465 from FluxML/mji/once
Destroy AD graph when doing in-place gradients
2018-10-31 14:14:38 +00:00
Mike J Innes
9312536b96
Merge pull request #461 from Roger-luo/roger-patch-1
Support view for TrackedArray
2018-10-30 15:24:05 +00:00
Mike J Innes
77178b7d67 remove old-style definition and test 2018-10-30 14:21:22 +00:00
Avik Pal
7804d980b2
Update cudnn.jl 2018-10-30 01:08:21 +05:30
Dhairya Gandhi
bebf4eb95f fixed ExpDecay update! rule 2018-10-29 23:12:24 +05:30
Keno Fischer
baf868e851
Add VERSION check around broadcast piracy 2018-10-28 16:07:26 -04:00
Dhairya Gandhi
32ce2d78b8 fixed ExpDecay test 2018-10-27 19:53:06 +05:30
Dhairya Gandhi
815e8c206d decay fixes 2018-10-27 19:26:42 +05:30
Mike J Innes
b77433cdfd 0.7 fix 2018-10-27 12:23:14 +01:00
Eric Davies
9f9803eec6 Add new-style diagm to tracker 2018-10-26 14:44:59 -05:00
Roger-luo
e5d58699e6 fix and add test 2018-10-26 14:06:17 -04:00
Mike J Innes
c21d768b7c destroy AD graph when doing in-place gradients 2018-10-26 16:57:19 +01:00
Tejan Karmali
a657c287d0 in accordance with conv_filter api 2018-10-26 11:31:34 -04:00
Avik Pal
b838c0bc04 Update the libcudnn_handle 2018-10-26 10:24:30 +05:30
Roger-luo
a3cda9016c apply Mike's change 2018-10-25 13:48:33 -04:00
Roger-luo
5f99e5775a fix #458 2018-10-24 15:40:10 -04:00
Tejan Karmali
387df8c095 conv_filter api changes 2018-10-24 13:28:22 -04:00
Tejan Karmali
fca93471b3 in accordance with conv_data api 2018-10-24 12:52:43 -04:00
Avik Pal
ec2c00783d
Add missing export for DepthwiseConv 2018-10-24 22:18:26 +05:30
Tejan Karmali
0dc4ec4d6b conv_data grad api change 2018-10-24 07:04:49 -04:00
Tejan Karmali
f540a0daf7 merge with upstream 2018-10-23 13:40:06 -04:00
Avik Pal
2559e7b4e6 Fix merge conflicts 2018-10-23 21:53:29 +05:30
Mike J Innes
bbccdb3eec
Merge pull request #279 from avik-pal/depthwiseconv
Adds support for Depthwise Convolutions
2018-10-23 17:22:15 +01:00
Mike J Innes
96dbae2d20 Omega and Turing fix 2018-10-23 11:30:37 +01:00
Tejan Karmali
e9bf86dbff Merge branch 'master' of https://github.com/FluxML/Flux.jl into conv_transpose 2018-10-19 02:08:25 -04:00
Sebastian Stabinger
94e5e9f993 Removes initn initialization
Replaced with glorot_uniform for Conv, following Keras
2018-10-17 17:11:16 +02:00
Avik Pal
3899907164
Update conv.jl 2018-10-11 21:39:35 +05:30
Dhairya Gandhi
1f0f2a5ac2 fixed DescentWeightDecay parameters 2018-10-11 10:21:29 +05:30
Dhairya Gandhi
d8394298bb fix merge conflicts 2018-10-11 10:15:59 +05:30
Dhairya Gandhi
fe8c147f72 fixed weight decay definition 2018-10-11 10:07:16 +05:30
Mike J Innes
ab0763fd41
Merge pull request #428 from tejank10/rnn-fixes
[WIP] Fixes for RNN tests
2018-10-10 16:58:44 +01:00
Tejan Karmali
8987e2c423 rm comments 2018-10-10 11:55:10 -04:00
Tejan Karmali
6b4bbd4fce reverted back the weights changes in rnndesc 2018-10-10 10:29:15 -04:00
Mike J Innes
9f6c3d5a2c fixes #403 2018-10-10 12:26:03 +01:00
Tejan Karmali
7b3e9c35ad changed index to view 2018-10-09 12:57:20 -04:00
JohnnyChen
de7623ac94 use variable assignment to do "copy" 2018-10-09 03:49:17 +08:00
JohnnyChen
eaacec852f Bug fix 2018-10-09 03:40:02 +08:00
JohnnyChen
27fec15fcc Add explicit copy(x::TrackedArray) method 2018-10-09 03:34:41 +08:00
Tejan Karmali
4d1a6c305b fixed params getting zero 2018-10-08 13:59:29 -04:00
JohnnyChen
36f5f274a5 Support copy(::TrackedArray)
1. fix issue https://github.com/FluxML/Flux.jl/issues/416
2. change test code so the tests pass: some previously broken tests now pass
2018-10-09 01:53:32 +08:00
Mike J Innes
73385b5dbd
Merge pull request #372 from johnnychen94/issue-#354
Type restriction for Dense layer
2018-10-05 15:03:03 +01:00
Mike Innes
c6740c5cdd fix unbroadcast 2018-10-05 14:14:43 +01:00
Mike J Innes
325d2ce212
Merge pull request #418 from c-p-murphy/add-fashion-mnist
Add FashionMNIST
2018-10-05 14:05:50 +01:00
Mike Innes
61fb6cdf05 jit macro 2018-10-05 14:02:00 +01:00
Mike Innes
69afdd61a6 avoid a warning 2018-10-05 13:59:58 +01:00
Mike Innes
bfe85e65f1 compose tweaks 2018-10-05 13:52:26 +01:00
Mike Innes
0f2019eba5 compose tweaks 2018-10-05 12:57:03 +01:00
Mike Innes
9bc9771a8d tweaks 2018-10-05 12:43:03 +01:00
Mike Innes
4abe518599 newline fixes 2018-10-05 12:37:47 +01:00
Tejan Karmali
2ff54ee0fd cudnn_available() update 2018-10-04 11:31:29 -04:00
Christopher Murphy
73a526b1de reuse utils from mnist.jl 2018-10-03 12:40:24 -04:00
Mike J Innes
683bbec71c
Merge pull request #413 from mcabbott/patch-2
evaluate both 2-ary DiffRules only when needed
2018-10-03 12:02:12 +01:00
Mike J Innes
fe6793fde5
closes #411 2018-10-03 11:45:29 +01:00
Robert Luciani
252e34e173 1.0+ updates - indices to axes, Vector init with undef 2018-10-02 21:39:00 +02:00
Christopher Murphy
95d72d7f79 update comments 2018-10-02 15:31:44 -04:00
Christopher Murphy
aff4c7898e add FashionMNIST 2018-10-01 15:26:26 -04:00
Dhairya Gandhi
b661db3797 added deprecations and compose 2018-10-01 05:30:53 +05:30
Michael Abbott
d25e05d9ee
evaluate both 2-ary DiffRules only when needed 2018-09-27 10:40:44 +02:00
Alex Bird
d131853587 add inv/ldivide/rdivide + test 2018-09-19 13:08:30 +01:00
Dhairya Gandhi
6665189ff1 added remaining optimizers and tests 2018-09-16 17:34:51 +05:30
Isaac Tay
e803117e25
updated loadparams! function 2018-09-15 16:45:04 +08:00
Avik Pal
eb9b408c0f
Merge branch 'master' into depthwiseconv 2018-09-15 10:21:31 +05:30
Mike J Innes
08fb9b7df1
Merge pull request #397 from FluxML/nest-bcast
Nested Derivatives of Broadcast
2018-09-14 20:23:28 +01:00
Mike Innes
d797999fc5 fix sentiment model 2018-09-14 18:10:24 +01:00
Dhairya Gandhi
63bc71698b updated tests 2018-09-14 20:32:56 +05:30
Dhairya Gandhi
4860c1d48b fixed white lines 2018-09-11 18:35:21 +05:30
Dhairya Gandhi
d933f2079b pulled tracker from upstream 2018-09-11 18:30:24 +05:30
Avik Pal
7d06f654f0 Fix tests 2018-09-11 16:58:05 +05:30
Avik Pal
7e7a501efd Fix tests 2018-09-11 16:32:14 +05:30
Avik Pal
c4f87ff15c Minor fixes: 2018-09-11 16:21:55 +05:30
Avik Pal
7e83852862 Fixes 2018-09-11 15:58:17 +05:30
Avik Pal
5fd8ffa47e CuRNN updates 2018-09-11 15:44:07 +05:30
Avik Pal
8bea60d980
Merge branch 'master' into cudnn_batchnorm 2018-09-11 15:34:25 +05:30
Tejan Karmali
e86365ed3f 1.0 fix for conv transpose 2018-09-08 15:44:06 -04:00
James Bradbury
e7783ace12 1.0 compat for normalise 2018-09-06 18:38:11 -07:00
Mike J Innes
6bbed07e96 enable nested broadcast 2018-09-07 02:05:03 +01:00
Dhairya Gandhi
0b440f16ff Merge branch 'master' of https://github.com/FluxML/Flux.jl 2018-09-06 22:48:03 +06:00
Johnny Chen
44049ce00c
Merge branch 'master' into issue-#354 2018-09-06 09:39:31 -05:00
Mike J Innes
5e4ee827e9
Merge pull request #371 from johnnychen94/issue-#323
Fix issue #323
2018-09-06 15:28:15 +01:00
Mike J Innes
ec16a2c77d todone: nicer syntax on 0.7 2018-09-05 15:55:08 +01:00
Mike J Innes
1e0fd07b09 use expand 2018-09-04 14:30:02 +01:00
Mike J Innes
e6be639436 Merge branch 'master' into HEAD 2018-09-04 14:03:46 +01:00
Mike J Innes
93c4a6b4b5 fixes #343 2018-09-04 13:37:54 +01:00
Mike J Innes
a2d2d068aa initial sketch 2018-08-28 17:55:59 +05:30
Mike Innes
53be49b102 fix #377 2018-08-28 11:02:38 +01:00
Mike J Innes
fac06751ea
Merge pull request #361 from dhairyagandhi96/with_stop
Add stop() to train loop when callback conditions are met
2018-08-28 10:56:15 +01:00
Mike Innes
2ca189bc96 newlines 2018-08-28 10:54:50 +01:00
Dhairya Gandhi
89bca2d98d remove merge conflicts 2018-08-28 15:14:12 +05:30
Dhairya Gandhi
a964debd8a fixed example in docs 2018-08-28 15:02:47 +05:30
Johnny Chen
0c4fb9655a Fix a bug 2018-08-25 15:12:01 +08:00
Johnny Chen
4ac76c35b0 fix MethodError for == and ≈
```julia
param([2]).^2 == [4.0]
ERROR: MethodError: ==(::TrackedArray{…,Array{Float64,1}}, ::Array{Float64,1}) is ambiguous. Candidates:
  ==(x::TrackedArray, y) in Main.Flux.Tracker at /Users/jc/.julia/dev/Flux/src/tracker/array.jl:63
  ==(A::AbstractArray, B::AbstractArray) in Base at abstractarray.jl:1686
Possible fix, define
  ==(::TrackedArray, ::AbstractArray)
```
2018-08-25 14:51:40 +08:00
Mike Innes
7d6ec2365f fixes #367 2018-08-24 14:30:39 +01:00
Mike Innes
86cf22675f rewrite broadcast 2018-08-24 14:07:08 +01:00
Mike Innes
e13d28a7a2 cruft 2018-08-24 13:44:21 +01:00
Dhairya Gandhi
c035fe22d7 added deprecation warning 2018-08-24 13:08:03 +05:30
Yueh-Hua Tu
634d34686e Add new constructors and test 2018-08-24 10:31:13 +08:00
Mike J Innes
953280d57f
Merge pull request #364 from boathit/master
fix argmax and add test
2018-08-23 15:52:06 +01:00
Mike Innes
dcde6d2217 tweaks 2018-08-23 15:44:28 +01:00
Johnny Chen
c9d6b5648f Fix issue #354 2018-08-23 21:56:32 +08:00
Johnny Chen
6743d52d08 Fix issue #354 2018-08-23 21:34:11 +08:00
Johnny Chen
7bfe431321 Fix issue #323 2018-08-23 20:58:58 +08:00
boathit
6c97846551 rename argmax as onecold 2018-08-23 20:47:43 +08:00
Mike J Innes
6c355e93d2
Merge pull request #363 from pshashk/patch-1
Fix repeat
2018-08-23 11:28:13 +01:00
Mike Innes
9d1d5187f3 fix activations for 1.0 2018-08-23 10:56:31 +01:00
boathit
33c901c191 redo 2018-08-23 16:01:42 +08:00
boathit
5dca80bd68 fix argmax and batch deprecations 2018-08-23 13:17:58 +08:00
Dhairya Gandhi
2f1a9847fa deprecate :stop from optimizers; housekeeping 2018-08-22 21:25:26 +05:30
Dhairya Gandhi
a7ad620f01 exporting stop 2018-08-22 00:33:30 +05:30
Dhairya Gandhi
3d11322d37 fixed docstring and not exporting stop 2018-08-22 00:29:07 +05:30
Dhairya Gandhi
ed044e2df7 changes as requested 2018-08-21 23:22:20 +05:30
boathit
616ed194df fix argmax and add test 2018-08-21 11:29:57 +08:00
Mike Innes
216d278e7b fix mnist loader 2018-08-20 16:57:43 +01:00
Mike Innes
3cfecaa4db test cleanup 2018-08-20 15:38:25 +01:00
Mike Innes
e68b8765b6 broadcast fixes 2018-08-20 14:41:46 +01:00
pshashk
1115eda6af
repeat fix
ERROR: UndefVarError: A not defined
2018-08-20 16:11:56 +03:00
Dhairya Gandhi
1af7a53e1f housekeeping: removed commented code 2018-08-20 18:10:20 +05:30
Mike Innes
5a023a9ccc WIP 1.0 support
closes #353
2018-08-20 13:08:04 +01:00
Dhairya Gandhi
756207e782 added docs 2018-08-20 14:20:33 +05:30
Dhairya Gandhi
51578177a5 removed arguments from StopException 2018-08-20 14:08:23 +05:30
Dhairya Gandhi
df22bc5c8f removed argument from stop function 2018-08-20 14:02:09 +05:30
Dhairya Gandhi
06db6ed314 housekeeping: fixing typo 2018-08-20 13:48:28 +05:30
Dhairya Gandhi
394b4167ce moving stop to Optimise 2018-08-20 13:43:08 +05:30
Dhairya Gandhi
06aad375fc properly importing functions 2018-08-20 13:35:55 +05:30
Dhairya Gandhi
e239eb1105 properly importing functions 2018-08-20 13:30:05 +05:30
Dhairya Gandhi
1228e9c5e2 removed include statement 2018-08-19 22:55:14 +05:30
Dhairya Gandhi
9c98272cf0 catching exception 2018-08-19 17:38:00 +05:30
Dhairya Gandhi
257e2a7d2e checking exception 2018-08-19 17:11:11 +05:30
Dhairya Gandhi
5c42c8689c printing exception 2018-08-19 17:04:31 +05:30
Dhairya Gandhi
b0f83f93ff exported StopException 2018-08-19 16:41:13 +05:30
Dhairya Gandhi
a53a5c8350 exporting stop 2018-08-19 15:31:33 +05:30
Dhairya Gandhi
fbd82a6925 added end 2018-08-19 15:19:45 +05:30
Dhairya Gandhi
8229c8e045 modified training loop 2018-08-19 15:17:07 +05:30
Dhairya Gandhi
2aa057ec08 fixed throwing exception 2018-08-19 14:54:54 +05:30
Dominique Luna
f2021d41ac initn -> init 2018-08-18 14:18:50 -04:00
Dominique Luna
3f42301e07 recurrent bug fixes 2018-08-18 11:50:52 -04:00
Dhairya Gandhi
887bfad312 returning :stop 2018-08-18 08:28:47 +05:30
Dhairya Gandhi
65a5ecccd2 returning 2018-08-18 08:24:49 +05:30
Dhairya Gandhi
999b00b64d fixed typo 2018-08-17 19:45:10 +05:30
Dhairya Gandhi
0524964400 fixed typo 2018-08-17 19:40:48 +05:30
Dhairya Gandhi
8ad72e51ea added function to stop training 2018-08-17 19:33:51 +05:30
Dhairya Gandhi
24a3bce495 added stop to break training loop 2018-08-17 17:46:13 +05:30
femtocleaner[bot]
2d80f68087 Fix deprecations 2018-08-14 16:46:23 +00:00
Simon
a43127f881
fix copy_transpose! 2018-08-15 12:16:12 +02:00
ayush1999
4683e925d4 Final changes 2018-08-12 11:38:48 +01:00
Josh Christie
59bdff2cae Test 0.7 and 1.0 2018-08-11 14:58:29 +01:00
Josh Christie
c8307a0627 Use @info for logging 2018-08-11 14:42:33 +01:00
Josh Christie
710a65fe72 Fix back scalar with a Ref and fix diagonal test 2018-08-11 14:36:33 +01:00
Avik Pal
5db7a3a3ad Fix Optimizers 2018-08-11 18:23:47 +05:30
Avik Pal
355091b9d1 Merge removing conflicts 2018-08-11 18:01:27 +05:30
Josh Christie
837e03613f Updates for julia 1.0 2018-08-11 13:23:02 +01:00
Avik Pal
d3c78a80be Fix layers errors 2018-08-11 17:20:27 +05:30
Avik Pal
4bd13c448f Add updates for julia0.7 2018-08-11 15:23:40 +05:30
Josh Christie
5186e3ba18 Updates for julia 1.0 2018-08-11 10:51:07 +01:00
Avik Pal
3b448ce1ac
Merge branch 'master' into cudnn_batchnorm 2018-08-11 15:02:55 +05:30
Avik Pal
3affed8ef0 Remove track_kw 2018-08-10 03:21:05 +05:30
Mike J Innes
62d594af43 out of place gradients for collect 2018-08-07 22:09:20 +01:00
Avik Pal
a0ec472a4b
Merge branch 'master' into depthwiseconv 2018-08-08 01:20:37 +05:30
Mike J Innes
7103a0ed7d tweaks 2018-08-03 15:19:10 +01:00
pevnak
926411a449 removed most errors; the only one in the Fallbacks test persists 2018-08-03 15:14:25 +01:00
pevnak
c657d4e47f fixed the sum as suggested by mike 2018-08-03 15:14:25 +01:00
Simon Mandlik
02f343d44d fixed more dep warns, also in tests, but maximum, minimum and size in array.jl still need to be updated. As a result, some more tests may not pass for the time being 2018-08-03 15:14:25 +01:00
Simon Mandlik
0471c489e6 depwarns 2018-08-03 15:14:25 +01:00
pevnak
3510c837a8 zeros replaced by zero 2018-08-03 15:14:25 +01:00
pevnak
ea38c7dbea some more changes 2018-08-03 15:14:25 +01:00
pevnak
d6f5baee39 applied fixes proposed by Carlo 2018-08-03 15:14:25 +01:00
pevnak
8ab209126d removed zeros fix 2018-08-03 15:14:25 +01:00
pevnak
e98538673a updated sum to be compliant with latest beta. Removed some depwarns 2018-08-03 15:14:25 +01:00
Mike J Innes
e5b3d27016 track_kw should be unnecessary 2018-08-03 15:14:10 +01:00
Avik Pal
4d17a1a809
Merge branch 'master' into depthwiseconv 2018-08-03 19:41:50 +05:30
Avik Pal
6a41f823c8 Update track function 2018-08-03 19:06:05 +05:30
Avik Pal
b4ba7df03a Merge branch 'master' of https://github.com/FluxML/Flux.jl into cudnn_batchnorm 2018-08-03 18:55:46 +05:30
Mike Innes
f5c9361617 matmul fix 2018-08-03 13:02:47 +01:00
Mike Innes
4cf6bac0c1 fix hook 2018-08-03 13:02:47 +01:00
Mike J Innes
70718e7a64 update treelike 2018-08-03 13:02:47 +01:00
Mike J Innes
d782b33701 syntax 2018-08-03 13:02:47 +01:00
Mike J Innes
85fd77d70a linalg deprecations 2018-08-03 13:02:47 +01:00
Mike J Innes
89872c5a8b val deprecations 2018-08-03 13:02:47 +01:00
Mike J Innes
474f578517 ObjectIdDict -> IdDict 2018-08-03 13:02:47 +01:00
Mike J Innes
aa209ee137 no longer needed 2018-08-03 13:02:47 +01:00
Mike J Innes
00cfe24d66 fix cat 2018-08-03 13:02:47 +01:00
Mike J Innes
adc216f182 fix broadcasting 2018-08-03 12:56:32 +01:00
Mike J Innes
e486c50610 fix data 2018-08-03 12:56:31 +01:00
Mike J Innes
fb8a220659 fix matmul 2018-08-03 12:56:31 +01:00
Mike J Innes
7057ca739e fix std usage 2018-08-03 12:56:27 +01:00
Mike J Innes
88a265154c deprecations 2018-08-03 12:54:31 +01:00
Mike J Innes
b18b51656c requires update 2018-08-03 12:54:24 +01:00
Mike J Innes
a49e2eae41 deprecated Void 2018-08-03 12:53:52 +01:00
Mike J Innes
1fd49c2a90 fix array show 2018-08-03 12:53:52 +01:00
Yueh-Hua Tu
5b37319289 Add Maxpool and Meanpool 2018-08-01 00:10:53 +08:00
Mike J Innes
a8ccc79f61 perf hacks 2018-07-30 20:08:44 +01:00
Avik Pal
2cc0f112f1 Updates 2018-07-27 20:12:49 +05:30
Avik Pal
7dd5ec16c9 Fix 2018-07-17 11:22:12 +05:30
Avik Pal
531ecccd38 Error statement 2018-07-17 10:14:23 +05:30
Avik Pal
4035641f00 Remove imports 2018-07-17 10:06:26 +05:30
Avik Pal
0bb3eaa1f6 Update CUDNN Batchnorm with new Flux AD 2018-07-17 09:40:20 +05:30
Avik Pal
646db81f94 Pull BatchNorm CPU updates 2018-07-17 09:24:38 +05:30
CarloLucibello
071dcdda87 update docs 2018-07-16 07:32:13 +02:00
CarloLucibello
185e9148b6 fix cpu batchnorm 2018-07-16 07:11:33 +02:00
Avik Pal
2664a16556 Update as per new AD 2018-07-13 14:12:46 +05:30
Avik Pal
0aabf9d86b
Merge branch 'master' into depthwiseconv 2018-07-13 14:04:19 +05:30
Mike J Innes
a0fd91b866
Merge pull request #307 from jarvist/master
Add ADAMW "Fixing Weight Decay Regularization in Adam"
2018-07-11 19:12:58 +01:00
Mike J Innes
dda51a0140 update docs 2018-07-11 15:31:22 +01:00
Mike Innes
10a169bb77 update cudnn rnn 2018-07-10 18:16:37 +01:00
Mike J Innes
70b5efeb4e basic nested AD 2018-07-10 09:03:09 +01:00
Mike J Innes
80af9a3830 broadcast efficiency 2018-07-09 23:40:07 +01:00
Mike J Innes
e763c342ee shave some memory 2018-07-09 19:44:14 +01:00
Mike J Innes
1430053b69 checkpoints 2018-07-09 17:52:34 +01:00
Mike J Innes
7778d17884 functional API 2018-07-09 16:57:44 +01:00
Mike J Innes
5e319c7395 fix gradient definitions 2018-07-09 13:39:10 +01:00
Mike J Innes
41b9412439 new grad api 2018-07-09 13:36:46 +01:00
Jarvist Moore Frost
344a750770 Merge branch 'master' of github.com:jarvist/Flux.jl into HEAD 2018-07-03 11:15:43 +01:00
Jarvist Moore Frost
aee4a83c55 Add ADAMW weight-decay.
See http://www.fast.ai/2018/07/02/adam-weight-decay/ and the original
paper https://arxiv.org/abs/1711.05101.pdf for context.

I don't know what I'm doing, and this is quite possibly wrong - but on
a simple Char-RNN I have lying around on my hard disk, this seems to
improve the rate of learning consistently for different hyperparameters
vs. standard ADAM with the same decay constant.
2018-07-03 11:11:32 +01:00
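The idea behind the ADAMW commit above (from the linked fast.ai post and paper) is that weight decay is applied directly to the weights, decoupled from the gradient update, rather than folded into the gradient as an L2 penalty. A minimal sketch in Python for illustration — plain SGD with decoupled decay, not Flux's actual optimiser code; the function name and constants are made up:

```python
def sgdw_step(w, grad, lr=0.1, decay=0.01):
    """One step of SGD with decoupled weight decay (the 'W' idea).

    The decay shrinks the weight itself, separately from the
    gradient-based step, instead of being added to the gradient
    as an L2 penalty would be.
    """
    w = w - lr * grad       # ordinary gradient step
    w = w - lr * decay * w  # decoupled decay step
    return w

w = sgdw_step(1.0, 0.5)  # -> 0.94905
```

The same decoupling applied on top of Adam's adaptive step is what distinguishes ADAMW from Adam-with-L2.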
Mike J Innes
ce88273880 gradient hook 2018-07-02 13:19:13 +01:00
Mike Innes
5d8b63dc65 avoid implementation details in docs 2018-06-29 13:53:50 +01:00
Avik Pal
e3b10691d2 make cache optional param 2018-06-28 15:27:59 +05:30
Avik Pal
bcf094451c Fix typo 2018-06-28 14:45:35 +05:30
Avik Pal
d0b79e71e2 fix load error 2018-06-28 14:27:50 +05:30
Avik Pal
7ac9e191cb Revert 1 change 2018-06-28 14:25:22 +05:30
Avik Pal
5ccde88ce6 Minor fix for 5D support 2018-06-28 14:21:17 +05:30
Avik Pal
681d8c4dfc Remove cache 2018-06-28 12:11:32 +05:30
Avik Pal
8f43258ab7 Get the batchnorm working without cache 2018-06-28 12:04:25 +05:30
Avik Pal
4916c8e6da Add treelike for now 2018-06-27 14:54:49 +05:30
Matthew Kelley
864d72eef5 Overload Base.eps() for TrackedReal 2018-06-26 23:55:43 -06:00
Matthew Kelley
0e95be3326 Call Flux.Tracker.data() on ŷ for bce 2018-06-26 14:48:51 -06:00
Matthew Kelley
ed032cdb1e Change epsilon value to eps(ŷ) 2018-06-26 12:29:06 -06:00
Matthew Kelley
e08fd7a6d2 Added epsilon term to binarycrossentropy 2018-06-26 11:43:16 -06:00
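The epsilon commits above guard binary cross-entropy against `log(0)`. A sketch in Python of the idea (a toy scalar version with a fixed illustrative epsilon — the later commit uses `eps(ŷ)` rather than a constant, and Flux's real loss works on tracked arrays):

```python
import math

def binarycrossentropy(yhat, y, eps=1e-7):
    """Binary cross-entropy with yhat clamped away from 0 and 1,
    so the log terms stay finite even for saturated predictions."""
    yhat = min(max(yhat, eps), 1 - eps)
    return -(y * math.log(yhat) + (1 - y) * math.log(1 - yhat))

binarycrossentropy(0.0, 0.0)  # finite, instead of NaN from log(0)
```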
Mike J Innes
88c16e62dd fixes #284 2018-06-26 15:09:26 +01:00
Mike J Innes
836e3872b6 style 2018-06-26 15:09:21 +01:00
Mike J Innes
2723c9ee04
Merge pull request #257 from staticfloat/sf/back_inf_nan
Check for `Inf` and `NaN` within `back!(::TrackedReal)`
2018-06-26 14:42:33 +01:00
Mike J Innes
0a04e3ba61 Chain activations 2018-06-26 14:30:46 +01:00
Mike J Innes
7726a5b605 inferrable 2018-06-26 14:12:57 +01:00
Mike J Innes
3b575930ca Merge branch 'master' into scalar_pad_stride 2018-06-26 14:05:07 +01:00
Mike Innes
7e3cf45ee4 better error 2018-06-25 11:36:52 +01:00
Avik Pal
24ba1c4e6c Make changes as per the review 2018-06-23 11:02:41 +05:30
Mike J Innes
aea1e73cde scalar gradients 2018-06-21 13:12:42 +01:00
Avik Pal
91850a8baf Add missing path to curnn.jl 2018-06-20 18:46:42 +05:30
Avik Pal
deb4950261 Make cuDNN take only 4D arrays 2018-06-20 15:54:38 +05:30
Avik Pal
3339ad5181 Integrate cudnn BatchNorm with Flux 2018-06-20 15:50:30 +05:30
Avik Pal
714ca23aba Change default value of epsilon to prevent CuDNN BatchNorm warnings 2018-06-20 12:11:22 +05:30
Avik Pal
185f34d9fe Add working backward pass 2018-06-20 12:09:54 +05:30
Avik Pal
bc47d02b3f Remove unnecessary imports 2018-06-17 12:40:01 +05:30
Avik Pal
af5ab7f9ef Fix Tensor Descriptor Bug 2018-06-17 12:28:02 +05:30
Avik Pal
c6dcf079ce Update file structure and make function calls correct 2018-06-17 11:47:49 +05:30
Avik Pal
24d13ac326 Fix missing parenthesis 2018-06-12 21:32:56 +05:30
Avik Pal
f12e367cab Adding untested backward pass code 2018-06-12 18:26:09 +05:30
Avik Pal
a83e5d696d Typo 2018-06-12 17:51:52 +05:30
Avik Pal
d4b066fdf9 Forward Pass for BatchNorm Added 2018-06-12 17:49:21 +05:30
Avik Pal
65f2c33991
Merge pull request #2 from FluxML/master
rebase
2018-06-11 15:40:57 +05:30
Avik Pal
b59da95786 Merge branch 'depthwiseconv' of https://github.com/avik-pal/Flux.jl into depthwiseconv 2018-06-09 13:11:42 +05:30
Avik Pal
5d7ee884b8 Fix error during backpropagation 2018-06-09 13:04:49 +05:30
Avik Pal
7f3d11cae0
Merge branch 'master' into depthwiseconv 2018-06-09 11:06:07 +05:30
Avik Pal
1d93fb8e59 Add new constructor and fix a typo in display 2018-06-09 11:02:15 +05:30
Tejan Karmali
d20771d6be
Default value of dilation
dilation should be 1 by default
2018-06-09 02:29:46 +05:30
Tejan Karmali
4a24b69976
Merge branch 'master' into nadam-opt 2018-06-08 16:54:41 +05:30
Mike J Innes
4915b0c8dd
Merge pull request #268 from staticfloat/patch-2
Add `dilation` kwarg to `Conv`
2018-06-07 13:49:02 +01:00
Mike J Innes
af8f3348eb
Merge pull request #270 from staticfloat/sf/tracked_repeat
Add `TrackedArray` support for `repeat(x; inner, outer)`
2018-06-06 17:34:58 +01:00
Mike Innes
2370bdbe91 see #205 2018-06-06 17:01:28 +01:00
Avik Pal
33a7f545b7
Merge branch 'master' into depthwiseconv 2018-05-30 15:58:35 +05:30
Avik Pal
cd6a0856d5 Adds support for Depthwise Convolutions 2018-05-30 15:53:57 +05:30
staticfloat@gmail.com
f390a39d77 Add TrackedArray support for repeat(x; inner, outer) 2018-05-22 17:41:05 -07:00
Elliot Saba
e6efca4bf4 Add dilation kwarg to Conv
Now that we have dilated convolution support in `NNlib`, this enables support in Flux's `Conv` layer.
2018-05-21 13:44:13 -07:00
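What the `dilation` kwarg changes is how many input positions the kernel spans. A dilated kernel of size k covers `dilation*(k-1)+1` positions, which feeds into the standard output-size formula — sketched below in Python as generic arithmetic, not Flux/NNlib source:

```python
def conv_output_size(n, k, stride=1, pad=0, dilation=1):
    """Spatial output size of a convolution along one dimension.

    A dilated kernel of size k covers dilation*(k-1)+1 input
    positions (its 'effective' size)."""
    effective_k = dilation * (k - 1) + 1
    return (n + 2 * pad - effective_k) // stride + 1

conv_output_size(28, 3)              # -> 26, ordinary 3x3 kernel
conv_output_size(28, 3, dilation=2)  # -> 24, kernel now spans 5 pixels
```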
James Bradbury
af12f006f2
Use broadcast for dropout
Should be fast enough on GPU now that it's not going to be an optimization target again for a while. Hopefully it isn't meaningfully slower on CPU?
2018-05-20 04:04:33 -07:00
staticfloat@gmail.com
9fdbe843ef Check for Inf and NaN within back!(::TrackedReal)
This is often checked for within user code; there's no reason to make users do that, so let's
do it for them within `back!(::TrackedReal)`
2018-05-07 15:30:44 -07:00
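The commit above moves the non-finite check into the backward pass itself. A Python sketch of the guard it describes — a hypothetical helper, not Flux's actual `back!` code:

```python
import math

def back_check(grad):
    """Raise as soon as a gradient is non-finite, so the error
    surfaces at the backward step rather than later in user code."""
    if not math.isfinite(grad):
        raise ValueError(f"non-finite gradient: {grad}")
    return grad

back_check(1.5)  # passes through unchanged
```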
Mike J Innes
24ad384a38
Merge pull request #243 from gustafsson/catdim
Support for hcat and cat
2018-05-07 13:04:31 +01:00
Mike Innes
ef9077d9fa style 2018-05-07 13:03:52 +01:00
Mike Innes
b59161a41e export Tracker again 2018-05-05 17:15:18 +01:00
Johan Gustafsson
5fc6190956 RowVector tests 2018-05-02 16:10:39 +02:00
Johan Gustafsson
94bb064a0f more tests of array promotion for concatenation
# Conflicts:
#	test/tracker.jl
2018-05-02 16:00:29 +02:00
Johan Gustafsson
1c189c62ed cat with multiple dims #156
Co-authored-by: americast <sayan.sinha@iitkgp.ac.in>
2018-05-02 15:59:46 +02:00
Johan Gustafsson
fb68529169 define back function right after forward function 2018-05-02 15:59:46 +02:00
Johan Gustafsson
509a2e59f6 cat promotions and mixed ranks 2018-05-02 15:59:46 +02:00
Johan Gustafsson
eaaf5fd34c vcat arrays with ndims>2 2018-05-02 15:59:46 +02:00
Johan Gustafsson
bcef5c4ab5 Support hcat and cat 2018-05-02 15:59:46 +02:00
Mike J Innes
7d7d89569c rm this deprecation for 0.6 2018-05-01 12:20:36 +01:00
Mike J Innes
9a7e6e9c5c hold off on some things 2018-05-01 12:18:56 +01:00
CarloLucibello
e186b958dd more exports 2018-05-01 12:13:14 +01:00
Mike J Innes
ee89a7797e
Merge pull request #245 from freeboson/adamax
Add AdaMax optimizer
2018-05-01 11:28:07 +01:00
Mike J Innes
5efbaddb97
Merge pull request #249 from ninjin/nin/minimum
[RFC] Backpropagation for `maximum` and `minimum`
2018-04-30 18:40:42 +01:00
Mike J Innes
73a51400b6 better error message 2018-04-30 12:09:15 +01:00
Pontus Stenetorp
cfd29b9c76 Backpropagation for maximum and minimum 2018-04-29 13:52:54 +01:00
Sujeet Akula
8c042bd522
element wise max() 2018-04-26 21:12:31 +10:00
Sujeet Akula
5e5f255f81
export typo 2018-04-26 17:42:04 +10:00
Sujeet Akula
4586bda5ab
export/test adamax 2018-04-26 17:40:11 +10:00
Sujeet Akula
b6508e2416
add adamax 2018-04-26 17:37:24 +10:00
Mike J Innes
baff20514d gpu broadcast fix 2018-04-17 18:05:58 +01:00
Mike J Innes
8f73dc6e14 fix gpu cross entropy 2018-04-17 17:56:47 +01:00
tejank10
2ef25775c6 removed extra expand and fixed bug 2018-04-16 01:18:26 +05:30
Mike Innes
d12fb98f2a nicer batchnorm shape error 2018-04-15 20:29:25 +01:00
tejank10
2f5473d435 added expand in conv constructor 2018-04-16 00:59:11 +05:30
Mike J Innes
8f29968c32
Merge pull request #207 from safnuk/pull-request/07b0f95d
BatchNorm for convolutions
2018-04-15 20:10:33 +01:00
Mike J Innes
683a73fed3 download info 2018-04-15 20:09:30 +01:00
Mike J Innes
5fd240f525 interface tweaks 2018-04-15 20:04:42 +01:00
Mike J Innes
73a0be3e04 Merge branch 'master' into pull-request/07b0f95d 2018-04-15 17:10:29 +01:00
Mike J Innes
642543808e
Merge pull request #226 from CarloLucibello/reshape
fix reshape
2018-04-15 16:53:21 +01:00
tejank10
b080f5c82e Scalar pad and stride 2018-04-15 20:32:40 +05:30
Mike J Innes
cb3ae8df6a rename normalise.jl 2018-04-15 15:45:46 +01:00
Mike J Innes
b05e755068 rm jit from cuda 2018-04-15 15:08:58 +01:00
tejank10
5cc681317a added stride for pooling in tracker 2018-04-15 15:07:04 +01:00
tejank10
f6097d58d6 Scalar pad/stride for Conv constructor 2018-04-15 12:15:41 +05:30
Mike Innes
9d7164f15f we'll do this differently 2018-04-14 02:09:35 +01:00
tejank10
65847bb745 moved epsilon into sqrt 2018-04-04 15:25:20 +05:30
tejank10
3ead662987 Update rule fixed 2018-04-04 15:18:44 +05:30
CarloLucibello
b415333233 fix reshape 2018-04-02 16:09:57 -04:00
tejank10
ea9b5471fa NADAM optimizer 2018-04-03 01:27:22 +05:30
Brad Safnuk
b9a66c679d Fix error in initialization of σ. 2018-03-22 22:20:21 -04:00
Brad Safnuk
35299d4621 Fix type instability when loading onto a gpu.
Also fixes Issue #216.
2018-03-22 21:32:32 -04:00
Mike J Innes
4320738d87 fix 2018-03-21 11:25:47 +00:00
Mike Innes
1c5f8e3534 ndims for shapes 2018-03-16 14:42:08 +00:00
Brad Safnuk
db2d9efb72 Update BatchNorm documentation 2018-03-15 21:59:38 -04:00
Brad Safnuk
6653ec86d9 Allow multidimensional inputs to batchnorm.
Can be used in conjunction with convolutional layers, in addition
to dense layers, with the same api.
2018-03-15 21:48:59 -04:00
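The batchnorm commit above generalises the layer to convolutional inputs by computing statistics per channel. A toy Python sketch of that per-channel normalisation, using nested lists as a stand-in for N-dimensional arrays; the real layer also tracks running statistics and learns scale/shift parameters:

```python
def batchnorm(x, eps=1e-5):
    """Normalise each channel (row) of x to zero mean, unit variance.

    x is a list of channels, each a list of values."""
    out = []
    for channel in x:
        n = len(channel)
        mean = sum(channel) / n
        var = sum((v - mean) ** 2 for v in channel) / n
        out.append([(v - mean) / (var + eps) ** 0.5 for v in channel])
    return out

batchnorm([[1.0, 2.0, 3.0]])  # the channel now has mean ~0
```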
Mike J Innes
e931552f7d
Merge pull request #200 from chengchingwen/repmat
implement `back` of `repmat`
2018-03-15 15:18:48 +00:00
Mike J Innes
5d7edb5aaa
Merge pull request #197 from chengchingwen/master
Implement `prod` for `TrackedArray`
2018-03-15 15:17:24 +00:00
boathit
2ec37790be eliminate ambiguity 2018-03-13 10:50:56 +08:00
boathit
ff2caf032c eliminate ambiguity 2018-03-12 22:48:16 +08:00
Mike J Innes
9ccbac8b80 jit gpu support 2018-03-07 19:18:27 +00:00
chengchingwen
43af3895b0 change prod implementation to avoid small xs 2018-03-07 21:03:13 +08:00
chengchingwen
c00f7f850f implement back of repmat 2018-03-07 20:43:59 +08:00
chengchingwen
7c721475c6 add gradient check for prod and fix dims in back(::typeof(prod),...) 2018-03-07 16:24:44 +08:00
Mike Innes
d21c313ea7 tweaks 2018-03-06 19:58:47 +00:00
Mike Innes
36baa7ec2c convnet primitives 2018-03-06 19:58:05 +00:00
Mike Innes
0802b4d5cf closes #198 2018-03-06 16:56:01 +00:00
Elliot Saba
6445295318 Better download detection 2018-03-06 08:45:45 -08:00
Elliot Saba
19f691d342 Use cache.julialang.org to store ML models
It's annoying that when third party servers go down our tests break.
Let's at least make sure that if our tests break due to server outages
it's our fault.
2018-03-06 08:03:21 -08:00
Mike Innes
3babeeb440 scalar hashing 2018-03-06 13:49:05 +00:00
chengchingwen
86d782a5ce implement prod for TrackedArray 2018-03-06 18:01:19 +08:00
Mike Innes
c95a97f6ae make epochs available 2018-03-06 03:01:56 +00:00
Mike Innes
432b9c3222 loadparams! 2018-03-06 02:45:31 +00:00
Mike Innes
65ed95190a fix 2018-03-05 23:44:25 +00:00
Mike Innes
bfd6a4c0ec cleaner interrupts 2018-03-05 23:05:45 +00:00
Mike Innes
5153cde847 move epochs 2018-03-05 22:56:22 +00:00
Mike J Innes
662439c164 closes #177 2018-03-05 17:24:46 +00:00
Elliot Saba
36295799ee Add permutedims() for tracked arrays 2018-03-02 10:22:28 -08:00
Mike J Innes
8019f789f8 use normal log 2018-03-01 16:35:49 +00:00
Mike J Innes
ac57fc3c26 use @ fix in a few places 2018-03-01 16:31:20 +00:00
Mike J Innes
c2fea2acf6 revert this 2018-02-28 23:06:53 +00:00
Mike J Innes
2eb38eedbf update gpu api 2018-02-28 22:51:08 +00:00
Mike J Innes
ccef9f4dd4 jit softmax 2018-02-28 22:07:35 +00:00
Mike J Innes
7606b1a399 single-batch convolution 2018-02-28 14:25:32 +00:00
Mike J Innes
6bdc2b37a9 inline call 2018-02-28 13:47:14 +00:00
Mike J Innes
a401f08cda compile layers 2018-02-27 22:40:51 +00:00
Mike J Innes
5a32976cbf basic compile step 2018-02-27 21:43:41 +00:00
Mike J Innes
bdb8aae107 move cache logic 2018-02-27 21:41:03 +00:00
Mike J Innes
2c74976602 more general 2018-02-27 01:25:40 +00:00
Mike J Innes
466b5c501a cpu/gpu conveniences 2018-02-26 23:10:59 +00:00
Mike J Innes
15d1d3256b conv api updates 2018-02-26 22:43:07 +00:00
Mike J Innes
54919b8dca rm deprecation 2018-02-22 00:23:02 +00:00
Mike J Innes
491785a681 ignore state in mapleaves 2018-02-22 00:22:51 +00:00
Mike J Innes
ec65e2cec7 fix printing 2018-02-22 00:21:48 +00:00
Mike J Innes
af2e6b7e1d fix 2018-02-22 00:15:38 +00:00
Mike J Innes
99b739cf00 fixes #176 2018-02-21 23:21:20 +00:00
Mike J Innes
e3b4b16e01
Merge pull request #178 from schmrlng/pull-request/e6f55641
Convert OneHot CuArrays to dense CuArrays before passing to CUDNN methods
2018-02-21 22:34:11 +00:00
Mike J Innes
6bdd283fbd no longer necessary 2018-02-21 22:29:31 +00:00
Iblis Lin
043fedde3c
introduce Reexport
- Reexporting NNlib

fix #180
2018-02-21 16:55:20 +08:00
Ed Schmerling
e6f556411a Convert OneHot CuArrays to dense CuArrays before passing to CUDNN methods 2018-02-19 17:32:15 -08:00
Mike J Innes
4035745f6e may help numerical tests 2018-02-19 12:51:02 +00:00
Mike J Innes
989adcdc7d gpu fix 2018-02-17 12:41:53 +00:00
Mike J Innes
11511982a4 numerical stability 2018-02-17 11:56:03 +00:00
Mike J Innes
e5791bc5f6 frequencies utility 2018-02-17 11:19:51 +00:00
Mike J Innes
e3b31b9b87
Merge pull request #169 from jessebett/jessechanges
Reshape with Tuple Dimensions and Kronecker Product
2018-02-16 14:16:42 +00:00
Mike J Innes
60f21d3ff2 don't override base method 2018-02-16 14:15:40 +00:00
Mike J Innes
5e861101f3 epochs util 2018-02-16 11:17:57 +00:00
Mike J Innes
7aa6854c64 more correct 2018-02-16 00:06:15 +00:00
Mike J Innes
ee3784964e fix for external modules 2018-02-15 22:27:00 +00:00
Mike J Innes
63862c2324 easier initialisation with weights 2018-02-15 20:52:29 +00:00
Mike J Innes
01c31e7fcc conv bias 2018-02-15 20:15:41 +00:00
Mike J Innes
bdd07a8bc6 fix 2018-02-14 22:34:11 +00:00
Mike J Innes
1b8b1cd7b1 check params by identity 2018-02-14 21:00:50 +00:00
Mike J Innes
5ea0ef6764 tracker fix 2018-02-13 16:15:36 +00:00
Mike J Innes
1baa7227e3 reorganise batches 2018-02-13 16:05:07 +00:00
Mike J Innes
34217b1fa2 Merge branch 'treebank' 2018-02-13 15:44:27 +00:00
Mike J Innes
49584fb72b rm logsigmoid 2018-02-13 14:52:29 +00:00
Mike J Innes
2f29733888 Merge branch 'master' into HEAD 2018-02-13 14:45:37 +00:00
Mike J Innes
8432d8db06 batchnorm fix 2018-02-13 14:02:35 +00:00
Mike J Innes
820cd3ae42 fixes #164 2018-02-13 13:31:35 +00:00
Mike J Innes
066cb45a38 remove old accuracy fn 2018-02-13 11:12:21 +00:00
Mike J Innes
236edbffec fixes #111 2018-02-13 10:20:38 +00:00
Mike J Innes
f22cfb5b43 re-enable printf 2018-02-12 15:05:09 +00:00
Mike J Innes
334ae9e1cb fixes #171 2018-02-12 12:31:15 +00:00
Mike J Innes
0b3c02fe8d document regularisation, fixes #160 2018-02-09 19:00:26 +00:00
Mike J Innes
0e0057b0c4 basics 2018-02-09 13:51:07 +00:00
jessebett
f84ee8eab0 reshape with tupled dimensions and kronecker product 2018-02-08 14:27:57 -05:00
Mike J Innes
70fbbf48fa humble beginnings of compiler 2018-02-08 18:11:26 +00:00
Mike J Innes
fc157a8c59 TrackedNumber -> TrackedReal 2018-02-08 17:18:40 +00:00
Mike J Innes
d1c56ca768 number fix 2018-02-08 17:04:48 +00:00
Mike J Innes
0f7a1ec022 test params funct 2018-02-08 16:13:20 +00:00
Mike J Innes
961de2ba44
Merge pull request #161 from FluxML/curnn
WIP: CUDNN RNNs
2018-02-08 13:06:52 +00:00
Iblis Lin
f7fdfbe3a9 fix params 2018-02-08 12:56:10 +00:00
Mike J Innes
fcbdc49d6b fix reserve usage 2018-02-08 10:27:26 +00:00
Mike J Innes
bc452fcd81 rewrite tests 2018-02-08 02:37:55 +00:00
Mike J Innes
d592f4e327 batch support 2018-02-08 01:45:48 +00:00
Mike J Innes
b8f148b012 hook up backward passes 2018-02-08 00:49:39 +00:00
Mike J Innes
a1d1930097 Merge branch 'master' into curnn 2018-02-07 23:23:02 +00:00
Mike J Innes
4511936a87 fixes #116 2018-02-07 23:21:04 +00:00
Mike J Innes
0ac924e8e1 fixups 2018-02-07 22:52:46 +00:00
Mike J Innes
39f7f8fdf3 tracked tuples 2018-02-07 22:21:42 +00:00
Mike J Innes
79e4e25fea seperate number type 2018-02-07 20:39:36 +00:00
Mike J Innes
282889970d seperate tracking infrastructure from array wrapper 2018-02-07 17:43:25 +00:00
Mike J Innes
30b3437c56 backward passes 2018-02-06 18:56:17 +00:00
Mike J Innes
f866fbe575 nullable c refactor 2018-02-06 15:01:48 +00:00
Mike J Innes
07e1b1e0a9 avoid val 2018-02-06 12:44:18 +00:00
boathit
7e37a96c6f Register back! for logsigmoid and implement (logit)binarycrossentropy 2018-02-06 19:36:16 +08:00
boathit
6e65789828 Register back! for logsigmoid and implement (logit)binarycrossentropy 2018-02-06 19:32:46 +08:00
Mike J Innes
a4bf5936b0 diagm 2018-02-05 18:29:35 +00:00
Mike J Innes
2fec75005d
Merge pull request #123 from GenaBitu/cat-fix
Added vcat for multiple TrackedVectors
2018-02-05 18:10:48 +00:00
Mike J Innes
47cebab26e test multiple inputs/dims 2018-02-05 18:09:54 +00:00
Mike J Innes
2a2475a9c2 get tracker graph 2018-02-05 17:40:07 +00:00
Mike J Innes
14086b8c2d train forward pass 2018-02-02 17:48:08 +00:00
Mike J Innes
9a6fcf057b hook up interface 2018-02-02 16:42:18 +00:00
Mike J Innes
b1c5786012 Merge branch 'master' into curnn 2018-02-02 15:56:44 +00:00
Mike J Innes
49e1e78f67 make data/value available 2018-02-02 15:56:04 +00:00
Mike J Innes
0f1e7b5578 update rnn structure 2018-02-01 20:57:39 +00:00
Mike J Innes
106502a75d typo 2018-01-31 21:57:04 +00:00
Mike J Innes
af3ccf85ff coagulate gates 2018-01-31 16:56:27 +00:00
Mike J Innes
4bfb603da6 gru forward 2018-01-31 13:46:55 +00:00
Mike J Innes
b1bb05403c basic forward pass 2018-01-30 18:18:37 +00:00
Mike J Innes
0b886507dc param offsets 2018-01-30 14:43:39 +00:00
Mike J Innes
af0c5523ff rnnTrainingReserveSize 2018-01-30 14:43:39 +00:00
Mike J Innes
3fb83d642d rnnWorkspaceSize 2018-01-30 14:43:39 +00:00
Mike J Innes
6b4e114d5d rnnParamSize 2018-01-30 14:43:39 +00:00
Mike J Innes
ee6c3e18a9 basic RNNDesc 2018-01-30 14:43:39 +00:00
Mike J Innes
842bf03051 typo 2018-01-30 14:43:05 +00:00
Mike J Innes
0c9549c469 rm lazy 2018-01-24 13:28:52 +00:00
Mike J Innes
5118ef9163 remove batching work for now 2018-01-24 13:12:38 +00:00
boathit
374d7a5f1e Registering backward function for logsoftmax 2018-01-21 15:20:59 +08:00
Mike J Innes
72eabde373 load data 2018-01-17 16:39:55 +00:00
Mike J Innes
bd57359535 docstrings 2018-01-17 16:12:12 +00:00
Mike J Innes
8cca7accf2 mnist 2018-01-17 15:55:37 +00:00
Mike J Innes
4207fb98f2 basic GPU tests 2018-01-16 17:58:14 +00:00
GenaBitu
bc8a32bc56
Merge branch 'master' into cat-fix 2018-01-16 11:01:31 +01:00
Mike J Innes
1beb30e19a closes #118 2018-01-15 17:00:47 +00:00
Mike J Innes
8f8589a7f4 fix initialisation 2018-01-10 14:11:52 +00:00
Mike J Innes
b44237468e Merge branch 'master' into gru 2018-01-10 13:59:33 +00:00
Mike J Innes
805cb9178f fixes #146 2018-01-10 12:48:50 +00:00
Mehul Tikekar
2fef799109 fix typo in conv.jl (fixes #133) 2018-01-08 16:46:58 -05:00
Mike J Innes
468f641f66 use Adapt 2018-01-08 16:34:22 +00:00
Mike J Innes
98b362729d pool padding 2017-12-18 18:18:14 +00:00
Mike J Innes
e3577d759c conv docs 2017-12-18 18:05:48 +00:00
Mike J Innes
269d8f36b9 conv padding 2017-12-18 18:05:38 +00:00
Mike J Innes
51f93d9f0e conv polish 2017-12-15 16:24:45 +00:00
Mike J Innes
386eafc443 reshape 2017-12-15 16:18:16 +00:00
Mike J Innes
73ae25289d remove old util 2017-12-15 16:18:01 +00:00
Mike J Innes
6890a61587 todo 2017-12-15 16:17:45 +00:00
Mike J Innes
9b833a4345 more onehot indexing 2017-12-15 16:17:39 +00:00
Mike J Innes
9d0dd9fb7e layer wip 2017-12-15 13:22:57 +00:00
Mike J Innes
0bf22dfb8e pool gradients 2017-12-15 02:29:14 +00:00