Commit Graph

1299 Commits

Author SHA1 Message Date
Dhairya Gandhi
1af7a53e1f housekeeping: removed commented code 2018-08-20 18:10:20 +05:30
Mike Innes
5a023a9ccc WIP 1.0 support
closes #353
2018-08-20 13:08:04 +01:00
Dhairya Gandhi
756207e782 added docs 2018-08-20 14:20:33 +05:30
Dhairya Gandhi
51578177a5 removed arguments from StopException 2018-08-20 14:08:23 +05:30
Dhairya Gandhi
df22bc5c8f removed argument from stop function 2018-08-20 14:02:09 +05:30
Dhairya Gandhi
06db6ed314 housekeeping: fixing typo 2018-08-20 13:48:28 +05:30
Dhairya Gandhi
394b4167ce moving stop to Optimise 2018-08-20 13:43:08 +05:30
Dhairya Gandhi
06aad375fc properly importing functions 2018-08-20 13:35:55 +05:30
Dhairya Gandhi
e239eb1105 properly importing functions 2018-08-20 13:30:05 +05:30
Dhairya Gandhi
1228e9c5e2 removed include statement 2018-08-19 22:55:14 +05:30
Dhairya Gandhi
9c98272cf0 catching exception 2018-08-19 17:38:00 +05:30
Dhairya Gandhi
257e2a7d2e checking exception 2018-08-19 17:11:11 +05:30
Dhairya Gandhi
5c42c8689c printing exception 2018-08-19 17:04:31 +05:30
Dhairya Gandhi
b0f83f93ff exported StopException 2018-08-19 16:41:13 +05:30
Dhairya Gandhi
a53a5c8350 exporting stop 2018-08-19 15:31:33 +05:30
Dhairya Gandhi
fbd82a6925 added end 2018-08-19 15:19:45 +05:30
Dhairya Gandhi
8229c8e045 modified training loop 2018-08-19 15:17:07 +05:30
Dhairya Gandhi
2aa057ec08 fixed throwing exception 2018-08-19 14:54:54 +05:30
Dominique Luna
f2021d41ac initn -> init 2018-08-18 14:18:50 -04:00
Dominique Luna
3f42301e07 recurrent bug fixes 2018-08-18 11:50:52 -04:00
Dhairya Gandhi
887bfad312 returning :stop 2018-08-18 08:28:47 +05:30
Dhairya Gandhi
65a5ecccd2 returning 2018-08-18 08:24:49 +05:30
Dhairya Gandhi
999b00b64d fixed typo 2018-08-17 19:45:10 +05:30
Dhairya Gandhi
0524964400 fixed typo 2018-08-17 19:40:48 +05:30
Dhairya Gandhi
8ad72e51ea added function to stop training 2018-08-17 19:33:51 +05:30
Dhairya Gandhi
24a3bce495 added stop to break training loop 2018-08-17 17:46:13 +05:30
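The early-stopping mechanism built up in the commits above (a `stop` function that throws a `StopException`, which the training loop catches) can be sketched as follows. This is a Python illustration of the idea only, not Flux's Julia implementation; the `train` helper and its signature are hypothetical.

```python
class StopException(Exception):
    """Signal, from inside a callback, that training should end early."""


def stop():
    """Break out of the enclosing training loop."""
    raise StopException()


def train(data, step, callback=lambda: None):
    """Run `step` on each batch; the callback may call stop() to halt.

    Returns the number of batches actually processed.
    """
    seen = 0
    try:
        for batch in data:
            step(batch)
            seen += 1
            callback()
    except StopException:
        pass  # graceful early exit, mirroring the catch in the commits above
    return seen


# Stop after two batches via the callback.
state = {"n": 0}

def cb():
    state["n"] += 1
    if state["n"] >= 2:
        stop()

n = train([1, 2, 3, 4], step=lambda b: None, callback=cb)  # n == 2
```

Raising an exception (rather than returning a sentinel like `:stop`) lets the signal escape from arbitrarily deep inside a callback, which is why the later commits in this series switched to it.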
femtocleaner[bot]
2d80f68087 Fix deprecations 2018-08-14 16:46:23 +00:00
Simon
a43127f881 fix copy_transpose! 2018-08-15 12:16:12 +02:00
ayush1999
4683e925d4 Final changes 2018-08-12 11:38:48 +01:00
Josh Christie
59bdff2cae Test 0.7 and 1.0 2018-08-11 14:58:29 +01:00
Josh Christie
c8307a0627 Use @info for logging 2018-08-11 14:42:33 +01:00
Josh Christie
710a65fe72 Fix back scalar with a Ref and fix diagonal test 2018-08-11 14:36:33 +01:00
Avik Pal
5db7a3a3ad Fix Optimizers 2018-08-11 18:23:47 +05:30
Avik Pal
355091b9d1 Merge removing conflicts 2018-08-11 18:01:27 +05:30
Josh Christie
837e03613f Updates for julia 1.0 2018-08-11 13:23:02 +01:00
Avik Pal
d3c78a80be Fix layers errors 2018-08-11 17:20:27 +05:30
Avik Pal
4bd13c448f Add updates for julia0.7 2018-08-11 15:23:40 +05:30
Josh Christie
5186e3ba18 Updates for julia 1.0 2018-08-11 10:51:07 +01:00
Avik Pal
3b448ce1ac Merge branch 'master' into cudnn_batchnorm 2018-08-11 15:02:55 +05:30
Avik Pal
3affed8ef0 Remove track_kw 2018-08-10 03:21:05 +05:30
Mike J Innes
62d594af43 out of place gradients for collect 2018-08-07 22:09:20 +01:00
Avik Pal
a0ec472a4b Merge branch 'master' into depthwiseconv 2018-08-08 01:20:37 +05:30
Mike J Innes
7103a0ed7d tweaks 2018-08-03 15:19:10 +01:00
pevnak
926411a449 removed most errors; the only one in the Fallbacks test persists 2018-08-03 15:14:25 +01:00
pevnak
c657d4e47f fixed the sum as suggested by mike 2018-08-03 15:14:25 +01:00
Simon Mandlik
02f343d44d fixed more dep warns, also in tests, but maximum, minimum and size in array.jl still need to be updated. As a result, some more tests may not pass for the time being 2018-08-03 15:14:25 +01:00
Simon Mandlik
0471c489e6 depwarns 2018-08-03 15:14:25 +01:00
pevnak
3510c837a8 zeros replaced by zero 2018-08-03 15:14:25 +01:00
pevnak
ea38c7dbea some more changes 2018-08-03 15:14:25 +01:00
pevnak
d6f5baee39 fixed fixes proposed by Carlo 2018-08-03 15:14:25 +01:00
pevnak
8ab209126d removed zeros fix 2018-08-03 15:14:25 +01:00
pevnak
e98538673a updated sum to be compliant with latest beta. Removed some depwarns 2018-08-03 15:14:25 +01:00
Mike J Innes
e5b3d27016 track_kw should be unnecessary 2018-08-03 15:14:10 +01:00
Avik Pal
4d17a1a809 Merge branch 'master' into depthwiseconv 2018-08-03 19:41:50 +05:30
Avik Pal
6a41f823c8 Update track function 2018-08-03 19:06:05 +05:30
Avik Pal
b4ba7df03a Merge branch 'master' of https://github.com/FluxML/Flux.jl into cudnn_batchnorm 2018-08-03 18:55:46 +05:30
Mike Innes
f5c9361617 matmul fix 2018-08-03 13:02:47 +01:00
Mike Innes
4cf6bac0c1 fix hook 2018-08-03 13:02:47 +01:00
Mike J Innes
70718e7a64 update treelike 2018-08-03 13:02:47 +01:00
Mike J Innes
d782b33701 syntax 2018-08-03 13:02:47 +01:00
Mike J Innes
85fd77d70a linalg deprecations 2018-08-03 13:02:47 +01:00
Mike J Innes
89872c5a8b val deprecations 2018-08-03 13:02:47 +01:00
Mike J Innes
474f578517 ObjectIdDict -> IdDict 2018-08-03 13:02:47 +01:00
Mike J Innes
aa209ee137 no longer needed 2018-08-03 13:02:47 +01:00
Mike J Innes
00cfe24d66 fix cat 2018-08-03 13:02:47 +01:00
Mike J Innes
adc216f182 fix broadcasting 2018-08-03 12:56:32 +01:00
Mike J Innes
e486c50610 fix data 2018-08-03 12:56:31 +01:00
Mike J Innes
fb8a220659 fix matmul 2018-08-03 12:56:31 +01:00
Mike J Innes
7057ca739e fix std usage 2018-08-03 12:56:27 +01:00
Mike J Innes
88a265154c deprecations 2018-08-03 12:54:31 +01:00
Mike J Innes
b18b51656c requires update 2018-08-03 12:54:24 +01:00
Mike J Innes
a49e2eae41 deprecated Void 2018-08-03 12:53:52 +01:00
Mike J Innes
1fd49c2a90 fix array show 2018-08-03 12:53:52 +01:00
Yueh-Hua Tu
5b37319289 Add Maxpool and Meanpool 2018-08-01 00:10:53 +08:00
Mike J Innes
a8ccc79f61 perf hacks 2018-07-30 20:08:44 +01:00
Avik Pal
2cc0f112f1 Updates 2018-07-27 20:12:49 +05:30
Avik Pal
7dd5ec16c9 Fix 2018-07-17 11:22:12 +05:30
Avik Pal
531ecccd38 Error statement 2018-07-17 10:14:23 +05:30
Avik Pal
4035641f00 Remove imports 2018-07-17 10:06:26 +05:30
Avik Pal
0bb3eaa1f6 Update CUDNN Batchnorm with new Flux AD 2018-07-17 09:40:20 +05:30
Avik Pal
646db81f94 Pull BatchNorm CPU updates 2018-07-17 09:24:38 +05:30
CarloLucibello
071dcdda87 update docs 2018-07-16 07:32:13 +02:00
CarloLucibello
185e9148b6 fix cpu batchnorm 2018-07-16 07:11:33 +02:00
Avik Pal
2664a16556 Update as per new AD 2018-07-13 14:12:46 +05:30
Avik Pal
0aabf9d86b Merge branch 'master' into depthwiseconv 2018-07-13 14:04:19 +05:30
Mike J Innes
a0fd91b866
Merge pull request #307 from jarvist/master
Add ADAMW "Fixing Weight Decay Regularization in Adam"
2018-07-11 19:12:58 +01:00
Mike J Innes
dda51a0140 update docs 2018-07-11 15:31:22 +01:00
Mike Innes
10a169bb77 update cudnn rnn 2018-07-10 18:16:37 +01:00
Mike J Innes
70b5efeb4e basic nested AD 2018-07-10 09:03:09 +01:00
Mike J Innes
80af9a3830 broadcast efficiency 2018-07-09 23:40:07 +01:00
Mike J Innes
e763c342ee shave some memory 2018-07-09 19:44:14 +01:00
Mike J Innes
1430053b69 checkpoints 2018-07-09 17:52:34 +01:00
Mike J Innes
7778d17884 functional API 2018-07-09 16:57:44 +01:00
Mike J Innes
5e319c7395 fix gradient definitions 2018-07-09 13:39:10 +01:00
Mike J Innes
41b9412439 new grad api 2018-07-09 13:36:46 +01:00
Jarvist Moore Frost
344a750770 Merge branch 'master' of github.com:jarvist/Flux.jl into HEAD 2018-07-03 11:15:43 +01:00
Jarvist Moore Frost
aee4a83c55 Add ADAMW weight-decay.
See http://www.fast.ai/2018/07/02/adam-weight-decay/ and the original
paper https://arxiv.org/abs/1711.05101.pdf for context.

I don't know what I'm doing, and this is quite possibly wrong - but on
a simple Char-RNN I have lying around on my hard disk, this seems to
improve the rate of learning consistently for different hyperparameters
vs. standard ADAM with the same decay constant.
2018-07-03 11:11:32 +01:00
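The commit above adds ADAMW, i.e. Adam with the weight decay decoupled from the gradient as in the linked paper. A minimal single-parameter sketch of that update rule, written in Python for illustration (the function name and `state` dict are hypothetical; Flux's actual optimiser is Julia):

```python
import math

def adamw_step(w, grad, state, lr=1e-3, beta1=0.9, beta2=0.999,
               eps=1e-8, wd=1e-2):
    """One ADAMW update on a scalar parameter.

    Unlike L2 regularisation folded into the gradient, the decay term
    `wd` is applied directly to the weight (decoupled weight decay).
    """
    state["t"] += 1
    state["m"] = beta1 * state["m"] + (1 - beta1) * grad
    state["v"] = beta2 * state["v"] + (1 - beta2) * grad * grad
    # Bias-corrected first and second moment estimates, as in plain Adam.
    m_hat = state["m"] / (1 - beta1 ** state["t"])
    v_hat = state["v"] / (1 - beta2 ** state["t"])
    # Decoupled decay: shrink the weight itself, not the gradient.
    return w - lr * (m_hat / (math.sqrt(v_hat) + eps) + wd * w)

state = {"t": 0, "m": 0.0, "v": 0.0}
w = adamw_step(1.0, grad=0.5, state=state)  # slightly below 1.0
```

The only difference from Adam is the final `wd * w` term sitting outside the adaptive rescaling, which is what makes the decay behave consistently across learning rates.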
Mike J Innes
ce88273880 gradient hook 2018-07-02 13:19:13 +01:00
Mike Innes
5d8b63dc65 avoid implementation details in docs 2018-06-29 13:53:50 +01:00
Avik Pal
e3b10691d2 make cache optional param 2018-06-28 15:27:59 +05:30
Avik Pal
bcf094451c Fix typo 2018-06-28 14:45:35 +05:30
Avik Pal
d0b79e71e2 fix load error 2018-06-28 14:27:50 +05:30
Avik Pal
7ac9e191cb Revert 1 change 2018-06-28 14:25:22 +05:30
Avik Pal
5ccde88ce6 Minor fix for 5D support 2018-06-28 14:21:17 +05:30
Avik Pal
681d8c4dfc Remove cache 2018-06-28 12:11:32 +05:30
Avik Pal
8f43258ab7 Get the batchnorm working without cache 2018-06-28 12:04:25 +05:30
Avik Pal
4916c8e6da Add treelike for now 2018-06-27 14:54:49 +05:30
Matthew Kelley
864d72eef5 Overload Base.eps() for TrackedReal 2018-06-26 23:55:43 -06:00
Matthew Kelley
0e95be3326 Call Flux.Tracker.data() on ŷ for bce 2018-06-26 14:48:51 -06:00
Matthew Kelley
ed032cdb1e Change epsilon value to eps(ŷ) 2018-06-26 12:29:06 -06:00
Matthew Kelley
e08fd7a6d2 Added epsilon term to binarycrossentropy 2018-06-26 11:43:16 -06:00
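The epsilon commits above guard `binarycrossentropy` against `log(0)`. A sketch of the guarded loss in Python (illustrative only; Flux's version operates on tracked Julia arrays and uses `eps(ŷ)`):

```python
import math
import sys

def binarycrossentropy(y_hat, y, eps=sys.float_info.epsilon):
    """Binary cross-entropy with the prediction clamped away from 0 and 1.

    Without the epsilon, y_hat == 0 or y_hat == 1 makes log() return
    -inf, blowing up the loss and its gradient - the failure mode the
    commits above fix.
    """
    p = min(max(y_hat, eps), 1 - eps)
    return -(y * math.log(p) + (1 - y) * math.log(1 - p))

binarycrossentropy(0.0, 0.0)  # finite, instead of nan/inf
```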
Mike J Innes
88c16e62dd fixes #284 2018-06-26 15:09:26 +01:00
Mike J Innes
836e3872b6 style 2018-06-26 15:09:21 +01:00
Mike J Innes
2723c9ee04
Merge pull request #257 from staticfloat/sf/back_inf_nan
Check for `Inf` and `NaN` within `back!(::TrackedReal)`
2018-06-26 14:42:33 +01:00
Mike J Innes
0a04e3ba61 Chain activations 2018-06-26 14:30:46 +01:00
Mike J Innes
7726a5b605 inferrable 2018-06-26 14:12:57 +01:00
Mike J Innes
3b575930ca Merge branch 'master' into scalar_pad_stride 2018-06-26 14:05:07 +01:00
Mike Innes
7e3cf45ee4 better error 2018-06-25 11:36:52 +01:00
Avik Pal
24ba1c4e6c Make changes as per the review 2018-06-23 11:02:41 +05:30
Mike J Innes
aea1e73cde scalar gradients 2018-06-21 13:12:42 +01:00
Avik Pal
91850a8baf Add missing path to curnn.jl 2018-06-20 18:46:42 +05:30
Avik Pal
deb4950261 Make cuDNN take only 4D arrays 2018-06-20 15:54:38 +05:30
Avik Pal
3339ad5181 Integrate cudnn BatchNorm with Flux 2018-06-20 15:50:30 +05:30
Avik Pal
714ca23aba Change default value of epsilon to prevent CuDNN BatchNorm warnings 2018-06-20 12:11:22 +05:30
Avik Pal
185f34d9fe Add working backward pass 2018-06-20 12:09:54 +05:30
Avik Pal
bc47d02b3f Remove unnecessary imports 2018-06-17 12:40:01 +05:30
Avik Pal
af5ab7f9ef Fix Tensor Descriptor Bug 2018-06-17 12:28:02 +05:30
Avik Pal
c6dcf079ce Update file structure and make function calls correct 2018-06-17 11:47:49 +05:30
Avik Pal
24d13ac326 Fix missing parenthesis 2018-06-12 21:32:56 +05:30
Avik Pal
f12e367cab Adding untested backward pass code 2018-06-12 18:26:09 +05:30
Avik Pal
a83e5d696d Typo 2018-06-12 17:51:52 +05:30
Avik Pal
d4b066fdf9 Forward Pass for BatchNorm Added 2018-06-12 17:49:21 +05:30
Avik Pal
65f2c33991
Merge pull request #2 from FluxML/master
rebase
2018-06-11 15:40:57 +05:30
Avik Pal
b59da95786 Merge branch 'depthwiseconv' of https://github.com/avik-pal/Flux.jl into depthwiseconv 2018-06-09 13:11:42 +05:30
Avik Pal
5d7ee884b8 Fix error while backpropagating 2018-06-09 13:04:49 +05:30
Avik Pal
7f3d11cae0 Merge branch 'master' into depthwiseconv 2018-06-09 11:06:07 +05:30
Avik Pal
1d93fb8e59 Add new constructor and fix a typo in display 2018-06-09 11:02:15 +05:30
Tejan Karmali
d20771d6be
Default value of dilation
dilation should be 1 by default
2018-06-09 02:29:46 +05:30
Tejan Karmali
4a24b69976 Merge branch 'master' into nadam-opt 2018-06-08 16:54:41 +05:30
Mike J Innes
4915b0c8dd
Merge pull request #268 from staticfloat/patch-2
Add `dilation` kwarg to `Conv`
2018-06-07 13:49:02 +01:00
Mike J Innes
af8f3348eb
Merge pull request #270 from staticfloat/sf/tracked_repeat
Add `TrackedArray` support for `repeat(x; inner, outer)`
2018-06-06 17:34:58 +01:00
Mike Innes
2370bdbe91 see #205 2018-06-06 17:01:28 +01:00
Avik Pal
33a7f545b7 Merge branch 'master' into depthwiseconv 2018-05-30 15:58:35 +05:30
Avik Pal
cd6a0856d5 Adds support for Depthwise Convolutions 2018-05-30 15:53:57 +05:30
staticfloat@gmail.com
f390a39d77 Add TrackedArray support for repeat(x; inner, outer) 2018-05-22 17:41:05 -07:00
Elliot Saba
e6efca4bf4 Add dilation kwarg to Conv
Now that we have dilated convolution support in `NNlib`, this enables support in Flux's `Conv` layer.
2018-05-21 13:44:13 -07:00
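The effect of the `dilation` keyword added above is easiest to see in the output-size arithmetic: dilation `d` spreads a length-`k` kernel over an effective window of `d*(k-1)+1` input positions. A small helper, written in Python as an illustration (the function name is hypothetical, not part of NNlib or Flux):

```python
def conv_output_length(n, k, stride=1, pad=0, dilation=1):
    """Spatial output length of a 1-D convolution.

    With dilation d, the k kernel taps cover d*(k-1)+1 input positions,
    so the usual (n + 2*pad - k)/stride + 1 formula uses that effective
    kernel size instead of k.
    """
    effective_k = dilation * (k - 1) + 1
    return (n + 2 * pad - effective_k) // stride + 1

conv_output_length(10, 3)              # plain conv     -> 8
conv_output_length(10, 3, dilation=2)  # dilated conv   -> 6
```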
James Bradbury
af12f006f2
Use broadcast for dropout
Should be fast enough on GPU now that it's not going to be an optimization target again for a while. Hopefully isn't meaningfully slower on CPU?
2018-05-20 04:04:33 -07:00
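The dropout change above replaces an in-place loop with a single broadcasted mask multiply. A sketch of that formulation in Python (illustrative only; Flux's version is a broadcast over GPU arrays):

```python
import random

def dropout(xs, p, training=True, rng=random.random):
    """Inverted dropout via an elementwise (broadcast-style) mask.

    Each activation is zeroed with probability p and the survivors are
    rescaled by 1/(1-p), so the expected activation is unchanged. In an
    array framework the whole thing is one broadcast multiply, which is
    why it maps well onto the GPU.
    """
    if not training or p == 0:
        return list(xs)
    scale = 1 / (1 - p)
    return [x * scale if rng() > p else 0.0 for x in xs]
```

At test time (`training=False`) the input passes through untouched, so no rescaling is needed at inference.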
staticfloat@gmail.com
9fdbe843ef Check for Inf and NaN within back!(::TrackedReal)
This is often checked for within user code; there's no reason to make users
do that, so let's do it for them within `back!(::TrackedReal)`.
2018-05-07 15:30:44 -07:00
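The guard described above sits at the single entry point of backpropagation instead of in every user's training loop. A minimal Python sketch of the idea (the `back` function here is hypothetical; Flux's real `back!` propagates through a tracked graph):

```python
import math

def back(grad):
    """Reject non-finite loss gradients before propagating them.

    Checking once here, at the start of the backward pass, replaces the
    ad-hoc Inf/NaN checks users would otherwise scatter through their
    own code.
    """
    if not math.isfinite(grad):
        raise ValueError(f"loss gradient is not finite: {grad}")
    return grad  # a real tracker would now walk the graph backwards
```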
Mike J Innes
24ad384a38
Merge pull request #243 from gustafsson/catdim
Support for hcat and cat
2018-05-07 13:04:31 +01:00
Mike Innes
ef9077d9fa style 2018-05-07 13:03:52 +01:00
Mike Innes
b59161a41e export Tracker again 2018-05-05 17:15:18 +01:00
Johan Gustafsson
5fc6190956 RowVector tests 2018-05-02 16:10:39 +02:00
Johan Gustafsson
94bb064a0f more tests of array promotion for concatenation
# Conflicts:
#	test/tracker.jl
2018-05-02 16:00:29 +02:00
Johan Gustafsson
1c189c62ed cat with multiple dims #156
Co-authored-by: americast <sayan.sinha@iitkgp.ac.in>
2018-05-02 15:59:46 +02:00
Johan Gustafsson
fb68529169 define back function right after forward function 2018-05-02 15:59:46 +02:00
Johan Gustafsson
509a2e59f6 cat promotions and mixed ranks 2018-05-02 15:59:46 +02:00
Johan Gustafsson
eaaf5fd34c vcat arrays with ndims>2 2018-05-02 15:59:46 +02:00
Johan Gustafsson
bcef5c4ab5 Support hcat and cat 2018-05-02 15:59:46 +02:00
Mike J Innes
7d7d89569c rm this deprecation for 0.6 2018-05-01 12:20:36 +01:00
Mike J Innes
9a7e6e9c5c hold off on some things 2018-05-01 12:18:56 +01:00
CarloLucibello
e186b958dd more exports 2018-05-01 12:13:14 +01:00
Mike J Innes
ee89a7797e
Merge pull request #245 from freeboson/adamax
Add AdaMax optimizer
2018-05-01 11:28:07 +01:00
Mike J Innes
5efbaddb97
Merge pull request #249 from ninjin/nin/minimum
[RFC] Backpropagation for `maximum` and `minimum`
2018-04-30 18:40:42 +01:00
Mike J Innes
73a51400b6 better error message 2018-04-30 12:09:15 +01:00
Pontus Stenetorp
cfd29b9c76 Backpropagation for maximum and minimum 2018-04-29 13:52:54 +01:00
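The backward rule for `maximum` added in the commit above routes the upstream gradient to the position that produced the maximum and zero everywhere else (`minimum` is symmetric). A Python sketch of that rule for a flat list (illustrative; the name `max_backward` is not Flux's API):

```python
def max_backward(xs, upstream):
    """Gradient of maximum(xs) with respect to xs.

    The upstream gradient flows to the (first) position holding the
    maximum, zero elsewhere - the usual subgradient choice when the
    maximum is tied.
    """
    i = max(range(len(xs)), key=lambda j: xs[j])
    grad = [0.0] * len(xs)
    grad[i] = upstream
    return grad

max_backward([1.0, 3.0, 2.0], 1.0)  # -> [0.0, 1.0, 0.0]
```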
Sujeet Akula
8c042bd522 elementwise max() 2018-04-26 21:12:31 +10:00
Sujeet Akula
5e5f255f81 export typo 2018-04-26 17:42:04 +10:00
Sujeet Akula
4586bda5ab export/test adamax 2018-04-26 17:40:11 +10:00
Sujeet Akula
b6508e2416 add adamax 2018-04-26 17:37:24 +10:00
Mike J Innes
baff20514d gpu broadcast fix 2018-04-17 18:05:58 +01:00
Mike J Innes
8f73dc6e14 fix gpu cross entropy 2018-04-17 17:56:47 +01:00
tejank10
2ef25775c6 removed extra expand and fixed bug 2018-04-16 01:18:26 +05:30
Mike Innes
d12fb98f2a nicer batchnorm shape error 2018-04-15 20:29:25 +01:00
tejank10
2f5473d435 added expand in conv constructor 2018-04-16 00:59:11 +05:30
Mike J Innes
8f29968c32
Merge pull request #207 from safnuk/pull-request/07b0f95d
BatchNorm for convolutions
2018-04-15 20:10:33 +01:00
Mike J Innes
683a73fed3 download info 2018-04-15 20:09:30 +01:00
Mike J Innes
5fd240f525 interface tweaks 2018-04-15 20:04:42 +01:00
Mike J Innes
73a0be3e04 Merge branch 'master' into pull-request/07b0f95d 2018-04-15 17:10:29 +01:00
Mike J Innes
642543808e
Merge pull request #226 from CarloLucibello/reshape
fix reshape
2018-04-15 16:53:21 +01:00
tejank10
b080f5c82e Scalar pad and stride 2018-04-15 20:32:40 +05:30
Mike J Innes
cb3ae8df6a rename normalise.jl 2018-04-15 15:45:46 +01:00
Mike J Innes
b05e755068 rm jit from cuda 2018-04-15 15:08:58 +01:00
tejank10
5cc681317a added stride for pooling in tracker 2018-04-15 15:07:04 +01:00
tejank10
f6097d58d6 Scalar pad/stride for Conv constructor 2018-04-15 12:15:41 +05:30
Mike Innes
9d7164f15f we'll do this differently 2018-04-14 02:09:35 +01:00
tejank10
65847bb745 moved epsilon into sqrt 2018-04-04 15:25:20 +05:30
tejank10
3ead662987 Update rule fixed 2018-04-04 15:18:44 +05:30
CarloLucibello
b415333233 fix reshape 2018-04-02 16:09:57 -04:00
tejank10
ea9b5471fa NADAM optimizer 2018-04-03 01:27:22 +05:30
Brad Safnuk
b9a66c679d Fix error in initialization of σ. 2018-03-22 22:20:21 -04:00
Brad Safnuk
35299d4621 Fix type instability when loading onto a gpu.
Also fixes Issue #216.
2018-03-22 21:32:32 -04:00
Mike J Innes
4320738d87 fix 2018-03-21 11:25:47 +00:00
Mike Innes
1c5f8e3534 ndims for shapes 2018-03-16 14:42:08 +00:00
Brad Safnuk
db2d9efb72 Update BatchNorm documentation 2018-03-15 21:59:38 -04:00
Brad Safnuk
6653ec86d9 Allow multidimensional inputs to batchnorm.
Can be used in conjunction with convolutional layers, in addition
to dense layers, with the same api.
2018-03-15 21:48:59 -04:00
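The multidimensional batchnorm described above normalises over every axis except the channel axis, so the same layer works after dense and convolutional layers. A sketch of that computation for an N×C batch in plain Python (illustrative only; for convolutions the spatial axes are folded into the same per-channel statistics):

```python
import math

def batchnorm(x, eps=1e-5):
    """Normalise a batch over every axis except the channel axis.

    `x` is a list of samples, each a list of per-channel values
    (shape N x C). Each channel is centred and scaled by its own
    batch mean and variance; eps keeps the division stable.
    """
    n, c = len(x), len(x[0])
    out = [[0.0] * c for _ in range(n)]
    for ch in range(c):
        col = [x[i][ch] for i in range(n)]
        mu = sum(col) / n
        var = sum((v - mu) ** 2 for v in col) / n
        for i in range(n):
            out[i][ch] = (x[i][ch] - mu) / math.sqrt(var + eps)
    return out
```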
Mike J Innes
e931552f7d
Merge pull request #200 from chengchingwen/repmat
implement `back` of `repmat`
2018-03-15 15:18:48 +00:00
Mike J Innes
5d7edb5aaa
Merge pull request #197 from chengchingwen/master
Implement `prod` for `TrackedArray`
2018-03-15 15:17:24 +00:00
boathit
2ec37790be eliminate ambiguity 2018-03-13 10:50:56 +08:00
boathit
ff2caf032c eliminate ambiguity 2018-03-12 22:48:16 +08:00
Mike J Innes
9ccbac8b80 jit gpu support 2018-03-07 19:18:27 +00:00
chengchingwen
43af3895b0 change prod implementation to avoid small xs 2018-03-07 21:03:13 +08:00
chengchingwen
c00f7f850f implement back of repmat 2018-03-07 20:43:59 +08:00
chengchingwen
7c721475c6 add gradient check for prod and fix dims in back(::typeof(prod),...) 2018-03-07 16:24:44 +08:00
Mike Innes
d21c313ea7 tweaks 2018-03-06 19:58:47 +00:00
Mike Innes
36baa7ec2c convnet primitives 2018-03-06 19:58:05 +00:00
Mike Innes
0802b4d5cf closes #198 2018-03-06 16:56:01 +00:00
Elliot Saba
6445295318 Better download detection 2018-03-06 08:45:45 -08:00
Elliot Saba
19f691d342 Use cache.julialang.org to store ML models
It's annoying that when third party servers go down our tests break.
Let's at least make sure that if our tests break due to server outages
it's our fault.
2018-03-06 08:03:21 -08:00
Mike Innes
3babeeb440 scalar hashing 2018-03-06 13:49:05 +00:00
chengchingwen
86d782a5ce implement prod for TrackedArray 2018-03-06 18:01:19 +08:00
Mike Innes
c95a97f6ae make epochs available 2018-03-06 03:01:56 +00:00
Mike Innes
432b9c3222 loadparams! 2018-03-06 02:45:31 +00:00
Mike Innes
65ed95190a fix 2018-03-05 23:44:25 +00:00
Mike Innes
bfd6a4c0ec cleaner interrupts 2018-03-05 23:05:45 +00:00
Mike Innes
5153cde847 move epochs 2018-03-05 22:56:22 +00:00
Mike J Innes
662439c164 closes #177 2018-03-05 17:24:46 +00:00
Elliot Saba
36295799ee Add permutedims() for tracked arrays 2018-03-02 10:22:28 -08:00
Mike J Innes
8019f789f8 use normal log 2018-03-01 16:35:49 +00:00
Mike J Innes
ac57fc3c26 use @ fix in a few places 2018-03-01 16:31:20 +00:00
Mike J Innes
c2fea2acf6 revert this 2018-02-28 23:06:53 +00:00
Mike J Innes
2eb38eedbf update gpu api 2018-02-28 22:51:08 +00:00
Mike J Innes
ccef9f4dd4 jit softmax 2018-02-28 22:07:35 +00:00
Mike J Innes
7606b1a399 single-batch convolution 2018-02-28 14:25:32 +00:00
Mike J Innes
6bdc2b37a9 inline call 2018-02-28 13:47:14 +00:00
Mike J Innes
a401f08cda compile layers 2018-02-27 22:40:51 +00:00
Mike J Innes
5a32976cbf basic compile step 2018-02-27 21:43:41 +00:00
Mike J Innes
bdb8aae107 move cache logic 2018-02-27 21:41:03 +00:00
Mike J Innes
2c74976602 more general 2018-02-27 01:25:40 +00:00
Mike J Innes
466b5c501a cpu/gpu conveniences 2018-02-26 23:10:59 +00:00
Mike J Innes
15d1d3256b conv api updates 2018-02-26 22:43:07 +00:00
Mike J Innes
54919b8dca rm deprecation 2018-02-22 00:23:02 +00:00
Mike J Innes
491785a681 ignore state in mapleaves 2018-02-22 00:22:51 +00:00
Mike J Innes
ec65e2cec7 fix printing 2018-02-22 00:21:48 +00:00
Mike J Innes
af2e6b7e1d fix 2018-02-22 00:15:38 +00:00
Mike J Innes
99b739cf00 fixes #176 2018-02-21 23:21:20 +00:00
Mike J Innes
e3b4b16e01
Merge pull request #178 from schmrlng/pull-request/e6f55641
Convert OneHot CuArrays to dense CuArrays before passing to CUDNN methods
2018-02-21 22:34:11 +00:00
Mike J Innes
6bdd283fbd no longer necessary 2018-02-21 22:29:31 +00:00
Iblis Lin
043fedde3c
introduce Reexport
- Reexporting NNlib

fix #180
2018-02-21 16:55:20 +08:00
Ed Schmerling
e6f556411a Convert OneHot CuArrays to dense CuArrays before passing to CUDNN methods 2018-02-19 17:32:15 -08:00
Mike J Innes
4035745f6e may help numerical tests 2018-02-19 12:51:02 +00:00
Mike J Innes
989adcdc7d gpu fix 2018-02-17 12:41:53 +00:00
Mike J Innes
11511982a4 numerical stability 2018-02-17 11:56:03 +00:00
Mike J Innes
e5791bc5f6 frequencies utility 2018-02-17 11:19:51 +00:00
Mike J Innes
e3b31b9b87
Merge pull request #169 from jessebett/jessechanges
Reshape with Tuple Dimensions and Kronecker Product
2018-02-16 14:16:42 +00:00
Mike J Innes
60f21d3ff2 don't override base method 2018-02-16 14:15:40 +00:00
Mike J Innes
5e861101f3 epochs util 2018-02-16 11:17:57 +00:00
Mike J Innes
7aa6854c64 more correct 2018-02-16 00:06:15 +00:00
Mike J Innes
ee3784964e fix for external modules 2018-02-15 22:27:00 +00:00
Mike J Innes
63862c2324 easier initialisation with weights 2018-02-15 20:52:29 +00:00
Mike J Innes
01c31e7fcc conv bias 2018-02-15 20:15:41 +00:00
Mike J Innes
bdd07a8bc6 fix 2018-02-14 22:34:11 +00:00
Mike J Innes
1b8b1cd7b1 check params by identity 2018-02-14 21:00:50 +00:00
Mike J Innes
5ea0ef6764 tracker fix 2018-02-13 16:15:36 +00:00
Mike J Innes
1baa7227e3 reorganise batches 2018-02-13 16:05:07 +00:00
Mike J Innes
34217b1fa2 Merge branch 'treebank' 2018-02-13 15:44:27 +00:00
Mike J Innes
49584fb72b rm logsigmoid 2018-02-13 14:52:29 +00:00
Mike J Innes
2f29733888 Merge branch 'master' into HEAD 2018-02-13 14:45:37 +00:00
Mike J Innes
8432d8db06 batchnorm fix 2018-02-13 14:02:35 +00:00
Mike J Innes
820cd3ae42 fixes #164 2018-02-13 13:31:35 +00:00
Mike J Innes
066cb45a38 remove old accuracy fn 2018-02-13 11:12:21 +00:00
Mike J Innes
236edbffec fixes #111 2018-02-13 10:20:38 +00:00
Mike J Innes
f22cfb5b43 re-enable printf 2018-02-12 15:05:09 +00:00
Mike J Innes
334ae9e1cb fixes #171 2018-02-12 12:31:15 +00:00
Mike J Innes
0b3c02fe8d document regularisation, fixes #160 2018-02-09 19:00:26 +00:00
Mike J Innes
0e0057b0c4 basics 2018-02-09 13:51:07 +00:00
jessebett
f84ee8eab0 reshape with tupled dimensions and kronecker product 2018-02-08 14:27:57 -05:00
Mike J Innes
70fbbf48fa humble beginnings of compiler 2018-02-08 18:11:26 +00:00
Mike J Innes
fc157a8c59 TrackedNumber -> TrackedReal 2018-02-08 17:18:40 +00:00
Mike J Innes
d1c56ca768 number fix 2018-02-08 17:04:48 +00:00
Mike J Innes
0f7a1ec022 test params funct 2018-02-08 16:13:20 +00:00
Mike J Innes
961de2ba44
Merge pull request #161 from FluxML/curnn
WIP: CUDNN RNNs
2018-02-08 13:06:52 +00:00
Iblis Lin
f7fdfbe3a9 fix params 2018-02-08 12:56:10 +00:00
Mike J Innes
fcbdc49d6b fix reserve usage 2018-02-08 10:27:26 +00:00
Mike J Innes
bc452fcd81 rewrite tests 2018-02-08 02:37:55 +00:00
Mike J Innes
d592f4e327 batch support 2018-02-08 01:45:48 +00:00
Mike J Innes
b8f148b012 hook up backward passes 2018-02-08 00:49:39 +00:00
Mike J Innes
a1d1930097 Merge branch 'master' into curnn 2018-02-07 23:23:02 +00:00
Mike J Innes
4511936a87 fixes #116 2018-02-07 23:21:04 +00:00
Mike J Innes
0ac924e8e1 fixups 2018-02-07 22:52:46 +00:00
Mike J Innes
39f7f8fdf3 tracked tuples 2018-02-07 22:21:42 +00:00
Mike J Innes
79e4e25fea separate number type 2018-02-07 20:39:36 +00:00
Mike J Innes
282889970d separate tracking infrastructure from array wrapper 2018-02-07 17:43:25 +00:00
Mike J Innes
30b3437c56 backward passes 2018-02-06 18:56:17 +00:00
Mike J Innes
f866fbe575 nullable c refactor 2018-02-06 15:01:48 +00:00
Mike J Innes
07e1b1e0a9 avoid val 2018-02-06 12:44:18 +00:00
boathit
7e37a96c6f Register back! for logsigmoid and implement (logit)binarycrossentropy 2018-02-06 19:36:16 +08:00
boathit
6e65789828 Register back! for logsigmoid and implement (logit)binarycrossentropy 2018-02-06 19:32:46 +08:00
Mike J Innes
a4bf5936b0 diagm 2018-02-05 18:29:35 +00:00
Mike J Innes
2fec75005d
Merge pull request #123 from GenaBitu/cat-fix
Added vcat for multiple TrackedVectors
2018-02-05 18:10:48 +00:00
Mike J Innes
47cebab26e test multiple inputs/dims 2018-02-05 18:09:54 +00:00
Mike J Innes
2a2475a9c2 get tracker graph 2018-02-05 17:40:07 +00:00
Mike J Innes
14086b8c2d train forward pass 2018-02-02 17:48:08 +00:00
Mike J Innes
9a6fcf057b hook up interface 2018-02-02 16:42:18 +00:00
Mike J Innes
b1c5786012 Merge branch 'master' into curnn 2018-02-02 15:56:44 +00:00
Mike J Innes
49e1e78f67 make data/value available 2018-02-02 15:56:04 +00:00
Mike J Innes
0f1e7b5578 update rnn structure 2018-02-01 20:57:39 +00:00
Mike J Innes
106502a75d typo 2018-01-31 21:57:04 +00:00
Mike J Innes
af3ccf85ff coagulate gates 2018-01-31 16:56:27 +00:00
Mike J Innes
4bfb603da6 gru forward 2018-01-31 13:46:55 +00:00
Mike J Innes
b1bb05403c basic forward pass 2018-01-30 18:18:37 +00:00