Dhairya Gandhi
4860c1d48b
fixed white lines
2018-09-11 18:35:21 +05:30
Dhairya Gandhi
d933f2079b
pulled tracker from upstream
2018-09-11 18:30:24 +05:30
Avik Pal
7d06f654f0
Fix tests
2018-09-11 16:58:05 +05:30
Avik Pal
7e7a501efd
Fix tests
2018-09-11 16:32:14 +05:30
Avik Pal
c4f87ff15c
Minor fixes:
2018-09-11 16:21:55 +05:30
Avik Pal
7e83852862
Fixes
2018-09-11 15:58:17 +05:30
Avik Pal
5fd8ffa47e
CuRNN updates
2018-09-11 15:44:07 +05:30
Avik Pal
8bea60d980
Merge branch 'master' into cudnn_batchnorm
2018-09-11 15:34:25 +05:30
Tejan Karmali
e86365ed3f
1.0 fix for conv transpose
2018-09-08 15:44:06 -04:00
James Bradbury
e7783ace12
1.0 compat for normalise
2018-09-06 18:38:11 -07:00
Mike J Innes
6bbed07e96
enable nested broadcast
2018-09-07 02:05:03 +01:00
Dhairya Gandhi
0b440f16ff
Merge branch 'master' of https://github.com/FluxML/Flux.jl
2018-09-06 22:48:03 +06:00
Johnny Chen
44049ce00c
Merge branch 'master' into issue-#354
2018-09-06 09:39:31 -05:00
Mike J Innes
5e4ee827e9
Merge pull request #371 from johnnychen94/issue-#323
...
Fix issue #323
2018-09-06 15:28:15 +01:00
Mike J Innes
ec16a2c77d
todone: nicer syntax on 0.7
2018-09-05 15:55:08 +01:00
Mike J Innes
1e0fd07b09
use expand
2018-09-04 14:30:02 +01:00
Mike J Innes
e6be639436
Merge branch 'master' into HEAD
2018-09-04 14:03:46 +01:00
Mike J Innes
93c4a6b4b5
fixes #343
2018-09-04 13:37:54 +01:00
Mike J Innes
a2d2d068aa
initial sketch
2018-08-28 17:55:59 +05:30
Mike Innes
53be49b102
fix #377
2018-08-28 11:02:38 +01:00
Mike J Innes
fac06751ea
Merge pull request #361 from dhairyagandhi96/with_stop
...
Add stop() to train loop when callback conditions are met
2018-08-28 10:56:15 +01:00
Mike Innes
2ca189bc96
newlines
2018-08-28 10:54:50 +01:00
Dhairya Gandhi
89bca2d98d
remove merge conflicts
2018-08-28 15:14:12 +05:30
Dhairya Gandhi
a964debd8a
fixed example in docs
2018-08-28 15:02:47 +05:30
Johnny Chen
0c4fb9655a
Fix a bug
2018-08-25 15:12:01 +08:00
Johnny Chen
4ac76c35b0
fix MethodError for == and ≈
...
```julia
param([2]).^2 == [4.0]
ERROR: MethodError: ==(::TrackedArray{…,Array{Float64,1}}, ::Array{Float64,1}) is ambiguous. Candidates:
==(x::TrackedArray, y) in Main.Flux.Tracker at /Users/jc/.julia/dev/Flux/src/tracker/array.jl:63
==(A::AbstractArray, B::AbstractArray) in Base at abstractarray.jl:1686
Possible fix, define
==(::TrackedArray, ::AbstractArray)
```
2018-08-25 14:51:40 +08:00
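A minimal sketch of the conventional fix for this kind of ambiguity, assuming `data` unwraps a tracked array; the commit itself contains the real fix in src/tracker/array.jl:
```julia
using Flux.Tracker: TrackedArray, data

# Sketch: define methods at least as specific as both clashing candidates,
# comparing the underlying data so dispatch is no longer ambiguous.
Base.:(==)(x::TrackedArray, y::AbstractArray) = data(x) == y
Base.:(==)(x::AbstractArray, y::TrackedArray) = x == data(y)
```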
Mike Innes
7d6ec2365f
fixes #367
2018-08-24 14:30:39 +01:00
Mike Innes
86cf22675f
rewrite broadcast
2018-08-24 14:07:08 +01:00
Mike Innes
e13d28a7a2
cruft
2018-08-24 13:44:21 +01:00
Dhairya Gandhi
c035fe22d7
added deprecation warning
2018-08-24 13:08:03 +05:30
Yueh-Hua Tu
634d34686e
Add new constructors and test
2018-08-24 10:31:13 +08:00
Mike J Innes
953280d57f
Merge pull request #364 from boathit/master
...
fix argmax and add test
2018-08-23 15:52:06 +01:00
Mike Innes
dcde6d2217
tweaks
2018-08-23 15:44:28 +01:00
Johnny Chen
c9d6b5648f
Fix issue #354
2018-08-23 21:56:32 +08:00
Johnny Chen
6743d52d08
Fix issue #354
2018-08-23 21:34:11 +08:00
Johnny Chen
7bfe431321
Fix issue #323
2018-08-23 20:58:58 +08:00
boathit
6c97846551
rename argmax as onecold
2018-08-23 20:47:43 +08:00
Mike J Innes
6c355e93d2
Merge pull request #363 from pshashk/patch-1
...
Fix repeat
2018-08-23 11:28:13 +01:00
Mike Innes
9d1d5187f3
fix activations for 1.0
2018-08-23 10:56:31 +01:00
boathit
33c901c191
redo
2018-08-23 16:01:42 +08:00
boathit
5dca80bd68
fix argmax and batch deprecations
2018-08-23 13:17:58 +08:00
Dhairya Gandhi
2f1a9847fa
deprecate :stop from optimizers; housekeeping
2018-08-22 21:25:26 +05:30
Dhairya Gandhi
a7ad620f01
exporting stop
2018-08-22 00:33:30 +05:30
Dhairya Gandhi
3d11322d37
fixed docstring and not exporting stop
2018-08-22 00:29:07 +05:30
Dhairya Gandhi
ed044e2df7
changes as requested
2018-08-21 23:22:20 +05:30
boathit
616ed194df
fix argmax and add test
2018-08-21 11:29:57 +08:00
Mike Innes
216d278e7b
fix mnist loader
2018-08-20 16:57:43 +01:00
Mike Innes
3cfecaa4db
test cleanup
2018-08-20 15:38:25 +01:00
Mike Innes
e68b8765b6
broadcast fixes
2018-08-20 14:41:46 +01:00
pshashk
1115eda6af
repeat fix
...
ERROR: UndefVarError: A not defined
2018-08-20 16:11:56 +03:00
Dhairya Gandhi
1af7a53e1f
housekeeping: removed commented code
2018-08-20 18:10:20 +05:30
Mike Innes
5a023a9ccc
WIP 1.0 support
...
closes #353
2018-08-20 13:08:04 +01:00
Dhairya Gandhi
756207e782
added docs
2018-08-20 14:20:33 +05:30
Dhairya Gandhi
51578177a5
removed arguments from StopException
2018-08-20 14:08:23 +05:30
Dhairya Gandhi
df22bc5c8f
removed argument from stop function
2018-08-20 14:02:09 +05:30
Dhairya Gandhi
06db6ed314
housekeeping: fixing typo
2018-08-20 13:48:28 +05:30
Dhairya Gandhi
394b4167ce
moving stop to Optimise
2018-08-20 13:43:08 +05:30
Dhairya Gandhi
06aad375fc
properly importing functions
2018-08-20 13:35:55 +05:30
Dhairya Gandhi
e239eb1105
properly importing functions
2018-08-20 13:30:05 +05:30
Dhairya Gandhi
1228e9c5e2
removed include statement
2018-08-19 22:55:14 +05:30
Dhairya Gandhi
9c98272cf0
catching exception
2018-08-19 17:38:00 +05:30
Dhairya Gandhi
257e2a7d2e
checking exception
2018-08-19 17:11:11 +05:30
Dhairya Gandhi
5c42c8689c
printing exception
2018-08-19 17:04:31 +05:30
Dhairya Gandhi
b0f83f93ff
exported StopException
2018-08-19 16:41:13 +05:30
Dhairya Gandhi
a53a5c8350
exporting stop
2018-08-19 15:31:33 +05:30
Dhairya Gandhi
fbd82a6925
added end
2018-08-19 15:19:45 +05:30
Dhairya Gandhi
8229c8e045
modified training loop
2018-08-19 15:17:07 +05:30
Dhairya Gandhi
2aa057ec08
fixed throwing exception
2018-08-19 14:54:54 +05:30
Dominique Luna
f2021d41ac
initn -> init
2018-08-18 14:18:50 -04:00
Dominique Luna
3f42301e07
recurrent bug fixes
2018-08-18 11:50:52 -04:00
Dhairya Gandhi
887bfad312
returning :stop
2018-08-18 08:28:47 +05:30
Dhairya Gandhi
65a5ecccd2
returning
2018-08-18 08:24:49 +05:30
Dhairya Gandhi
999b00b64d
fixed typo
2018-08-17 19:45:10 +05:30
Dhairya Gandhi
0524964400
fixed typo
2018-08-17 19:40:48 +05:30
Dhairya Gandhi
8ad72e51ea
added function to stop training
2018-08-17 19:33:51 +05:30
Dhairya Gandhi
24a3bce495
added stop to break training loop
2018-08-17 17:46:13 +05:30
femtocleaner[bot]
2d80f68087
Fix deprecations
2018-08-14 16:46:23 +00:00
Simon
a43127f881
fix copy_transpose!
2018-08-15 12:16:12 +02:00
ayush1999
4683e925d4
Final changes
2018-08-12 11:38:48 +01:00
Josh Christie
59bdff2cae
Test 0.7 and 1.0
2018-08-11 14:58:29 +01:00
Josh Christie
c8307a0627
Use @info for logging
2018-08-11 14:42:33 +01:00
Josh Christie
710a65fe72
Fix back scalar with a Ref and fix diagonal test
2018-08-11 14:36:33 +01:00
Avik Pal
5db7a3a3ad
Fix Optimizers
2018-08-11 18:23:47 +05:30
Avik Pal
355091b9d1
Merge removing conflicts
2018-08-11 18:01:27 +05:30
Josh Christie
837e03613f
Updates for julia 1.0
2018-08-11 13:23:02 +01:00
Avik Pal
d3c78a80be
Fix layers errors
2018-08-11 17:20:27 +05:30
Avik Pal
4bd13c448f
Add updates for julia0.7
2018-08-11 15:23:40 +05:30
Josh Christie
5186e3ba18
Updates for julia 1.0
2018-08-11 10:51:07 +01:00
Avik Pal
3b448ce1ac
Merge branch 'master' into cudnn_batchnorm
2018-08-11 15:02:55 +05:30
Avik Pal
3affed8ef0
Remove track_kw
2018-08-10 03:21:05 +05:30
Mike J Innes
62d594af43
out of place gradients for collect
2018-08-07 22:09:20 +01:00
Avik Pal
a0ec472a4b
Merge branch 'master' into depthwiseconv
2018-08-08 01:20:37 +05:30
Mike J Innes
7103a0ed7d
tweaks
2018-08-03 15:19:10 +01:00
pevnak
926411a449
removed most errors; the only one in the Fallbacks test persists
2018-08-03 15:14:25 +01:00
pevnak
c657d4e47f
fixed the sum as suggested by mike
2018-08-03 15:14:25 +01:00
Simon Mandlik
02f343d44d
fixed more dep warns, also in tests, but maximum, minimum and size in array.jl still need to be updated. As a result, some more tests may not pass for the time being
2018-08-03 15:14:25 +01:00
Simon Mandlik
0471c489e6
depwarns
2018-08-03 15:14:25 +01:00
pevnak
3510c837a8
zeros replaced by zero
2018-08-03 15:14:25 +01:00
pevnak
ea38c7dbea
some more changes
2018-08-03 15:14:25 +01:00
pevnak
d6f5baee39
fixed fixes proposed by Carlo
2018-08-03 15:14:25 +01:00
pevnak
8ab209126d
removed zeros fix
2018-08-03 15:14:25 +01:00
pevnak
e98538673a
updated sum to be compliant with latest beta. Removed some depwarns
2018-08-03 15:14:25 +01:00
Mike J Innes
e5b3d27016
track_kw should be unnecessary
2018-08-03 15:14:10 +01:00
Avik Pal
4d17a1a809
Merge branch 'master' into depthwiseconv
2018-08-03 19:41:50 +05:30
Avik Pal
6a41f823c8
Update track function
2018-08-03 19:06:05 +05:30
Avik Pal
b4ba7df03a
Merge branch 'master' of https://github.com/FluxML/Flux.jl into cudnn_batchnorm
2018-08-03 18:55:46 +05:30
Mike Innes
f5c9361617
matmul fix
2018-08-03 13:02:47 +01:00
Mike Innes
4cf6bac0c1
fix hook
2018-08-03 13:02:47 +01:00
Mike J Innes
70718e7a64
update treelike
2018-08-03 13:02:47 +01:00
Mike J Innes
d782b33701
syntax
2018-08-03 13:02:47 +01:00
Mike J Innes
85fd77d70a
linalg deprecations
2018-08-03 13:02:47 +01:00
Mike J Innes
89872c5a8b
val deprecations
2018-08-03 13:02:47 +01:00
Mike J Innes
474f578517
ObjectIdDict -> IdDict
2018-08-03 13:02:47 +01:00
Mike J Innes
aa209ee137
no longer needed
2018-08-03 13:02:47 +01:00
Mike J Innes
00cfe24d66
fix cat
2018-08-03 13:02:47 +01:00
Mike J Innes
adc216f182
fix broadcasting
2018-08-03 12:56:32 +01:00
Mike J Innes
e486c50610
fix data
2018-08-03 12:56:31 +01:00
Mike J Innes
fb8a220659
fix matmul
2018-08-03 12:56:31 +01:00
Mike J Innes
7057ca739e
fix std usage
2018-08-03 12:56:27 +01:00
Mike J Innes
88a265154c
deprecations
2018-08-03 12:54:31 +01:00
Mike J Innes
b18b51656c
requires update
2018-08-03 12:54:24 +01:00
Mike J Innes
a49e2eae41
deprecated Void
2018-08-03 12:53:52 +01:00
Mike J Innes
1fd49c2a90
fix array show
2018-08-03 12:53:52 +01:00
Yueh-Hua Tu
5b37319289
Add Maxpool and Meanpool
2018-08-01 00:10:53 +08:00
Mike J Innes
a8ccc79f61
perf hacks
2018-07-30 20:08:44 +01:00
Avik Pal
2cc0f112f1
Updates
2018-07-27 20:12:49 +05:30
Avik Pal
7dd5ec16c9
Fix
2018-07-17 11:22:12 +05:30
Avik Pal
531ecccd38
Error statement
2018-07-17 10:14:23 +05:30
Avik Pal
4035641f00
Remove imports
2018-07-17 10:06:26 +05:30
Avik Pal
0bb3eaa1f6
Update CUDNN Batchnorm with new Flux AD
2018-07-17 09:40:20 +05:30
Avik Pal
646db81f94
Pull BatchNorm CPU updates
2018-07-17 09:24:38 +05:30
CarloLucibello
071dcdda87
update docs
2018-07-16 07:32:13 +02:00
CarloLucibello
185e9148b6
fix cpu batchnorm
2018-07-16 07:11:33 +02:00
Avik Pal
2664a16556
Update as per new AD
2018-07-13 14:12:46 +05:30
Avik Pal
0aabf9d86b
Merge branch 'master' into depthwiseconv
2018-07-13 14:04:19 +05:30
Mike J Innes
a0fd91b866
Merge pull request #307 from jarvist/master
...
Add ADAMW "Fixing Weight Decay Regularization in Adam"
2018-07-11 19:12:58 +01:00
Mike J Innes
dda51a0140
update docs
2018-07-11 15:31:22 +01:00
Mike Innes
10a169bb77
update cudnn rnn
2018-07-10 18:16:37 +01:00
Mike J Innes
70b5efeb4e
basic nested AD
2018-07-10 09:03:09 +01:00
Mike J Innes
80af9a3830
broadcast efficiency
2018-07-09 23:40:07 +01:00
Mike J Innes
e763c342ee
shave some memory
2018-07-09 19:44:14 +01:00
Mike J Innes
1430053b69
checkpoints
2018-07-09 17:52:34 +01:00
Mike J Innes
7778d17884
functional API
2018-07-09 16:57:44 +01:00
Mike J Innes
5e319c7395
fix gradient definitions
2018-07-09 13:39:10 +01:00
Mike J Innes
41b9412439
new grad api
2018-07-09 13:36:46 +01:00
Jarvist Moore Frost
344a750770
Merge branch 'master' of github.com:jarvist/Flux.jl into HEAD
2018-07-03 11:15:43 +01:00
Jarvist Moore Frost
aee4a83c55
Add ADAMW weight-decay.
...
See http://www.fast.ai/2018/07/02/adam-weight-decay/ and the original
paper https://arxiv.org/abs/1711.05101.pdf for context.
I don't know what I'm doing, and this is quite possibly wrong, but on
a simple Char-RNN I have lying around on my hard disk, this seems to
improve the rate of learning consistently for different hyperparameters
vs. standard ADAM with the same decay constant.
2018-07-03 11:11:32 +01:00
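For context, a minimal sketch of the decoupled weight decay described in the linked paper, with illustrative names and defaults rather than Flux's actual ADAMW code:
```julia
# AdamW idea: take the usual Adam step from the gradient, then shrink the
# parameters directly, instead of folding an L2 penalty into the gradient.
function adamw_step!(θ, g, m, v, t; η = 0.001, β1 = 0.9, β2 = 0.999, ϵ = 1e-8, λ = 0.01)
    @. m = β1 * m + (1 - β1) * g
    @. v = β2 * v + (1 - β2) * g^2
    m̂ = m ./ (1 - β1^t)                 # bias-corrected first moment
    v̂ = v ./ (1 - β2^t)                 # bias-corrected second moment
    @. θ -= η * m̂ / (sqrt(v̂) + ϵ) + η * λ * θ   # weight decay applied to θ itself
    return θ
end
```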
Mike J Innes
ce88273880
gradient hook
2018-07-02 13:19:13 +01:00
Mike Innes
5d8b63dc65
avoid implementation details in docs
2018-06-29 13:53:50 +01:00
Avik Pal
e3b10691d2
make cache optional param
2018-06-28 15:27:59 +05:30
Avik Pal
bcf094451c
Fix typo
2018-06-28 14:45:35 +05:30
Avik Pal
d0b79e71e2
fix load error
2018-06-28 14:27:50 +05:30
Avik Pal
7ac9e191cb
Revert 1 change
2018-06-28 14:25:22 +05:30
Avik Pal
5ccde88ce6
Minor fix for 5D support
2018-06-28 14:21:17 +05:30
Avik Pal
681d8c4dfc
Remove cache
2018-06-28 12:11:32 +05:30
Avik Pal
8f43258ab7
Get the batchnorm working without cache
2018-06-28 12:04:25 +05:30
Avik Pal
4916c8e6da
Add treelike for now
2018-06-27 14:54:49 +05:30
Matthew Kelley
864d72eef5
Overload Base.eps() for TrackedReal
2018-06-26 23:55:43 -06:00
Matthew Kelley
0e95be3326
Call Flux.Tracker.data() on ŷ for bce
2018-06-26 14:48:51 -06:00
Matthew Kelley
ed032cdb1e
Change epsilon value to eps(ŷ)
2018-06-26 12:29:06 -06:00
Matthew Kelley
e08fd7a6d2
Added epsilon term to binarycrossentropy
2018-06-26 11:43:16 -06:00
Mike J Innes
88c16e62dd
fixes #284
2018-06-26 15:09:26 +01:00
Mike J Innes
836e3872b6
style
2018-06-26 15:09:21 +01:00
Mike J Innes
2723c9ee04
Merge pull request #257 from staticfloat/sf/back_inf_nan
...
Check for `Inf` and `NaN` within `back!(::TrackedReal)`
2018-06-26 14:42:33 +01:00
Mike J Innes
0a04e3ba61
Chain activations
2018-06-26 14:30:46 +01:00
Mike J Innes
7726a5b605
inferrable
2018-06-26 14:12:57 +01:00
Mike J Innes
3b575930ca
Merge branch 'master' into scalar_pad_stride
2018-06-26 14:05:07 +01:00
Mike Innes
7e3cf45ee4
better error
2018-06-25 11:36:52 +01:00
Avik Pal
24ba1c4e6c
Make changes as per the review
2018-06-23 11:02:41 +05:30
Mike J Innes
aea1e73cde
scalar gradients
2018-06-21 13:12:42 +01:00
Avik Pal
91850a8baf
Add missing path to curnn.jl
2018-06-20 18:46:42 +05:30
Avik Pal
deb4950261
Make cuDNN take only 4D arrays
2018-06-20 15:54:38 +05:30
Avik Pal
3339ad5181
Integrate cudnn BatchNorm with Flux
2018-06-20 15:50:30 +05:30
Avik Pal
714ca23aba
Change default value of epsilon to prevent CuDNN BatchNorm warnings
2018-06-20 12:11:22 +05:30
Avik Pal
185f34d9fe
Add working backward pass
2018-06-20 12:09:54 +05:30
Avik Pal
bc47d02b3f
Remove unnecessary imports
2018-06-17 12:40:01 +05:30
Avik Pal
af5ab7f9ef
Fix Tensor Descriptor Bug
2018-06-17 12:28:02 +05:30
Avik Pal
c6dcf079ce
Update file structure and make function calls correct
2018-06-17 11:47:49 +05:30
Avik Pal
24d13ac326
Fix missing parenthesis
2018-06-12 21:32:56 +05:30
Avik Pal
f12e367cab
Adding untested backward pass code
2018-06-12 18:26:09 +05:30
Avik Pal
a83e5d696d
Typo
2018-06-12 17:51:52 +05:30
Avik Pal
d4b066fdf9
Forward Pass for BatchNorm Added
2018-06-12 17:49:21 +05:30
Avik Pal
65f2c33991
Merge pull request #2 from FluxML/master
...
rebase
2018-06-11 15:40:57 +05:30
Avik Pal
b59da95786
Merge branch 'depthwiseconv' of https://github.com/avik-pal/Flux.jl into depthwiseconv
2018-06-09 13:11:42 +05:30
Avik Pal
5d7ee884b8
Fix error during backpropagation
2018-06-09 13:04:49 +05:30
Avik Pal
7f3d11cae0
Merge branch 'master' into depthwiseconv
2018-06-09 11:06:07 +05:30
Avik Pal
1d93fb8e59
Add new constructor and fix a typo in display
2018-06-09 11:02:15 +05:30
Tejan Karmali
d20771d6be
Default value of dilation
...
dilation should be 1 by default
2018-06-09 02:29:46 +05:30
Tejan Karmali
4a24b69976
Merge branch 'master' into nadam-opt
2018-06-08 16:54:41 +05:30
Mike J Innes
4915b0c8dd
Merge pull request #268 from staticfloat/patch-2
...
Add `dilation` kwarg to `Conv`
2018-06-07 13:49:02 +01:00
Mike J Innes
af8f3348eb
Merge pull request #270 from staticfloat/sf/tracked_repeat
...
Add `TrackedArray` support for `repeat(x; inner, outer)`
2018-06-06 17:34:58 +01:00
Mike Innes
2370bdbe91
see #205
2018-06-06 17:01:28 +01:00
Avik Pal
33a7f545b7
Merge branch 'master' into depthwiseconv
2018-05-30 15:58:35 +05:30
Avik Pal
cd6a0856d5
Adds support for Depthwise Convolutions
2018-05-30 15:53:57 +05:30
staticfloat@gmail.com
f390a39d77
Add TrackedArray support for repeat(x; inner, outer)
2018-05-22 17:41:05 -07:00
Elliot Saba
e6efca4bf4
Add dilation kwarg to Conv
...
Now that we have dilated convolution support in `NNlib`, this enables support in Flux's `Conv` layer.
2018-05-21 13:44:13 -07:00
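A rough usage sketch of the new keyword (layer sizes made up; see the PR for the exact signature):
```julia
using Flux

# Illustrative only: a 3×3 kernel sampled with gaps of 2 between taps,
# mapping 1 input channel to 16 output channels.
layer = Conv((3, 3), 1 => 16, relu; dilation = 2)
```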
James Bradbury
af12f006f2
Use broadcast for dropout
...
Should be fast enough on GPU now that it's not going to be an optimization target again for a while. Hopefully it isn't meaningfully slower on CPU?
2018-05-20 04:04:33 -07:00
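A minimal sketch of dropout expressed as a single broadcast (inverted dropout; not Flux's exact implementation):
```julia
# Keep each element with probability 1 - p and rescale, so the expected
# activation matches the no-dropout case.
dropout_broadcast(x, p) = x .* (rand(Float32, size(x)) .> p) ./ (1 - p)
```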
staticfloat@gmail.com
9fdbe843ef
Check for Inf and NaN within back!(::TrackedReal)
...
This is often checked for within user code; there's no reason to make users do that, so let's
do it for them within `back!(::TrackedReal)`
2018-05-07 15:30:44 -07:00
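A sketch of the guard being described, written as a hypothetical wrapper around `back!` (the real check lives inside Tracker itself):
```julia
using Flux.Tracker: back!

# Validate the incoming sensitivity before propagating it, so callers don't
# have to sprinkle isinf/isnan checks through their training loops.
function checked_back!(x, Δ)
    isinf(Δ) && error("Loss is Inf")
    isnan(Δ) && error("Loss is NaN")
    back!(x, Δ)
end
```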
Mike J Innes
24ad384a38
Merge pull request #243 from gustafsson/catdim
...
Support for hcat and cat
2018-05-07 13:04:31 +01:00
Mike Innes
ef9077d9fa
style
2018-05-07 13:03:52 +01:00
Mike Innes
b59161a41e
export Tracker again
2018-05-05 17:15:18 +01:00
Johan Gustafsson
5fc6190956
RowVector tests
2018-05-02 16:10:39 +02:00
Johan Gustafsson
94bb064a0f
more tests of array promotion for concatenation
...
# Conflicts:
# test/tracker.jl
2018-05-02 16:00:29 +02:00
Johan Gustafsson
1c189c62ed
cat with multiple dims #156
...
Co-authored-by: americast <sayan.sinha@iitkgp.ac.in>
2018-05-02 15:59:46 +02:00
Johan Gustafsson
fb68529169
define back function right after forward function
2018-05-02 15:59:46 +02:00
Johan Gustafsson
509a2e59f6
cat promotions and mixed ranks
2018-05-02 15:59:46 +02:00
Johan Gustafsson
eaaf5fd34c
vcat arrays with ndims>2
2018-05-02 15:59:46 +02:00
Johan Gustafsson
bcef5c4ab5
Support hcat and cat
2018-05-02 15:59:46 +02:00
Mike J Innes
7d7d89569c
rm this deprecation for 0.6
2018-05-01 12:20:36 +01:00
Mike J Innes
9a7e6e9c5c
hold off on some things
2018-05-01 12:18:56 +01:00
CarloLucibello
e186b958dd
more exports
2018-05-01 12:13:14 +01:00
Mike J Innes
ee89a7797e
Merge pull request #245 from freeboson/adamax
...
Add AdaMax optimizer
2018-05-01 11:28:07 +01:00
Mike J Innes
5efbaddb97
Merge pull request #249 from ninjin/nin/minimum
...
[RFC] Backpropagation for `maximum` and `minimum`
2018-04-30 18:40:42 +01:00
Mike J Innes
73a51400b6
better error message
2018-04-30 12:09:15 +01:00
Pontus Stenetorp
cfd29b9c76
Backpropagation for maximum and minimum
2018-04-29 13:52:54 +01:00
Sujeet Akula
8c042bd522
element wise max()
2018-04-26 21:12:31 +10:00
Sujeet Akula
5e5f255f81
export typo
2018-04-26 17:42:04 +10:00
Sujeet Akula
4586bda5ab
export/test adamax
2018-04-26 17:40:11 +10:00
Sujeet Akula
b6508e2416
add adamax
2018-04-26 17:37:24 +10:00
Mike J Innes
baff20514d
gpu broadcast fix
2018-04-17 18:05:58 +01:00
Mike J Innes
8f73dc6e14
fix gpu cross entropy
2018-04-17 17:56:47 +01:00
tejank10
2ef25775c6
removed extra expand and fixed bug
2018-04-16 01:18:26 +05:30
Mike Innes
d12fb98f2a
nicer batchnorm shape error
2018-04-15 20:29:25 +01:00
tejank10
2f5473d435
added expand in conv constructor
2018-04-16 00:59:11 +05:30
Mike J Innes
8f29968c32
Merge pull request #207 from safnuk/pull-request/07b0f95d
...
BatchNorm for convolutions
2018-04-15 20:10:33 +01:00
Mike J Innes
683a73fed3
download info
2018-04-15 20:09:30 +01:00
Mike J Innes
5fd240f525
interface tweaks
2018-04-15 20:04:42 +01:00
Mike J Innes
73a0be3e04
Merge branch 'master' into pull-request/07b0f95d
2018-04-15 17:10:29 +01:00
Mike J Innes
642543808e
Merge pull request #226 from CarloLucibello/reshape
...
fix reshape
2018-04-15 16:53:21 +01:00
tejank10
b080f5c82e
Scalar pad and stride
2018-04-15 20:32:40 +05:30
Mike J Innes
cb3ae8df6a
rename normalise.jl
2018-04-15 15:45:46 +01:00
Mike J Innes
b05e755068
rm jit from cuda
2018-04-15 15:08:58 +01:00
tejank10
5cc681317a
added stride for pooling in tracker
2018-04-15 15:07:04 +01:00
tejank10
f6097d58d6
Scalar pad/stride for Conv constructor
2018-04-15 12:15:41 +05:30
Mike Innes
9d7164f15f
we'll do this differently
2018-04-14 02:09:35 +01:00
tejank10
65847bb745
moved epsilon into sqrt
2018-04-04 15:25:20 +05:30
tejank10
3ead662987
Update rule fixed
2018-04-04 15:18:44 +05:30
CarloLucibello
b415333233
fix reshape
2018-04-02 16:09:57 -04:00
tejank10
ea9b5471fa
NADAM optimizer
2018-04-03 01:27:22 +05:30
Brad Safnuk
b9a66c679d
Fix error in initialization of σ.
2018-03-22 22:20:21 -04:00
Brad Safnuk
35299d4621
Fix type instability when loading onto a gpu.
...
Also fixes Issue #216 .
2018-03-22 21:32:32 -04:00
Mike J Innes
4320738d87
fix
2018-03-21 11:25:47 +00:00
Mike Innes
1c5f8e3534
ndims for shapes
2018-03-16 14:42:08 +00:00
Brad Safnuk
db2d9efb72
Update BatchNorm documentation
2018-03-15 21:59:38 -04:00
Brad Safnuk
6653ec86d9
Allow multidimensional inputs to batchnorm.
...
Can be used in conjunction with convolutional layers, in addition
to dense layers, with the same API.
2018-03-15 21:48:59 -04:00
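A hedged usage sketch of the multidimensional case (channel counts made up):
```julia
using Flux

# The same BatchNorm constructor used after Dense layers also follows a Conv
# layer, normalising over the channel dimension (8 here).
m = Chain(Conv((3, 3), 3 => 8, relu), BatchNorm(8))
```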
Mike J Innes
e931552f7d
Merge pull request #200 from chengchingwen/repmat
...
implement `back` of `repmat`
2018-03-15 15:18:48 +00:00
Mike J Innes
5d7edb5aaa
Merge pull request #197 from chengchingwen/master
...
Implement `prod` for `TrackedArray`
2018-03-15 15:17:24 +00:00
boathit
2ec37790be
eliminate ambiguity
2018-03-13 10:50:56 +08:00
boathit
ff2caf032c
eliminate ambiguity
2018-03-12 22:48:16 +08:00
Mike J Innes
9ccbac8b80
jit gpu support
2018-03-07 19:18:27 +00:00