Dhairya Gandhi
d8394298bb
fix merge conflicts
2018-10-11 10:15:59 +05:30
Dhairya Gandhi
fe8c147f72
fixed weight decay definition
2018-10-11 10:07:16 +05:30
Mike J Innes
ab0763fd41
Merge pull request #428 from tejank10/rnn-fixes
[WIP] Fixes for RNN tests
2018-10-10 16:58:44 +01:00
Tejan Karmali
8987e2c423
rm comments
2018-10-10 11:55:10 -04:00
Tejan Karmali
6b4bbd4fce
reverted back the weights changes in rnndesc
2018-10-10 10:29:15 -04:00
Mike J Innes
9f6c3d5a2c
fixes #403
2018-10-10 12:26:03 +01:00
Tejan Karmali
7b3e9c35ad
changed index to view
2018-10-09 12:57:20 -04:00
Mike J Innes
3285afa45a
Merge pull request #409 from harryscholes/patch-2
Correct Custom Gradients docs
2018-10-09 14:09:09 +01:00
harryscholes
61c14afee4
Add usage example of custom gradients
2018-10-09 13:05:38 +01:00
Mike J Innes
5d3cc044cd
Merge pull request #427 from johnnychen94/master
Support copy(::TrackedArray)
2018-10-08 23:36:03 +01:00
JohnnyChen
de7623ac94
use variable assignment to do "copy"
2018-10-09 03:49:17 +08:00
JohnnyChen
eaacec852f
Bug fix
2018-10-09 03:40:02 +08:00
JohnnyChen
27fec15fcc
Add explicit copy(x::TrackedArray) method
2018-10-09 03:34:41 +08:00
Tejan Karmali
4d1a6c305b
fixed params getting zero
2018-10-08 13:59:29 -04:00
JohnnyChen
36f5f274a5
Support copy(::TrackedArray)
1. fix issue https://github.com/FluxML/Flux.jl/issues/416
2. change test code to pass the test: some broken tests are not broken now...
2018-10-09 01:53:32 +08:00
Avik Pal
9bd2c4e006
Update curnn.jl
2018-10-06 00:00:46 +05:30
Avik Pal
d56c626725
Merge branch 'master' into cudnn_batchnorm
2018-10-06 00:00:16 +05:30
Mike J Innes
73385b5dbd
Merge pull request #372 from johnnychen94/issue-#354
Type restriction for Dense layer
2018-10-05 15:03:03 +01:00
Proyag
3b391a1af6
#389
2018-10-05 14:47:06 +01:00
Mike Innes
c6740c5cdd
fix unbroadcast
2018-10-05 14:14:43 +01:00
Mike J Innes
325d2ce212
Merge pull request #418 from c-p-murphy/add-fashion-mnist
Add FashionMNIST
2018-10-05 14:05:50 +01:00
Mike Innes
61fb6cdf05
jit macro
2018-10-05 14:02:00 +01:00
Mike Innes
69afdd61a6
avoid a warning
2018-10-05 13:59:58 +01:00
Mike Innes
bfe85e65f1
compose tweaks
2018-10-05 13:52:26 +01:00
Mike Innes
0f2019eba5
compose tweaks
2018-10-05 12:57:03 +01:00
Mike Innes
9bc9771a8d
tweaks
2018-10-05 12:43:03 +01:00
Mike Innes
4abe518599
newline fixes
2018-10-05 12:37:47 +01:00
Mike J Innes
f08b6f80d2
Merge pull request #422 from tejank10/cudnn_avail
cudnn_available update
2018-10-05 12:05:18 +01:00
Tejan Karmali
2ff54ee0fd
cudnn_available() update
2018-10-04 11:31:29 -04:00
Christopher Murphy
73a526b1de
reuse utils from mnist.jl
2018-10-03 12:40:24 -04:00
Mike J Innes
683bbec71c
Merge pull request #413 from mcabbott/patch-2
evaluate both 2-ary DiffRules only when needed
2018-10-03 12:02:12 +01:00
Mike J Innes
fe6793fde5
closes #411
2018-10-03 11:45:29 +01:00
Mike J Innes
3a7b77d104
Merge pull request #419 from r3tex/master
update utils.jl for 1.0
2018-10-03 11:21:40 +01:00
Robert Luciani
252e34e173
1.0+ updates - indices to axes, Vector init with undef
2018-10-02 21:39:00 +02:00
Christopher Murphy
95d72d7f79
update comments
2018-10-02 15:31:44 -04:00
Christopher Murphy
7e67bf06e1
update tests
2018-10-02 15:00:45 -04:00
Christopher Murphy
aff4c7898e
add FashionMNIST
2018-10-01 15:26:26 -04:00
Avik Pal
f3e39a1e55
Merge branch 'master' of https://github.com/FluxML/Flux.jl
2018-10-01 09:50:30 +05:30
Dhairya Gandhi
b661db3797
added deprecations and compose
2018-10-01 05:30:53 +05:30
Michael Abbott
d25e05d9ee
evaluate both 2-ary DiffRules only when needed
2018-09-27 10:40:44 +02:00
JohnnyChen
3bf18347e0
Fix dimensional error in test
2018-09-26 22:03:38 +08:00
JohnnyChen
b20ae0546b
rebase to pass the test
2018-09-26 20:30:13 +08:00
Harry
179a1e8407
Correct Custom Gradients docs
* Fixed a type signature that was incorrect.
* Also, replaced `data(a)` with `a.data`. Don't know if the syntax has changed (recently). This may also need to be corrected in line 121.
MWE:
```julia
using Flux
using Flux.Tracker
using Flux.Tracker: forward, TrackedReal, track, @grad
minus(a, b) = a - b
minus(a::TrackedReal, b::TrackedReal) = Tracker.track(minus, a, b)
@grad function minus(a, b)
    return minus(a.data, b.data), Δ -> (Δ, -Δ)
end
a, b = param(2), param(4)
c = minus(a, b) # -2.0 (tracked)
Tracker.back!(c)
Tracker.grad(a) # 1.00
Tracker.grad(b) # -1.00
```
2018-09-21 16:57:54 +01:00
Mike J Innes
02ecca4c61
Merge pull request #405 from harryscholes/patch-1
Fix typo
2018-09-19 17:02:26 +01:00
Harry
079614adb2
Fix typo
2018-09-19 16:45:11 +01:00
Mike J Innes
6367cfd696
Merge pull request #404 from ornithos/add-inv-funcs
add inv/ldivide/rdivide + test
2018-09-19 15:32:49 +01:00
Alex Bird
d131853587
add inv/ldivide/rdivide + test
2018-09-19 13:08:30 +01:00
Mike J Innes
b3a08baf55
Merge pull request #400 from IsaacTay/patch-1
updated loadparams! function
2018-09-17 00:03:07 +01:00
Dhairya Gandhi
87c7e65a2d
fixed Compose test
2018-09-16 17:45:29 +05:30
Dhairya Gandhi
6665189ff1
added remaining optimizers and tests
2018-09-16 17:34:51 +05:30