Mike J Innes
ab0763fd41
Merge pull request #428 from tejank10/rnn-fixes
[WIP] Fixes for RNN tests
2018-10-10 16:58:44 +01:00
Tejan Karmali
8987e2c423
rm comments
2018-10-10 11:55:10 -04:00
Tejan Karmali
6b4bbd4fce
reverted the weight changes in rnndesc
2018-10-10 10:29:15 -04:00
Mike J Innes
9f6c3d5a2c
fixes #403
2018-10-10 12:26:03 +01:00
Tejan Karmali
7b3e9c35ad
changed index to view
2018-10-09 12:57:20 -04:00
Mike J Innes
3285afa45a
Merge pull request #409 from harryscholes/patch-2
Correct Custom Gradients docs
2018-10-09 14:09:09 +01:00
harryscholes
61c14afee4
Add usage example of custom gradients
2018-10-09 13:05:38 +01:00
Mike J Innes
5d3cc044cd
Merge pull request #427 from johnnychen94/master
Support copy(::TrackedArray)
2018-10-08 23:36:03 +01:00
JohnnyChen
de7623ac94
use variable assignment to do "copy"
2018-10-09 03:49:17 +08:00
JohnnyChen
eaacec852f
Bug fix
2018-10-09 03:40:02 +08:00
JohnnyChen
27fec15fcc
Add explicit copy(x::TrackedArray) method
2018-10-09 03:34:41 +08:00
Tejan Karmali
4d1a6c305b
fixed params getting zero
2018-10-08 13:59:29 -04:00
JohnnyChen
36f5f274a5
Support copy(::TrackedArray)
1. fix issue https://github.com/FluxML/Flux.jl/issues/416
2. update test code: some previously broken tests now pass
2018-10-09 01:53:32 +08:00
Mike J Innes
73385b5dbd
Merge pull request #372 from johnnychen94/issue-#354
Type restriction for Dense layer
2018-10-05 15:03:03 +01:00
Proyag
3b391a1af6
#389
2018-10-05 14:47:06 +01:00
Mike Innes
c6740c5cdd
fix unbroadcast
2018-10-05 14:14:43 +01:00
Mike J Innes
325d2ce212
Merge pull request #418 from c-p-murphy/add-fashion-mnist
Add FashionMNIST
2018-10-05 14:05:50 +01:00
Mike Innes
61fb6cdf05
jit macro
2018-10-05 14:02:00 +01:00
Mike Innes
69afdd61a6
avoid a warning
2018-10-05 13:59:58 +01:00
Mike J Innes
f08b6f80d2
Merge pull request #422 from tejank10/cudnn_avail
cudnn_available update
2018-10-05 12:05:18 +01:00
Tejan Karmali
2ff54ee0fd
cudnn_available() update
2018-10-04 11:31:29 -04:00
Christopher Murphy
73a526b1de
reuse utils from mnist.jl
2018-10-03 12:40:24 -04:00
Mike J Innes
683bbec71c
Merge pull request #413 from mcabbott/patch-2
evaluate both 2-ary DiffRules only when needed
2018-10-03 12:02:12 +01:00
Mike J Innes
fe6793fde5
closes #411
2018-10-03 11:45:29 +01:00
Mike J Innes
3a7b77d104
Merge pull request #419 from r3tex/master
update utils.jl for 1.0
2018-10-03 11:21:40 +01:00
Robert Luciani
252e34e173
1.0+ updates - indices to axes, Vector init with undef
2018-10-02 21:39:00 +02:00
Christopher Murphy
95d72d7f79
update comments
2018-10-02 15:31:44 -04:00
Christopher Murphy
7e67bf06e1
update tests
2018-10-02 15:00:45 -04:00
Christopher Murphy
aff4c7898e
add FashionMNIST
2018-10-01 15:26:26 -04:00
Michael Abbott
d25e05d9ee
evaluate both 2-ary DiffRules only when needed
2018-09-27 10:40:44 +02:00
JohnnyChen
3bf18347e0
Fix dimensional error in test
2018-09-26 22:03:38 +08:00
JohnnyChen
b20ae0546b
rebase to pass the test
2018-09-26 20:30:13 +08:00
Harry
179a1e8407
Correct Custom Gradients docs
* Fixed an incorrect type signature.
* Also replaced `data(a)` with `a.data`; the syntax may have changed recently. Line 121 may need the same correction.
MWE:
```julia
using Flux
using Flux.Tracker
using Flux.Tracker: forward, TrackedReal, track, @grad
minus(a, b) = a - b
minus(a::TrackedReal, b::TrackedReal) = Tracker.track(minus, a, b)
@grad function minus(a, b)
    return minus(a.data, b.data), Δ -> (Δ, -Δ)
end
a, b = param(2), param(4)
c = minus(a, b) # -2.0 (tracked)
Tracker.back!(c)
Tracker.grad(a) # 1.00
Tracker.grad(b) # -1.00
```
2018-09-21 16:57:54 +01:00
Mike J Innes
02ecca4c61
Merge pull request #405 from harryscholes/patch-1
Fix typo
2018-09-19 17:02:26 +01:00
Harry
079614adb2
Fix typo
2018-09-19 16:45:11 +01:00
Mike J Innes
6367cfd696
Merge pull request #404 from ornithos/add-inv-funcs
add inv/ldivide/rdivide + test
2018-09-19 15:32:49 +01:00
Alex Bird
d131853587
add inv/ldivide/rdivide + test
2018-09-19 13:08:30 +01:00
Mike J Innes
b3a08baf55
Merge pull request #400 from IsaacTay/patch-1
updated loadparams! function
2018-09-17 00:03:07 +01:00
Isaac Tay
e803117e25
updated loadparams! function
2018-09-15 16:45:04 +08:00
Mike J Innes
9d4ee1b3aa
Merge pull request #394 from sambitdash/patch-1
The sample gradient should not use the softdash
2018-09-14 20:24:07 +01:00
Mike J Innes
08fb9b7df1
Merge pull request #397 from FluxML/nest-bcast
Nested Derivatives of Broadcast
2018-09-14 20:23:28 +01:00
Mike Innes
d797999fc5
fix sentiment model
2018-09-14 18:10:24 +01:00
Sambit Kumar Dash
8b9a98ed01
The sample gradient should not use the softdash
While the softdash is a natural, mathematical notation, it is easily confused with the apostrophe used for the LinAlg adjoint. That confusion is unnecessary and not worth it in a first code example.
2018-09-11 18:58:07 +05:30
Mike J Innes
b93d4763cc
Merge pull request #391 from jekbradbury/normalise-1
1.0 compat for `normalise`
2018-09-07 11:01:23 +01:00
James Bradbury
e7783ace12
1.0 compat for normalise
2018-09-06 18:38:11 -07:00
Mike J Innes
6bbed07e96
enable nested broadcast
2018-09-07 02:05:03 +01:00
Johnny Chen
44049ce00c
Merge branch 'master' into issue-#354
2018-09-06 09:39:31 -05:00
Mike J Innes
5e4ee827e9
Merge pull request #371 from johnnychen94/issue-#323
Fix issue #323
2018-09-06 15:28:15 +01:00
Mike J Innes
395a35d137
better headings
2018-09-05 17:03:41 +01:00
Mike J Innes
193c4ded19
make docs on 1.0
2018-09-05 16:52:50 +01:00