Commit Graph

1380 Commits

Author SHA1 Message Date
Lyndon White
838047f708 fix docs 2019-03-18 12:19:44 +00:00
Kristoffer Carlsson
b84a60e74e Update src/layers/basic.jl
Co-Authored-By: oxinabox <oxinabox@ucc.asn.au>
2019-03-18 12:19:44 +00:00
Lyndon White
fcc3ec471a Add MaxOut layer 2019-03-18 12:19:44 +00:00
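A maxout layer, as added in this commit, takes the element-wise maximum over several affine maps of the same input (Goodfellow et al., 2013). Below is a minimal sketch of the idea using a hypothetical MaxoutSketch type, not Flux's actual constructor:

```julia
using Flux

# Element-wise maximum over several Dense maps of the same input.
struct MaxoutSketch{T<:Tuple}
    over::T
end
MaxoutSketch(fs...) = MaxoutSketch(fs)

(m::MaxoutSketch)(x) = mapreduce(f -> f(x), (a, b) -> max.(a, b), m.over)

m = MaxoutSketch(Dense(10, 5), Dense(10, 5), Dense(10, 5))
m(rand(Float32, 10))   # 5-element output: element-wise max over the three Dense maps
```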
Lyndon White
79de829fdc move Dense's overloads to be near its defn 2019-03-18 12:18:14 +00:00
Joshua Whittemore
f061df3d23 resolves pull request #652 merge conflicts 2019-03-09 12:51:20 -08:00
Mike J Innes
b348e31f07
Merge pull request #667 from FluxML/donottrack
rm Tracker
2019-03-08 11:38:37 +00:00
Josh Whittemore
930ebaf217 Add module to make iris dataset available. 2019-03-07 16:56:23 -08:00
Manjunath Bhat
c6e51f5cc2 Made lambda and alpha of eltype(x) 2019-03-07 23:42:38 +05:30
Manjunath Bhat
47c1324476 Merge branch 'master' into patch-3 2019-03-07 23:08:40 +05:30
Manjunath Bhat
1d310d4532 Removed {typeof(p)} 2019-03-07 21:55:26 +05:30
thebhatman
f4543b7adf Value of alpha updated and dot operations changed 2019-03-08 03:21:26 +05:30
David Pollack
7b9b64f1cb change IN to in 2019-03-07 09:46:44 +01:00
David Pollack
83b4b3a714 changes based on PR comments 2019-03-07 09:46:44 +01:00
David Pollack
c41f891005 changes based on the improved batchnorm in PR#633 2019-03-07 09:46:44 +01:00
David Pollack
129a708b6f instance normalization 2019-03-07 09:46:44 +01:00
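A usage sketch for the instance-normalisation layer introduced here, assuming it is exported as InstanceNorm (as in later Flux releases); the sizes are arbitrary. Unlike BatchNorm, statistics are computed per sample and per channel rather than across the whole batch.

```julia
using Flux

x = rand(Float32, 28, 28, 3, 4)   # WHCN batch: four 28×28 images with 3 channels
layer = InstanceNorm(3)           # 3 = number of channels
size(layer(x))                    # (28, 28, 3, 4) — the shape is preserved
```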
Mike J Innes
b5a148fa37 rm Tracker 2019-03-07 01:33:02 +00:00
Mike J Innes
3a4c6274fa
Merge pull request #651 from FluxML/mji/dogfood
Refactor training loop
2019-03-06 16:53:24 +00:00
Mike J Innes
fc6232b779
Merge pull request #633 from Sklan/patch-3
Improving BatchNorm
2019-03-06 16:23:03 +00:00
thebhatman
8e5965ac41 Indentation fixed 2019-03-05 16:28:05 +05:30
thebhatman
d6608682fc Suggested changes made 2019-03-05 16:18:50 +05:30
Manjunath Bhat
29b853e0bb Made sure Gradients are not lost. 2019-03-04 22:17:19 +05:30
Manjunath Bhat
b5533ee00b Exported AlphaDropout 2019-03-04 01:09:05 +05:30
Manjunath Bhat
97f874abcf Added AlphaDropout which is used in SNNs. 2019-03-04 01:05:46 +05:30
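A sketch of the alpha-dropout idea from the SELU paper (Klambauer et al., 2017) that this commit brings in: dropped activations are set to the SELU saturation value rather than zero, and the output is rescaled to preserve mean and variance. This is an illustration of the technique, not Flux's exact implementation.

```julia
function alpha_dropout_sketch(x, p)
    λ, α = 1.0507f0, 1.6733f0               # SELU constants
    αp = -λ * α                              # value assigned to dropped units
    keep = rand(Float32, size(x)...) .> p    # keep mask with probability 1 - p
    y = ifelse.(keep, x, αp)
    a = inv(sqrt((1 - p) * (1 + p * αp^2)))  # affine correction preserving mean/variance
    b = -a * αp * p
    return a .* y .+ b
end

alpha_dropout_sketch(randn(Float32, 4, 3), 0.2f0)
```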
Manjunath Bhat
704be49483
Added support for Float64 for DepthwiseConv
DepthwiseConv was giving errors for Float64. This fixes the issue.
2019-03-01 15:04:05 +05:30
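A minimal example of the case this commit addresses, written against the in => out constructor form of recent Flux versions (the exact channel-pair semantics at the time may differ); sizes are arbitrary and the input layout is WHCN.

```julia
using Flux

layer = DepthwiseConv((3, 3), 3 => 6)   # 3×3 filters; 3 => 6 channels (in => out form)
x = rand(Float64, 28, 28, 3, 4)         # Float64 batch that previously raised an error
size(layer(x))                          # (26, 26, 6, 4) with default stride and no padding
```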
Mike Innes
4cf43c0c41 simpler/nicer training loop 2019-02-28 14:58:42 +00:00
Mike Innes
cd091ad005 in place implicit gradients 2019-02-28 14:08:01 +00:00
Mike Innes
8b4bc7cc52 organise params 2019-02-28 13:44:54 +00:00
Dhairya Gandhi
6825639f79 mapreduce for onehotmatrix 2019-02-28 09:17:18 +05:30
Rohith Pentaparthy
1b1dff1266 Added an example of Conv to Flux.jl/src/layers/conv.jl, and clarified what WHCN means 2019-02-23 14:31:27 -06:00
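For reference, the WHCN convention this commit documents: Conv expects data ordered as (Width, Height, Channels, Number of samples), so a single 100×100 RGB image needs a trailing batch dimension of 1. The filter and channel counts below are arbitrary example values.

```julia
using Flux

img = rand(Float32, 100, 100, 3)     # one RGB image, no batch dimension yet
x = reshape(img, 100, 100, 3, 1)     # WHCN layout: width, height, channels, batch
layer = Conv((5, 5), 3 => 7, relu)   # 5×5 filters, 3 input → 7 output channels
size(layer(x))                       # (96, 96, 7, 1)
```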
Sklan
7463f09591 Update normalise.jl 2019-02-21 23:56:19 +05:30
Sklan
6044421c5c Update normalise.jl 2019-02-20 13:47:31 +05:30
pshashk
b0a5844afb Remove dims=1 from normalise (#619)
* remove `dims=1`

* add dims arg

* fix test

* remove dims=1 only from deprecated version
2019-02-11 16:11:47 +00:00
Dhairya Gandhi
2ec35861b5 removing non-allocating functions and tests 2019-02-11 21:22:32 +05:30
Dhairya Gandhi
35cd9761a8 adding tests 2019-02-09 22:32:02 +05:30
pshashk
b074b2491a fix docstring 2019-02-08 21:49:53 +03:00
pshashk
c3e04392d8 drop dims type restriction 2019-02-08 16:15:37 +03:00
pshashk
911c901294 dims kwarg 2019-02-08 16:00:32 +03:00
pshashk
368c29e5e3
Add corrected argument to std
Fixes ffe037c485/src/layers/stateless.jl (L49)
2019-02-08 15:23:27 +03:00
Mike J Innes
ffe037c485
Merge pull request #603 from FluxML/kf/namedtupletree
Treat NamedTuple like Tuple for treelike purposes
2019-02-08 11:06:12 +00:00
Mike J Innes
601e2d8ae0
Merge pull request #586 from KristofferC/kc/batchnorm
work around extreme slowdown in BatchNorm due to a Julia performance bug in broadcast fusion
2019-02-08 11:00:33 +00:00
Mike J Innes
fe712bf338
Merge pull request #596 from IvanYashchuk/ivan/topic-issue-542
Fixed issue #542.
2019-02-08 10:38:23 +00:00
Ivan Yashchuk
e00ac88016 Added tracking of logdet and logabsdet. Added gradtests. 2019-02-08 09:55:33 +02:00
Keno Fischer
1e452a3042 Treat NamedTuple like Tuple for treelike purposes 2019-02-06 11:11:00 -05:00
KristofferC
9914c531f6 work around extreme slowdown due to a Julia performance bug 2019-02-06 16:19:29 +01:00
Mike J Innes
ecc55ec9e1 Revert "Fix OneHotVector/Matrix performance on GPU" 2019-02-06 14:31:15 +00:00
Mike J Innes
e8b2ec6f67
Merge pull request #311 from tejank10/conv_transpose
2D Conv transpose support
2019-02-06 14:14:14 +00:00
Moksh Jain
046f7b4eae fix std arguments in normalise 2019-02-05 18:36:04 +05:30
Ivan Yashchuk
f790fff59a Use other definition for grad(det(A)). 2019-02-05 14:36:28 +02:00
Moksh Jain
c6409d7686 add support for n-dimensional input to normalise layer 2019-02-05 17:09:22 +05:30
Ivan Yashchuk
aa64d2157d Fixed issue #542.
Added tracking of LinearAlgebra.det and its grad method.
2019-02-05 11:38:27 +02:00
Mike J Innes
940b1e6dbf
Merge pull request #587 from KristofferC/patch-2
use uncorrected standard deviation in normalise
2019-02-04 14:35:25 +00:00
Mike J Innes
7fc920240d
Merge pull request #591 from dhairyagandhi96/onehot
Fix OneHotVector/Matrix performance on GPU
2019-02-04 13:53:55 +00:00
Mike J Innes
17f33b4a6a
Merge pull request #583 from KristofferC/kc/small_fixes
clarify docs on single batch image to conv
2019-02-04 12:33:34 +00:00
Mike J Innes
e774053126
Merge pull request #590 from oxinabox/patch-2
Default to zeroed initial state for all RNNs
2019-02-04 12:28:38 +00:00
Mike J Innes
329c8f8f95
Merge pull request #585 from KristofferC/kc/verify_download
add hash verification to datasets
2019-02-04 11:20:53 +00:00
Mike J Innes
cfe6859186 auto-collect in forward 2019-02-04 10:37:02 +00:00
Mike J Innes
838070968e vcat with scalars 2019-02-04 00:05:16 +00:00
Dhairya Gandhi
30aa814c4d fixes #582 2019-02-03 18:43:16 +05:30
Dhairya Gandhi
e243950e28 comment fix 2019-02-03 04:00:08 +05:30
Dhairya Gandhi
bd6158d7f9 onehotvector/matrix behaviour 2019-02-03 03:57:41 +05:30
Lyndon White
26550dacda Default to zeroed initial state 2019-02-02 20:01:28 +00:00
Tejan Karmali
84eabcd2ae fixed DepthwiseConv dilation 2019-02-02 12:19:35 +05:30
Tejan Karmali
e54df2de06 Merge branch 'master' into conv_transpose 2019-02-02 10:20:45 +05:30
Kristoffer Carlsson
fd0f1c7a82
use uncorrected standard deviation in normalise
fixes https://github.com/FluxML/Flux.jl/issues/529
2019-01-30 17:42:19 +01:00
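The distinction behind this change: Statistics.std applies Bessel's correction (dividing by n - 1) by default, while the fix makes normalise divide by n instead. Below is a minimal stand-in for the layer, not Flux's exact code.

```julia
using Statistics

function normalise_sketch(x; dims = 1)
    μ = mean(x, dims = dims)
    σ = std(x, dims = dims, mean = μ, corrected = false)   # divide by n, not n - 1
    return (x .- μ) ./ σ
end

normalise_sketch(randn(Float32, 5, 3))   # each column gets mean ≈ 0, population std ≈ 1
```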
Kristoffer Carlsson
f60079d07c add hash verification to datasets 2019-01-30 13:11:26 +01:00
Mike J Innes
0469394715
Merge pull request #576 from mcabbott/patch-1
PermutedDimsArray
2019-01-29 14:55:55 +00:00
Mike J Innes
9e553adbf7 add hessian 2019-01-29 08:37:30 +00:00
Michael Abbott
031d1b3d57
PermutedDimsArray like permutedims
e.g. PermutedDimsArray(rand(2,3) |> param, (2,1))
2019-01-28 18:15:32 +01:00
Mike J Innes
0f8a4a48c6 extend update! with an optimiser 2019-01-28 14:10:09 +00:00
Mike J Innes
0f2975d905 update -> apply 2019-01-28 13:59:23 +00:00
Mike J Innes
bf0b5c5cef
Merge pull request #535 from asbisen/master
fixed stack/unstack functions in utils.jl for v1.0
2019-01-28 12:23:07 +00:00
Mike Innes
af8fdcc7af fix #573 2019-01-28 10:54:58 +00:00
Mike J Innes
013b421b08
Merge pull request #570 from avik-pal/ap/batchnorm_fixes
Patches for default initializers
2019-01-28 10:40:55 +00:00
Mike Innes
1c3a63c42f fixes #574 2019-01-28 10:11:07 +00:00
Mike J Innes
58ac415f6b forward mode 2019-01-25 16:14:24 +00:00
Mike J Innes
2b1a3e92da mapparams 2019-01-25 10:11:46 +00:00
Mike J Innes
791939709b numeric precision utilities 2019-01-25 10:06:37 +00:00
Mike J Innes
1cf37ab9eb rm some old deprecations 2019-01-25 09:54:32 +00:00
Avik Pal
733879681e Change initializer to glorot_uniform 2019-01-24 18:48:30 +05:30
Avik Pal
bb72c528e1 Change initializers to Float32 2019-01-24 18:43:39 +05:30
Mike Innes
ca1c73ed35 fixup 2019-01-24 11:15:57 +00:00
Kristoffer Carlsson
325e3a4f70 clarify docs on single batch image to conv
fixes #309
2019-01-24 11:24:10 +01:00
Mike J Innes
62d780c77f onecold fix 2019-01-24 10:16:41 +00:00
Dhairya Gandhi
4be08fe194 remove debug statement 2019-01-22 17:29:12 +05:30
Mike J Innes
152ce4a164 conversions for dual numbers 2019-01-22 10:07:42 +00:00
Mike J Innes
496dbfabd2 make chain collectable 2019-01-22 00:31:55 +00:00
Mike J Innes
f6397e7358
Merge pull request #517 from FluxML/fix_adamw
Fix decay argument in ADAMW
2019-01-18 10:06:23 +00:00
Mike J Innes
058b4dc7fb
Merge pull request #557 from dhairyagandhi96/dg/transpose
fix transpose/ adjoint gradient
2019-01-16 15:46:44 +00:00
Mike J Innes
9d56807bcd cuarrays version check 2019-01-15 11:43:57 -05:00
Dhairya Gandhi
0060cc3453 fixes transpose/ adjoint gradient 2019-01-15 21:59:32 +05:30
Mike J Innes
a3e0de1ee5 fixes #516 2019-01-15 15:49:18 +00:00
Mike J Innes
67d9016319
Merge pull request #538 from KristofferC/kc/promote
fix promotion by avoiding integer division in mse and crossentropy
2019-01-15 13:20:46 +00:00
Kristoffer Carlsson
c74aa67c5d fix promotion by avoiding integer division in mse and crossentropy
oops

add tests
2019-01-15 14:15:05 +01:00
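The pattern adopted by this fix is to scale the summed loss by the rational 1 // n instead of dividing by the integer n, so the result keeps the floating-point type of its inputs. The definitions below are simplified stand-ins for Flux's mse and crossentropy, not the library code.

```julia
ŷ, y = rand(Float32, 10), rand(Float32, 10)

mse_sketch(ŷ, y) = sum((ŷ .- y) .^ 2) * 1 // length(y)
crossentropy_sketch(ŷ, y) = -sum(y .* log.(ŷ)) * 1 // length(y)

typeof(mse_sketch(ŷ, y))   # Float32 — multiplying by a Rational{Int} does not promote
```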
Mike J Innes
827a7b8ed5
Merge pull request #546 from ChrisRackauckas/random
Support random numbers as constants
2019-01-11 10:06:54 +00:00
Mike J Innes
aa1b4f410f simplify 2019-01-11 10:06:14 +00:00
Christopher Rackauckas
f6faa10ee2 remove non-type dispatches 2019-01-10 08:57:10 -08:00
Mike J Innes
f0d5624ed2
Merge pull request #493 from dhairyagandhi96/master
[WIP] New Optimiser Docs
2019-01-10 11:10:38 +00:00
Dhairya Gandhi
4291c1a833 pull master 2019-01-10 16:35:57 +05:30
Mike J Innes
e6f925f977 train docstring simplification 2019-01-10 11:05:21 +00:00
Dhairya Gandhi
f00e1cdedf [docs] replace :stop with Flux.stop() 2019-01-10 16:34:07 +05:30