Michael Abbott
806e0c5c57
line
2019-09-25 15:20:13 +02:00
Michael Abbott
4245d9acad
eg
2019-09-25 15:18:40 +02:00
Michael Abbott
2de84ce79f
simplify
2019-09-25 13:59:32 +02:00
Michael Abbott
1a1a96571a
+Chain
2019-09-25 13:47:29 +02:00
Michael Abbott
19830c71b1
fix printing of SkipConnection
2019-09-25 13:37:01 +02:00
bors[bot]
acb6a89245
Merge #865
...
865: Functor r=MikeInnes a=MikeInnes
This refactors our current `@treelike` infrastructure. It somewhat formalises what we're doing around the idea of a Flux model as a functor, i.e. something that can be mapped over.
This is much more flexible than what we had before, and avoids some issues. It allows layers to have state that isn't mappable; it allows for dispatch when walking the tree, which means layers like `BatchNorm` can have non-trainable parameters; and it also allows for zipped mapping like `fmap(+, xs, ys)`, which isn't implemented yet but will be useful for the new optimisers work.
The main downside is that the term `functor` has been previously used in the Julia community as a malapropism for "thing that behaves like a function"; but hopefully this can start to reduce that usage.
Co-authored-by: Mike Innes <mike.j.innes@gmail.com>
2019-09-24 16:36:10 +00:00
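For context, a minimal sketch of the mapping idea this PR describes, assuming the `@functor` macro and `fmap` that the refactor introduces (the `Affine` layer here is hypothetical):

```julia
using Flux

# A hypothetical layer with two parameter arrays.
struct Affine
  W
  b
end
(a::Affine)(x) = a.W * x .+ a.b

Flux.@functor Affine   # declare Affine's fields as mappable

m = Affine(randn(2, 3), zeros(2))
m32 = Flux.fmap(x -> Float32.(x), m)   # walk the model tree, mapping over each array
```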
Dhairya Gandhi
822288d63d
merge conflicts
2019-09-24 00:31:44 +05:30
Dhairya Gandhi
6846551f57
fix cuda init
2019-09-22 22:02:05 +05:30
Mike Innes
b60df53ba1
pkg up
2019-09-19 18:33:33 +01:00
Mike Innes
cabb81e30b
internal rename
2019-09-19 15:53:31 +01:00
Mike Innes
b951377426
fix normalisation layer params
2019-09-19 15:33:24 +01:00
Mike Innes
6529dbcbe6
functor refactor
2019-09-19 15:22:11 +01:00
Mike Innes
2c71fc282b
rename functor.jl
2019-09-19 14:15:28 +01:00
Mike Innes
c5e56b7e04
move setweights and copy_transpose
2019-09-17 17:22:35 +01:00
Mike Innes
5baebf48f4
Merge branch 'master' into tb/cuarrays_dnn
2019-09-17 16:17:09 +01:00
Mike Innes
368b1f53b4
tuple support
2019-09-17 15:49:39 +01:00
Mike Innes
b348b20452
cudnn rnns + implicit gradients
2019-09-17 15:41:42 +01:00
Mike Innes
fe57215b7e
test fillarray gradients
2019-09-17 15:21:03 +01:00
Tim Besard
4942d7fcfd
Move functionality over to CuArrays.
2019-09-13 08:21:45 +02:00
Tim Besard
1e7ff4f65d
Query the worksize.
2019-09-13 08:04:05 +02:00
Tim Besard
04fce70019
Move low-level CUDNN wrappers to CuArrays.
2019-09-13 08:04:05 +02:00
Mike Innes
de2049450b
docs mostly fixed
2019-09-10 15:17:07 +01:00
Mike Innes
c8d460ff84
doctests passing
2019-09-10 15:02:43 +01:00
Mike J Innes
67c38b3099
Merge branch 'master' into zygote
2019-09-06 15:18:58 +01:00
thebhatman
ecc9ce9d64
Gradient on AlphaDropout now working
2019-09-06 16:34:19 +05:30
Mike J Innes
3c1ac84676
Merge pull request #842 from baggepinnen/patch-4
...
Add RADAM optimizer
2019-09-02 14:36:40 +01:00
Manjunath Bhat
c3cc4bf966
Remove double docstring
2019-08-31 01:35:40 +05:30
thebhatman
2f1a187665
Update AlphaDropout
2019-08-31 01:28:58 +05:30
Fredrik Bagge Carlson
cb3bfd72f3
Export RADAM from Optimise
2019-08-29 07:46:45 +08:00
Mike J Innes
9cd97f06f7
define has_cuarrays when no cuda
2019-08-27 15:06:04 +01:00
Tim Besard
4fef9d8508
Don't depend on unreleased CuArrays.
2019-08-27 09:40:22 +02:00
Tim Besard
6ad3cdd138
Replace Requires with direct CuArrays dependency.
2019-08-27 09:33:15 +02:00
janEbert
dec1b37e8e
Merge remote-tracking branch 'origin/master' into HEAD
2019-08-24 12:23:10 +02:00
janEbert
978d7bf195
Fix CuArrays.libcudnn imports
2019-08-24 02:21:54 +02:00
Mike Innes
487000ac31
fix cuda code and tests
2019-08-19 16:56:48 +01:00
Mike Innes
6c67404398
update cleanup
2019-08-19 15:44:51 +01:00
Mike Innes
447fd9d604
conv docstring formatting
2019-08-19 15:30:59 +01:00
Mike Innes
2f7ad895aa
test cleanups
2019-08-19 15:22:50 +01:00
Mike Innes
9590aa63e3
rm last uses of param/data
2019-08-19 15:14:42 +01:00
thebhatman
a76e4d128b
Remove param from crosscor
2019-08-19 19:19:53 +05:30
Manjunath Bhat
8456b7ba45
Remove param from groupnorm
2019-08-19 19:16:21 +05:30
Mike Innes
3ecca436e4
formatting fix
2019-08-19 14:42:07 +01:00
Mike Innes
49044dff7c
avoid adjoint on abstract type
2019-08-19 14:39:09 +01:00
Mike Innes
b8fabad337
deprecate param/data
2019-08-19 14:35:48 +01:00
Fredrik Bagge Carlson
3287cf23db
Add RADAM export
2019-08-19 13:07:39 +08:00
Fredrik Bagge Carlson
ebbad0d135
Add RADAM optimizer
2019-08-19 12:22:32 +08:00
Miguel Madrid Mencía
14affbc91b
Use CuArrays.ones instead of cuones, which is deprecated
2019-08-11 13:38:44 +02:00
Mike J Innes
7c111e7cde
fixes #645
...
fixes #831
2019-08-09 13:53:11 +01:00
Moelf
4d00957b36
Fix CuArray zeros deprecation
2019-08-06 22:23:21 +02:00
Christopher Rackauckas
ed12d4e7c0
Momentum doesn't need params
2019-07-31 17:56:51 -04:00
Mike J Innes
f3551da5a2
dropout printing
2019-07-24 11:20:39 -04:00
thebhatman
faac0ff08b
Updated InstanceNorm and GroupNorm to avoid mutation
2019-07-18 16:13:58 +05:30
Manjunath Bhat
b779d43aca
replaced trunc Int with div
2019-07-16 17:52:55 +05:30
thebhatman
2816fbb9b2
Fix for getindex error in BatchNorm
2019-07-12 22:19:41 +05:30
Mike Innes
a140c31f72
fix batchnorm
2019-07-12 16:09:42 +01:00
Mike Innes
1fc584102d
fix dropout
2019-07-12 15:38:28 +01:00
Mike Innes
e2bf46b7fd
gpu test fixes
2019-07-12 14:52:01 +01:00
Mike Innes
33c8d84a60
cuparam -> cuarray
2019-07-11 14:14:56 +01:00
Manjunath Bhat
11c9a8450c
Remove active from GroupNorm
2019-07-11 18:40:48 +05:30
Mike Innes
c2cd7dab91
re-export gradient
2019-07-11 13:55:12 +01:00
DrChainsaw
16d5f2bc24
Add x to seen in prefor to avoid infinite recursion if passed something self-referential
2019-07-08 23:11:35 +02:00
thebhatman
cf5bc801d3
Check for nothing in update step
2019-07-08 19:22:23 +05:30
thebhatman
8d78b437ff
Merge branch 'sf/zygote_updated' of https://github.com/thebhatman/Flux.jl
2019-07-08 18:47:17 +05:30
thebhatman
812541f8d6
zeros replaced by fill to avoid nothing grad
2019-07-06 19:41:03 +05:30
thebhatman
3ee2a76f61
Removed .data from LSTMCell
2019-07-02 17:38:30 +05:30
thebhatman
b194e7e3a8
Callback being called now
2019-06-20 00:37:54 +05:30
Dhairya Gandhi
dd9cdbef14
remove unnecessary call to beta
2019-06-16 19:09:50 +05:30
Dhairya Gandhi
67f18663d9
pick beta from state in NADAM
2019-06-16 19:06:59 +05:30
thebhatman
7ab9d8ed3d
Minor update
2019-06-13 18:59:03 +05:30
thebhatman
ce11804dc1
CrossCor test passing, hopefully.
2019-06-13 01:21:58 +05:30
thebhatman
48ed93cdaa
Silly error in Dropout corrected.
2019-06-12 23:16:15 +05:30
thebhatman
e9797408ec
DepthwiseConv corrected again.
2019-06-12 23:01:51 +05:30
thebhatman
00a4f4c26d
Correcting Dropout
2019-06-12 22:39:30 +05:30
thebhatman
bd7e3b1f41
Dropout with dims test passing.
2019-06-12 22:16:11 +05:30
thebhatman
c7c0ee2cbc
Resolving Merge Conflicts
2019-06-12 21:34:42 +05:30
thebhatman
dfd2965e85
GroupNorm tests corrected
2019-06-11 22:32:54 +05:30
thebhatman
11073dcd25
GroupNorm made to use istraining()
2019-06-11 22:04:33 +05:30
thebhatman
ef63f80644
No ops defined for param and data
2019-06-10 18:24:18 +05:30
Mike J Innes
b98075817c
Merge branch 'master' into DenseBlock
2019-06-05 14:27:47 +01:00
ayush-1506
2161163a82
added crosscor
2019-05-14 02:52:28 -07:00
Bruno Hebling Vieira
796a2957c9
Added news and removed type annotation from SkipConnection structure
2019-05-13 16:33:31 -03:00
Bruno Hebling Vieira
e7d76b8423
Added the SkipConnection layer and constructor
...
Added missing export
Corrected channel placement
Dimension 4 cannot be assumed to always be the Channel dimension
Deprecation of `treelike`
Code now makes use of the `@treelike` macro instead of the deprecated `treelike` function (it worked on my end because I'm on Julia 0.7, while Julia 1.0 deprecated stuff)
Update basic.jl
Renaming to SkipConnection
* Update Flux.jl
* Update basic.jl
Updated `SkipConnection` with a `connection` field
I'm pretty sure I broke something now, but this PR should follow along these lines: `cat` needs special treatment (the user can declare their own `concatenate` connection, but I foresee it's going to be used often, so we can simply define special treatment)
Forgot to remove some rebasing text
Forgot to remove some more rebasing text
Removed local copy and default cat method from the function calls
Adjusted some more types for inference, could improve on this as well
Re-placed some left-over spaces
2019-05-13 16:32:00 -03:00
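A brief usage sketch of the layer as described, assuming the final `SkipConnection(layers, connection)` form in which the connection combines the block's output with its input:

```julia
using Flux

# Identity-style skip: combine the block's output with its input via +.
m = SkipConnection(Dense(10, 10, relu), +)
x = rand(Float32, 10, 4)
y = m(x)   # the same as relu.(W*x .+ b) + x

# The cat-style connection discussed above can be passed explicitly:
c = SkipConnection(Conv((3, 3), 16 => 16, pad = 1), (mx, x) -> cat(mx, x; dims = 3))
```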
bors[bot]
68ba6e4e2f
Merge #563
...
563: noise shape for dropout r=MikeInnes a=chengchingwen
I add the noise shape for dropout, similar to the `noise_shape` argument in [`tf.nn.dropout`](https://www.tensorflow.org/api_docs/python/tf/nn/dropout)
Co-authored-by: chengchingwen <adgjl5645@hotmail.com>
Co-authored-by: Peter <adgjl5645@hotmail.com>
2019-05-13 17:16:10 +00:00
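A short sketch of the resulting API, assuming the keyword form of the `dims` argument this PR adds to `Dropout`; the mask is shared along the remaining dimensions, analogous to `noise_shape`:

```julia
using Flux

# With dims = 1 the dropout mask varies along dimension 1 and is shared
# along dimension 2, so a dropped feature is dropped for the whole batch.
d = Dropout(0.5; dims = 1)
x = rand(Float32, 4, 10)
y = d(x)   # during training, entire rows of x are zeroed together
```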
chengchingwen
2fc2a5282c
Merge remote-tracking branch 'upstream/master' into drop_shape
2019-05-14 00:50:59 +08:00
Elliot Saba
2e6561bb6a
Change DepthwiseConv() to use in=>out instead of in=>mult.
...
This is an API change, but I think it makes more sense, and is more consistent with our `Conv()` api.
2019-05-12 11:20:24 -07:00
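A quick sketch of the new API (sizes hypothetical; `out` must be a multiple of `in`):

```julia
using Flux

# Previously 8 => 2 meant "8 input channels, channel multiplier 2"; now
# in => out gives the output channels directly, consistent with Conv,
# so 8 => 16 expresses the same multiplier-2 network.
layer = DepthwiseConv((3, 3), 8 => 16)
x = rand(Float32, 32, 32, 8, 1)   # WHCN layout
y = layer(x)                      # 30×30×16×1 with no padding
```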
chengchingwen
5c5140683c
make dims as field of Dropout
2019-05-10 23:45:50 +08:00
Mike J Innes
92ddc618f8
update for arrays
2019-05-02 18:57:52 -07:00
Mike J Innes
c70276ddfe
rm more deprecations
2019-05-02 18:57:52 -07:00
Mike J Innes
256695262c
rm optimiser deprecations
2019-05-02 18:54:01 -07:00
Mike J Innes
82ee61f5be
implement #643
2019-05-02 18:52:09 -07:00
Mike J Innes
c313be8e95
rm data/param
2019-05-02 18:52:09 -07:00
Mike J Innes
aa4d221f8c
break all the things
2019-05-02 18:50:52 -07:00
Avik Pal
a0be6fa837
Add missing activation function for batchnorm
2019-05-01 19:47:54 +05:30
Dhairya Gandhi
221670a2b1
Merge pull request #733 from thebhatman/expdecay-fix
...
Fixed ExpDecay
2019-05-01 18:58:37 +05:30
Dhairya Gandhi
9bbbd17e4b
Merge branch 'master' into onecold
2019-04-30 19:09:36 +05:30
Roger-luo
d63338c242
fix doctest
2019-04-26 18:12:14 +08:00
Mike J Innes
6c3a939133
Update src/onehot.jl
...
Co-Authored-By: Roger-luo <hiroger@qq.com>
2019-04-26 18:09:14 +08:00
Roger-luo
fabcd05ff2
add examples
2019-04-26 18:05:03 +08:00
Elliot Saba
732f97fe16
Split out conv_transpose_dims() so that Zygote can ignore it
2019-04-25 10:24:19 -07:00
Elliot Saba
6e22cd4931
Add asymmetric padding to convolutional layers
2019-04-25 09:55:23 -07:00
Elliot Saba
113ddc8760
Update Flux code for new NNlib branch
2019-04-25 09:55:23 -07:00
Hossein Pourbozorg
7f06b15f67
use https instead of http for web links
2019-04-25 11:04:03 +00:00
Jake Topping
ff7adda74b
Swap comma for full stop
...
"ERROR: LoadError: UndefVarError: G not defined" caused by "gn,G" rather than "gn.G" in line 386. Swapping for full stop should fix this
2019-04-22 17:08:36 +01:00
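A hypothetical two-line illustration of why the comma form errors:

```julia
gn = (G = 4,)   # stands in for any object with a property G
gn.G            # property access: returns 4
# gn,G          # parses as the tuple (gn, G); G is undefined, hence the UndefVarError
```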
Zachary P Christensen
83eb5a1df6
Fix typo in Maxout
2019-04-19 17:02:26 -04:00
thebhatman
31a50ab16a
Fixed ExpDecay
2019-04-11 17:28:06 +05:30
Mike J Innes
54d9229be9
Merge pull request #710 from johnnychen94/master
...
naive implementation of activations
2019-04-05 15:33:31 +01:00
Johnny Chen
a300376f71
fix a typo in comment
...
`inplementation` --> `implementation`
2019-04-05 19:19:30 +08:00
JohnnyChen
3cafbbad02
simplify the implementation
2019-04-05 18:44:00 +08:00
JohnnyChen
de7a5f4024
correct the function behavior; support Any type
2019-04-05 18:16:44 +08:00
thebhatman
b84ab7ac95
Removed logcosh
2019-04-05 03:16:54 +05:30
bors[bot]
bd9d73a941
Merge #655
...
655: Added support for Float64 for DepthwiseConv r=dhairyagandhi96 a=thebhatman
DepthwiseConv was giving errors for Float64. This fixes the issue.
Co-authored-by: Manjunath Bhat <manjunathbhat9920@gmail.com>
2019-04-04 17:25:52 +00:00
chengchingwen
261235311c
change dims as unbroadcasted dims and keyword argument
2019-04-05 01:19:20 +08:00
Dhairya Gandhi
1963f30911
Merge pull request #726 from dhairyagandhi96/iris
...
use cached iris dataset
2019-04-04 22:46:21 +05:30
Dhairya Gandhi
9c8175b1c0
fixes
2019-04-04 22:32:01 +05:30
Dhairya Gandhi
4f754d33cb
switch to http link
2019-04-04 22:18:38 +05:30
Dhairya Gandhi
38cc216a4b
switch to azure
2019-04-04 22:03:01 +05:30
Dhairya Gandhi
77274b4af7
change iris link
2019-04-04 21:07:46 +05:30
Dhairya Gandhi
2952bcdab1
fixes
2019-04-04 19:28:40 +05:30
Dhairya Gandhi
5b9c53439b
recreate OHV
2019-04-04 19:19:47 +05:30
Dhairya Gandhi
4f1336905f
fix colon indexing
2019-04-04 19:16:14 +05:30
Shreyas
4cb7b9278b
Minor changes to docstring according to guidelines
2019-03-30 00:28:23 +05:30
JohnnyChen
82595648e2
change 4-spaces tab to 2-spaces tab
2019-03-28 22:40:24 +08:00
Shreyas
b6fcd1d837
Added export to Maxout
2019-03-28 19:15:16 +05:30
JohnnyChen
13c58494ec
add x into results
2019-03-28 19:28:59 +08:00
JohnnyChen
5c2a071713
add support for 0-element Chain
2019-03-28 17:20:41 +08:00
JohnnyChen
ccfe0f8720
naive implementation of activations
2019-03-28 17:07:04 +08:00
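Read together, these commits suggest roughly the following shape for `activations` (a minimal sketch, not the exact merged code):

```julia
using Flux

# Collect each layer's output in turn; a 0-element Chain yields no activations.
function activations(c::Chain, x)
  res = Any[]
  for layer in c.layers
    x = layer(x)
    push!(res, x)   # "add x into results"
  end
  return res
end

acts = activations(Chain(Dense(3, 4, relu), Dense(4, 2)), rand(Float32, 3))
```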
Shreyas
61c1fbd013
Made Requested Changes
2019-03-28 01:33:04 +05:30
Shreyas
671aed963e
Made a few fixes. Added tests
2019-03-28 00:51:50 +05:30
thebhatman
4efcc69ba5
logcosh averaged
2019-03-26 23:23:02 +05:30
Shreyas
595f1cf6eb
Made Requested Changes
2019-03-26 21:42:49 +05:30
Manjunath Bhat
930adb122d
Avoided promotion to Float64 in hinge.
2019-03-25 23:43:06 +05:30
thebhatman
6f078857be
Added reference links to loss functions
2019-03-26 03:15:28 +05:30
thebhatman
c4d12e57fe
Loss function names in lowercase
2019-03-26 03:09:48 +05:30
Lyndon White
f0cc4a328d
make Maxout trainable
2019-03-25 16:02:46 +00:00
Tim Besard
0734eeb50e
Check CuArrays major version.
2019-03-22 14:15:26 +01:00
Dhairya Gandhi
bc06861320
fix indirect import
2019-03-22 14:15:26 +01:00
Tim Besard
959dd247bf
Import CUDAdrv stuff through CuArrays.
2019-03-22 14:15:26 +01:00
Tim Besard
df509ce9f0
Adapt to the new CUDAdrv.CuPtr pointer type.
2019-03-22 14:15:26 +01:00
Mike J Innes
b637311642
Merge pull request #647 from oxinabox/ox/maxout
...
Add MaxOut layer
2019-03-22 12:18:53 +00:00
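For reference, a usage sketch of the merged layer (constructor form assumed from the PR's docstring):

```julia
using Flux

# Maxout applies several inner layers and takes the element-wise maximum.
m = Maxout(() -> Dense(5, 7), 3)   # three independent Dense(5, 7) alternatives
x = rand(Float32, 5, 8)
y = m(x)                           # size (7, 8): the max over the three outputs
```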
Lyndon White
401d3da884
no arg closures
2019-03-21 17:04:52 +00:00
Lyndon White
7d247ea25b
update docstring
2019-03-18 12:20:46 +00:00
Nick Robinson
f222555deb
Update src/Flux.jl
...
Co-Authored-By: oxinabox <oxinabox@ucc.asn.au>
2019-03-18 12:20:46 +00:00
Lyndon White
ca68bf9bec
correct casing
2019-03-18 12:20:46 +00:00
Lyndon White
838047f708
fix docs
2019-03-18 12:19:44 +00:00
Kristoffer Carlsson
b84a60e74e
Update src/layers/basic.jl
...
Co-Authored-By: oxinabox <oxinabox@ucc.asn.au>
2019-03-18 12:19:44 +00:00
Lyndon White
fcc3ec471a
Add MaxOut layer
2019-03-18 12:19:44 +00:00
Lyndon White
79de829fdc
move Dense's overloads to be near its defn
2019-03-18 12:18:14 +00:00
chengchingwen
934f0840b2
change API to dims
2019-03-14 21:51:28 +08:00
Manjunath Bhat
633f0df01f
Added new loss functions.
2019-03-12 02:31:42 +05:30
Joshua Whittemore
f061df3d23
resolves pull request #652 merge conflicts
2019-03-09 12:51:20 -08:00
Mike J Innes
b348e31f07
Merge pull request #667 from FluxML/donottrack
...
rm Tracker
2019-03-08 11:38:37 +00:00
Josh Whittemore
930ebaf217
Add module to make iris dataset available.
2019-03-07 16:56:23 -08:00
Manjunath Bhat
c6e51f5cc2
Made lambda and alpha of eltype(x)
2019-03-07 23:42:38 +05:30
Manjunath Bhat
47c1324476
Merge branch 'master' into patch-3
2019-03-07 23:08:40 +05:30
Manjunath Bhat
1d310d4532
Removed {typeof(p)}
2019-03-07 21:55:26 +05:30
thebhatman
f4543b7adf
Value of alpha updated and dot operations changed
2019-03-08 03:21:26 +05:30
David Pollack
7b9b64f1cb
change IN to in
2019-03-07 09:46:44 +01:00
David Pollack
83b4b3a714
changes based on PR comments
2019-03-07 09:46:44 +01:00
David Pollack
c41f891005
changes based on the improved batchnorm in PR#633
2019-03-07 09:46:44 +01:00
David Pollack
129a708b6f
instance normalization
2019-03-07 09:46:44 +01:00
Mike J Innes
b5a148fa37
rm Tracker
2019-03-07 01:33:02 +00:00
Mike J Innes
3a4c6274fa
Merge pull request #651 from FluxML/mji/dogfood
...
Refactor training loop
2019-03-06 16:53:24 +00:00
Mike J Innes
fc6232b779
Merge pull request #633 from Sklan/patch-3
...
Improving BatchNorm
2019-03-06 16:23:03 +00:00
thebhatman
8e5965ac41
Indentation fixed
2019-03-05 16:28:05 +05:30
thebhatman
d6608682fc
Suggested changes made
2019-03-05 16:18:50 +05:30
Manjunath Bhat
29b853e0bb
Made sure Gradients are not lost.
2019-03-04 22:17:19 +05:30
Manjunath Bhat
b5533ee00b
Exported AlphaDropout
2019-03-04 01:09:05 +05:30
Manjunath Bhat
97f874abcf
Added AlphaDropout which is used in SNNs.
2019-03-04 01:05:46 +05:30
Manjunath Bhat
704be49483
Added support for Float64 for DepthwiseConv
...
DepthwiseConv was giving errors for Float64. This fixes the issue.
2019-03-01 15:04:05 +05:30
Mike Innes
4cf43c0c41
simpler/nicer training loop
2019-02-28 14:58:42 +00:00
Mike Innes
cd091ad005
in place implicit gradients
2019-02-28 14:08:01 +00:00
Mike Innes
8b4bc7cc52
organise params
2019-02-28 13:44:54 +00:00
Dhairya Gandhi
6825639f79
mapreduce for onehotmatrix
2019-02-28 09:17:18 +05:30
Rohith Pentaparthy
1b1dff1266
Added an example of Conv to Flux.jl/src/layers/conv.jl, and clarified what WHCN means
2019-02-23 14:31:27 -06:00
Sklan
7463f09591
Update normalise.jl
2019-02-21 23:56:19 +05:30
Sklan
6044421c5c
Update normalise.jl
2019-02-20 13:47:31 +05:30
pshashk
b0a5844afb
Remove dims=1 from normalise ( #619 )
...
* remove `dims=1`
* add dims arg
* fix test
* remove dims=1 only from deprecated version
2019-02-11 16:11:47 +00:00
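The function this series converges on is roughly the following (a sketch, assuming the `dims` keyword from #619 and the uncorrected standard deviation adopted in #587 below):

```julia
using Statistics

# Normalise x to zero mean and unit (uncorrected) standard deviation along dims.
function normalise(x::AbstractArray; dims = 1)
  μ = mean(x; dims = dims)
  σ = std(x; dims = dims, mean = μ, corrected = false)
  return (x .- μ) ./ σ
end
```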
Dhairya Gandhi
2ec35861b5
removing non-allocating functions and tests
2019-02-11 21:22:32 +05:30
Dhairya Gandhi
35cd9761a8
adding tests
2019-02-09 22:32:02 +05:30
pshashk
b074b2491a
fix docstring
2019-02-08 21:49:53 +03:00
pshashk
c3e04392d8
drop dims type restriction
2019-02-08 16:15:37 +03:00
pshashk
911c901294
dims kwarg
2019-02-08 16:00:32 +03:00
pshashk
368c29e5e3
Add corrected argument to std
...
Fixes ffe037c485/src/layers/stateless.jl (L49)
2019-02-08 15:23:27 +03:00
Mike J Innes
ffe037c485
Merge pull request #603 from FluxML/kf/namedtupletree
...
Treat NamedTuple like Tuple for treelike purposes
2019-02-08 11:06:12 +00:00
Mike J Innes
601e2d8ae0
Merge pull request #586 from KristofferC/kc/batchnorm
...
work around extreme slowdown in BatchNorm due to julia performance bug in broadcast fusion
2019-02-08 11:00:33 +00:00
Mike J Innes
fe712bf338
Merge pull request #596 from IvanYashchuk/ivan/topic-issue-542
...
Fixed issue #542 .
2019-02-08 10:38:23 +00:00
Ivan Yashchuk
e00ac88016
Added tracking of logdet and logabsdet. Added gradtests.
2019-02-08 09:55:33 +02:00
Keno Fischer
1e452a3042
Treat NamedTuple like Tuple for treelike purposes
2019-02-06 11:11:00 -05:00
KristofferC
9914c531f6
work around extreme slowdown due to julia performance bug
2019-02-06 16:19:29 +01:00
Mike J Innes
ecc55ec9e1
Revert "Fix OneHotVector/Matrix performance on GPU"
2019-02-06 14:31:15 +00:00
Mike J Innes
e8b2ec6f67
Merge pull request #311 from tejank10/conv_transpose
...
2D Conv transpose support
2019-02-06 14:14:14 +00:00
Moksh Jain
046f7b4eae
fix std arguments in normalise
2019-02-05 18:36:04 +05:30
Ivan Yashchuk
f790fff59a
Use other definition for grad(det(A)).
2019-02-05 14:36:28 +02:00
Moksh Jain
c6409d7686
add support for n-dimensional input to normalise layer
2019-02-05 17:09:22 +05:30
Ivan Yashchuk
aa64d2157d
Fixed issue #542 .
...
Added tracking of LinearAlgebra.det and its grad method.
2019-02-05 11:38:27 +02:00
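The gradient tracked here follows Jacobi's formula; for reference:

```math
\frac{\partial}{\partial A}\det(A) = \det(A)\,A^{-\top},
\qquad
\frac{\partial}{\partial A}\log\lvert\det(A)\rvert = A^{-\top}
```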
Mike J Innes
940b1e6dbf
Merge pull request #587 from KristofferC/patch-2
...
use uncorrected standard deviation in normalise
2019-02-04 14:35:25 +00:00
Mike J Innes
7fc920240d
Merge pull request #591 from dhairyagandhi96/onehot
...
Fix OneHotVector/Matrix performance on GPU
2019-02-04 13:53:55 +00:00
Mike J Innes
17f33b4a6a
Merge pull request #583 from KristofferC/kc/small_fixes
...
clarify docs on single batch image to conv
2019-02-04 12:33:34 +00:00
Mike J Innes
e774053126
Merge pull request #590 from oxinabox/patch-2
...
Default to zero'ed initial state for all RNN
2019-02-04 12:28:38 +00:00
Mike J Innes
329c8f8f95
Merge pull request #585 from KristofferC/kc/verify_download
...
add hash verification to datasets
2019-02-04 11:20:53 +00:00