Kyle Daruwalla
c001d0f3c5
Added trainmode! and updated docs with warning
2020-03-01 12:30:41 -06:00
Kyle Daruwalla
568ecb1c97
Removed trainmode from tests
2020-02-29 16:25:18 -06:00
Kyle Daruwalla
5cbd2cecf2
Changed testmode! to return model
2020-02-29 16:09:59 -06:00
Kyle Daruwalla
7c12af065a
Added testmode! functionality back to normalization layers.
2020-02-21 14:35:10 -06:00
Mike Innes
e24215ca98
guard test on 1.0
2019-11-15 15:59:42 +00:00
bors[bot]
acb6a89245
Merge #865
...
865: Functor r=MikeInnes a=MikeInnes
This refactors our current `@treelike` infrastructure. It somewhat formalises what we're doing around the idea of a Flux model as a functor, i.e. something that can be mapped over.
This is much more flexible than what we had before, and avoids some issues. It allows layers to have state that isn't mappable; it allows for dispatch when walking the tree, which means layers like `BatchNorm` can have non-trainable parameters; and it also allows for zipped mapping like `fmap(+, xs, ys)`, which isn't implemented yet but will be useful for the new optimisers work.
The main downside is that the term `functor` has been previously used in the Julia community as a malapropism for "thing that behaves like a function"; but hopefully this can start to reduce that usage.
Co-authored-by: Mike Innes <mike.j.innes@gmail.com>
2019-09-24 16:36:10 +00:00
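The functor idea described in the merge message above — a model as a structure that can be mapped over, where each layer declares which fields participate, so state like `BatchNorm`'s running statistics is skipped — can be sketched outside Julia. This is a hedged Python illustration of the concept only; the class names and this `fmap` are hypothetical and are not Flux's actual implementation.

```python
# Sketch of "model as functor": fmap applies a function to every leaf a
# layer declares mappable, rebuilding the layer; other fields are left
# alone. Names here are illustrative, not Flux's API.

class Dense:
    def __init__(self, W, b):
        self.W, self.b = W, b

    def functor_fields(self):
        return ("W", "b")  # both parameters are mapped

class BatchNorm:
    def __init__(self, gamma, beta, running_mean):
        self.gamma, self.beta = gamma, beta
        self.running_mean = running_mean  # state, deliberately not mapped

    def functor_fields(self):
        return ("gamma", "beta")  # non-trainable stats are excluded

def fmap(f, layer):
    """Apply f over a layer's declared leaves, returning the layer."""
    for name in layer.functor_fields():
        setattr(layer, name, f(getattr(layer, name)))
    return layer

bn = BatchNorm(gamma=1.0, beta=0.0, running_mean=0.5)
fmap(lambda x: x * 2, bn)
print(bn.gamma, bn.beta, bn.running_mean)  # running_mean is untouched
```

Dispatching on the layer type when walking the tree is what lets `BatchNorm` keep non-trainable parameters out of the traversal.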
Mike Innes
b60df53ba1
pkg up
2019-09-19 18:33:33 +01:00
Mike Innes
b951377426
fix normalisation layer params
2019-09-19 15:33:24 +01:00
Mike Innes
f8d5d3b5fc
broken normalisation layer params
2019-09-19 14:12:11 +01:00
Mike Innes
250aef5a5a
normalise test fixes
2019-09-10 16:19:55 +01:00
Mike Innes
9590aa63e3
rm last uses of param/data
2019-08-19 15:14:42 +01:00
thebhatman
8d6028e27a
tests with gradients
2019-07-12 20:47:43 +05:30
Mike Innes
e2bf46b7fd
gpu test fixes
2019-07-12 14:52:01 +01:00
thebhatman
c7c0ee2cbc
Resolving Merge Conflicts
2019-06-12 21:34:42 +05:30
thebhatman
a56cfb73c3
BatchNorm test corrected
2019-06-11 20:34:48 +05:30
thebhatman
94a2d1987d
Updated tests of normalisation layers.
2019-06-11 20:05:07 +05:30
bors[bot]
68ba6e4e2f
Merge #563
...
563: noise shape for dropout r=MikeInnes a=chengchingwen
I added the noise shape for dropout, similar to the `noise_shape` argument in [`tf.nn.dropout`](https://www.tensorflow.org/api_docs/python/tf/nn/dropout).
Co-authored-by: chengchingwen <adgjl5645@hotmail.com>
Co-authored-by: Peter <adgjl5645@hotmail.com>
2019-05-13 17:16:10 +00:00
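The noise-shape idea from the PR above — sharing one dropout draw along chosen dimensions so that whole slices are zeroed together, as `tf.nn.dropout`'s `noise_shape` does — can be sketched in plain Python. The function below is an illustration of the concept under assumed semantics, not Flux's `Dropout` code.

```python
import random

def dropout_mask(rows, cols, p, share_cols=False, rng=None):
    """Build an (inverted) dropout mask for a rows x cols matrix.

    With share_cols=True, one Bernoulli draw is shared across each row's
    columns -- the mask is broadcast along the column dimension, like a
    noise_shape of (rows, 1). Kept entries are scaled by 1/(1-p).
    """
    rng = rng or random.Random(0)
    scale = 1.0 / (1.0 - p)
    if share_cols:
        keep = [rng.random() > p for _ in range(rows)]  # one draw per row
        return [[scale if keep[r] else 0.0 for _ in range(cols)]
                for r in range(rows)]
    # independent draw per element: ordinary dropout
    return [[scale if rng.random() > p else 0.0 for _ in range(cols)]
            for _ in range(rows)]

mask = dropout_mask(4, 3, p=0.5, share_cols=True)
# each row is either all zeros or all 2.0
print(mask)
```

Sharing the mask along a dimension is what makes dropout useful for, e.g., dropping entire feature maps or timesteps at once.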
chengchingwen
2fc2a5282c
Merge remote-tracking branch 'upstream/master' into drop_shape
2019-05-14 00:50:59 +08:00
Elliot Saba
48fcc66094
Remove vestigial testing println()
2019-05-12 11:20:24 -07:00
chengchingwen
5c5140683c
make dims as field of Dropout
2019-05-10 23:45:50 +08:00
Mike J Innes
5b79453773
passing tests... ish
2019-05-02 18:54:01 -07:00
Mike J Innes
0c265f305a
fix most tests
2019-05-02 18:52:09 -07:00
chengchingwen
261235311c
change dims as unbroadcasted dims and keyword argument
2019-04-05 01:19:20 +08:00
Shreyas
c810fd4818
Corrected Group Size In Batch Norm Test For Group Norm
2019-03-28 01:35:38 +05:30
Shreyas
61c1fbd013
Made Requested Changes
2019-03-28 01:33:04 +05:30
Shreyas
671aed963e
Made a few fixes. Added tests
2019-03-28 00:51:50 +05:30
chengchingwen
59da68b4d9
update test
2019-03-14 21:55:37 +08:00
David Pollack
83b4b3a714
changes based on PR comments
2019-03-07 09:46:44 +01:00
David Pollack
129a708b6f
instance normalization
2019-03-07 09:46:44 +01:00
KristofferC
9914c531f6
work around extreme slowdown due to julia performance bug
2019-02-06 16:19:29 +01:00
chengchingwen
06003b72c7
noise shape for dropout
2019-01-22 23:51:38 +08:00
Avik Pal
cc812a8f89
Fix tests
2018-09-11 17:30:54 +05:30
Avik Pal
dd2fa77681
Fix tests
2018-09-11 17:06:18 +05:30
Avik Pal
7e7a501efd
Fix tests
2018-09-11 16:32:14 +05:30
Avik Pal
8bea60d980
Merge branch 'master' into cudnn_batchnorm
2018-09-11 15:34:25 +05:30
Avik Pal
d3c78a80be
Fix layers errors
2018-08-11 17:20:27 +05:30
Avik Pal
3b448ce1ac
Merge branch 'master' into cudnn_batchnorm
2018-08-11 15:02:55 +05:30
pevnak
3510c837a8
zeros replaced by zero
2018-08-03 15:14:25 +01:00
Avik Pal
da7fe93b31
Fix test
2018-07-17 09:47:45 +05:30
Avik Pal
646db81f94
Pull BatchNorm CPU updates
2018-07-17 09:24:38 +05:30
CarloLucibello
185e9148b6
fix cpu batchnorm
2018-07-16 07:11:33 +02:00
Avik Pal
a4e35e9e91
Adjust atol in tests
2018-06-20 16:22:25 +05:30
Mike J Innes
5fd240f525
interface tweaks
2018-04-15 20:04:42 +01:00
Brad Safnuk
07b0f95d61
Tests for batch norm with 2D and 3D convolutions.
2018-03-15 22:52:09 -04:00
Brad Safnuk
6653ec86d9
Allow multidimensional inputs to batchnorm.
...
Can be used in conjunction with convolutional layers, in addition
to dense layers, with the same API.
2018-03-15 21:48:59 -04:00
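The commit above extends batchnorm to multidimensional (convolutional) inputs; the key point is that the statistics are computed per channel, over every other axis (batch and spatial). A hedged pure-Python sketch of just that reduction, on nested lists for illustration rather than real arrays:

```python
# Per-channel batch-norm statistics for a conv-style input: mean and
# variance are reduced over the batch and spatial axes, one pair per
# channel. Illustrative only; real implementations operate on arrays.

def channel_stats(x):
    """x is shaped [batch][channel][width]; return per-channel (mean, var)."""
    channels = len(x[0])
    stats = []
    for c in range(channels):
        vals = [v for sample in x for v in sample[c]]  # batch * width values
        mean = sum(vals) / len(vals)
        var = sum((v - mean) ** 2 for v in vals) / len(vals)
        stats.append((mean, var))
    return stats

# two samples, one channel, width 2
x = [[[1.0, 2.0]], [[3.0, 4.0]]]
print(channel_stats(x))  # [(2.5, 1.25)]
```

Because the reduction shape depends only on which axis is the channel axis, the same code path serves dense, 2D-conv, and 3D-conv inputs.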
Iblis Lin
7f5ba594a9
batchnorm: more test cases
2017-11-02 13:32:12 +08:00
Iblis Lin
ce46843459
batchnorm: add test cases
2017-11-02 13:32:12 +08:00
Mike J Innes
cf6b930f63
reorganise
2017-10-26 11:46:12 +01:00