Commit Graph

121 Commits

Author SHA1 Message Date
bors[bot]
55616afc11
Merge #960
960: Added utility function outdims to compute output dimensions of a layer r=dhairyagandhi96 a=darsnack

Based on Slack chatter, I added a utility function, `outdims`, that computes the output dimensions for given input dimensions.

Example
```julia
using Flux

layer = Conv((3, 3), 3 => 16)
outdims(layer, (10, 10)) # returns (8, 8)
```

Co-authored-by: Kyle Daruwalla <daruwalla@wisc.edu>
2020-02-25 17:40:05 +00:00
bors[bot]
d1edd9b16d
Merge #680
680: Added new loss functions. r=thebhatman a=thebhatman

I have added the KL divergence loss, Poisson loss, log-cosh loss, and hinge loss functions.
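
For illustration, a minimal sketch of how these might be called (the exact function names are assumptions based on this description, and only three of the four are shown):

```julia
using Flux

# Names below are assumed from the PR description.
ŷ = softmax(randn(Float32, 3, 5))   # predicted distribution per column
y = softmax(randn(Float32, 3, 5))   # target distribution per column

Flux.kldivergence(ŷ, y)   # KL divergence between the two distributions
Flux.poisson(ŷ, y)        # Poisson loss, for rate/count-style targets

scores = randn(Float32, 5)
labels = rand([-1f0, 1f0], 5)       # hinge loss expects ±1 targets
Flux.hinge(scores, labels)
```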

Co-authored-by: Manjunath Bhat <manjunathbhat9920@gmail.com>
Co-authored-by: thebhatman <manjunathbhat9920@gmail.com>
2020-01-13 15:46:25 +00:00
Manjunath Bhat
747e01ea02
Test to check for spurious promotions 2020-01-13 18:33:30 +05:30
Elliot Saba
0fdcc00923 Give NNPACK a bit of numerical leeway 2019-12-23 01:31:26 -08:00
Kyle Daruwalla
0cdd11c0dc Added tests for varying padding, stride, and dilation with outdims. 2019-12-07 14:05:50 -06:00
Kyle Daruwalla
6265b1fa39 Added tests for outdims 2019-12-05 22:54:25 -06:00
Dhairya Gandhi
c031ae1a94 correct channel value 2019-11-24 13:31:31 +05:30
Dhairya Gandhi
5f21238d1a no grad dims helper 2019-11-24 13:25:02 +05:30
dsweber2
dea29532ef Merge branch 'master' into activations 2019-11-15 17:19:43 -08:00
Mike Innes
e24215ca98 guard test on 1.0 2019-11-15 15:59:42 +00:00
dsweber2
58c794702d simpler test 2019-11-14 14:05:53 -08:00
dsweber2
db92b0e3ce super simple test 2019-11-14 13:40:52 -08:00
Manjunath Bhat
2b30319a55
Merge branch 'master' into patch-6 2019-09-30 21:05:02 +05:30
thebhatman
6e289ef939 Merge branch 'patch-6' of https://github.com/thebhatman/Flux.jl into patch-6 2019-09-30 20:55:44 +05:30
bors[bot]
acb6a89245
Merge #865
865: Functor r=MikeInnes a=MikeInnes

This refactors our current `@treelike` infrastructure. It somewhat formalises what we're doing around the idea of a Flux model as a functor, i.e. something that can be mapped over.

This is much more flexible than what we had before, and avoids some issues. It allows layers to have state that isn't mappable; it allows for dispatch when walking the tree, which means layers like `BatchNorm` can have non-trainable parameters; and it also allows for zipped mapping like `fmap(+, xs, ys)`, which isn't implemented yet but will be useful for the new optimisers work.

The main downside is that the term `functor` has previously been used in the Julia community as a malapropism for "thing that behaves like a function", but hopefully this can start to reduce that usage.
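
A rough sketch of the idea, using a hypothetical layer (`Affine` and the conversion function below are illustrative, not part of this PR):

```julia
using Flux
using Flux: @functor, fmap

struct Affine   # hypothetical layer
    W
    b
end
@functor Affine   # declare Affine's fields as mappable children

m = Affine(randn(Float32, 3, 3), zeros(Float32, 3))

# fmap walks the model tree and applies the function to each leaf,
# e.g. converting every array to Float64:
m64 = fmap(x -> x isa AbstractArray ? Float64.(x) : x, m)
```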

Co-authored-by: Mike Innes <mike.j.innes@gmail.com>
2019-09-24 16:36:10 +00:00
Mike Innes
b60df53ba1 pkg up 2019-09-19 18:33:33 +01:00
Mike Innes
b951377426 fix normalisation layer params 2019-09-19 15:33:24 +01:00
Mike Innes
f8d5d3b5fc broken normalisation layer params 2019-09-19 14:12:11 +01:00
Mike Innes
250aef5a5a normalise test fixes 2019-09-10 16:19:55 +01:00
Mike Innes
2f7ad895aa test cleanups 2019-08-19 15:22:50 +01:00
Mike Innes
9590aa63e3 rm last uses of param/data 2019-08-19 15:14:42 +01:00
thebhatman
8d6028e27a tests with gradients 2019-07-12 20:47:43 +05:30
Mike Innes
e2bf46b7fd gpu test fixes 2019-07-12 14:52:01 +01:00
thebhatman
e6d5846e49 Temporary removal of Float16 test 2019-06-14 23:24:31 +05:30
thebhatman
1ff4e3188e back on mse failing for Float16 2019-06-13 16:41:25 +05:30
thebhatman
c7c0ee2cbc Resolving Merge Conflicts 2019-06-12 21:34:42 +05:30
thebhatman
a56cfb73c3 BatchNorm test corrected 2019-06-11 20:34:48 +05:30
thebhatman
f465665c73 Corrected test for asymmetric padding 2019-06-11 20:20:00 +05:30
thebhatman
94a2d1987d Updated tests of normalisation layers. 2019-06-11 20:05:07 +05:30
Mike J Innes
b98075817c
Merge branch 'master' into DenseBlock 2019-06-05 14:27:47 +01:00
ayush-1506
98a027a505 typo 2019-05-14 02:56:12 -07:00
ayush-1506
bfc5bb0079 rebase 2019-05-14 02:53:48 -07:00
ayush-1506
0a2e288c3f another small test 2019-05-14 02:53:06 -07:00
ayush-1506
2161163a82 added crosscor 2019-05-14 02:52:28 -07:00
ayush-1506
7c28f7f883 Merge branch 'crosscor' of https://github.com/ayush-1506/Flux.jl into crosscor 2019-05-14 02:47:28 -07:00
Bruno Hebling Vieira
c5fc2fb9a3 Added tests 2019-05-13 16:32:00 -03:00
bors[bot]
68ba6e4e2f Merge #563
563: noise shape for dropout r=MikeInnes a=chengchingwen

I added a noise shape for dropout, similar to the `noise_shape` argument in [`tf.nn.dropout`](https://www.tensorflow.org/api_docs/python/tf/nn/dropout).
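
A hedged sketch of the intended usage (the exact constructor signature is an assumption; it may be positional rather than a keyword in this version):

```julia
using Flux

# Share one mask value per row (dim 1) across columns, analogous
# to tf.nn.dropout's noise_shape.
d = Dropout(0.5, dims = 1)   # signature assumed

x = ones(Float32, 4, 6)
d(x)   # during training, entire rows are zeroed together
```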

Co-authored-by: chengchingwen <adgjl5645@hotmail.com>
Co-authored-by: Peter <adgjl5645@hotmail.com>
2019-05-13 17:16:10 +00:00
chengchingwen
2fc2a5282c Merge remote-tracking branch 'upstream/master' into drop_shape 2019-05-14 00:50:59 +08:00
Elliot Saba
48fcc66094 Remove vestigial testing println() 2019-05-12 11:20:24 -07:00
Elliot Saba
2e6561bb6a Change DepthwiseConv() to use in=>out instead of in=>mult.
This is an API change, but I think it makes more sense, and is more
consistent with our `Conv()` API.
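
Under the new convention, the second argument pairs input channels with the total number of output channels (which must be a multiple of the input channels), mirroring `Conv`. A brief sketch:

```julia
using Flux

# Old style (removed): DepthwiseConv((3, 3), 3 => 2)  # 3 inputs, multiplier 2
# New style: total output channels given directly; here 6 = 3 × 2.
layer = DepthwiseConv((3, 3), 3 => 6)

x = rand(Float32, 10, 10, 3, 1)   # W × H × C × N
size(layer(x))                    # (8, 8, 6, 1)
```
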
2019-05-12 11:20:24 -07:00
chengchingwen
5c5140683c make dims as field of Dropout 2019-05-10 23:45:50 +08:00
ayush-1506
99d07e67db another small test 2019-05-09 16:43:28 +05:30
ayush-1506
9a3aa18c17 conflicts 2019-05-08 11:56:46 +05:30
Jan Weidner
e96a9d7eaf Switch broken #700 test to pass 2019-05-03 22:36:32 +02:00
Jan Weidner
73c5d9f25c fix 2019-05-03 22:22:52 +02:00
Jan Weidner
27a9a7b9cf add broken test for #700 2019-05-03 22:22:52 +02:00
Mike J Innes
5b79453773 passing tests... ish 2019-05-02 18:54:01 -07:00
Mike J Innes
0c265f305a fix most tests 2019-05-02 18:52:09 -07:00
ayush-1506
20b79e0bdf added crosscor 2019-05-01 22:29:00 +05:30
Elliot Saba
6e22cd4931 Add asymmetric padding to convolutional layers 2019-04-25 09:55:23 -07:00
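
For context, a minimal sketch of what asymmetric padding enables (the per-side ordering of the `pad` tuple is an assumption; a 2N-tuple gives independent padding for each side of each spatial dimension):

```julia
using Flux

# Pad one extra pixel on only one side of each spatial dimension.
layer = Conv((3, 3), 1 => 8, pad = (1, 0, 1, 0))   # per-side ordering assumed

x = rand(Float32, 10, 10, 1, 1)
size(layer(x))   # (9, 9, 8, 1): 10 + 1 + 0 - 3 + 1 = 9 per spatial dim
```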