Commit Graph

43 Commits

Author SHA1 Message Date
Dhairya Gandhi 5086c0f4f0 merge conflicts 2020-04-29 16:11:39 +05:30
DrChainsaw 1544f84bb9 Fix merge conflicts 2020-04-24 21:56:26 +02:00
Garben Tanghe 746e3310f1 removed Flatten struct; updated documentation 2020-03-08 14:22:03 +01:00
Garben Tanghe 3e14bd878c added GlobalMaxPool, GlobalMeanPool, and Flatten layers 2020-03-08 14:18:48 +01:00
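The global pooling layers added in this commit reduce each feature map to a single value. A minimal sketch of the behaviour, assuming the layer names and WHCN data layout of current Flux:

```julia
using Flux

x = rand(Float32, 10, 10, 3, 2)   # WHCN: 10×10 feature maps, 3 channels, batch of 2

size(GlobalMaxPool()(x))          # (1, 1, 3, 2) — each map pooled to one value
size(GlobalMeanPool()(x))         # (1, 1, 3, 2)
```

Note that the `Flatten` struct added alongside them is the one removed again in 746e3310f1 above.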
Dhairya Gandhi cd931793ef more docs and constructors 2020-02-26 22:29:14 +05:30
bors[bot] 55616afc11 Merge #960
960: Added utility function outdims to compute output dimensions of a layer r=dhairyagandhi96 a=darsnack

Based on Slack chatter, I added a utility function, `outdims`, that computes the output dimensions for given input dimensions.

Example
```julia
layer = Conv((3, 3), 3 => 16)
outdims(layer, (10, 10)) # returns (8, 8)
```

Co-authored-by: Kyle Daruwalla <daruwalla@wisc.edu>
2020-02-25 17:40:05 +00:00
Elliot Saba 0fdcc00923 Give `NNPACK` a bit of numerical leeway 2019-12-23 01:31:26 -08:00
Kyle Daruwalla 0cdd11c0dc Added tests for varying padding, stride, and dilation with outdims. 2019-12-07 14:05:50 -06:00
Kyle Daruwalla 6265b1fa39 Added tests for outdims 2019-12-05 22:54:25 -06:00
DrChainsaw 755536bf5e Merge remote-tracking branch 'upstream/master' into samepad 2019-12-04 23:45:03 +01:00
Dhairya Gandhi ec872bb579 test that bias has no grads with Zeros 2019-11-27 19:45:04 +05:30
Dhairya Gandhi c031ae1a94 correct channel value 2019-11-24 13:31:31 +05:30
Dhairya Gandhi 5f21238d1a no grad dims helper 2019-11-24 13:25:02 +05:30
DrChainsaw 411ce5dbd8 Add SamePad for pooling layers 2019-10-20 13:43:39 +02:00
DrChainsaw fc123d6279 Add SamePad for conv layers 2019-10-20 13:43:23 +02:00
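The two SamePad commits above add a `pad=SamePad()` option that computes whatever padding keeps the spatial dimensions unchanged at stride 1. A hedged sketch against the current Flux API:

```julia
using Flux

# SamePad() picks padding so output spatial size equals input size (stride 1)
c = Conv((3, 3), 1 => 8, pad=SamePad())
x = rand(Float32, 28, 28, 1, 1)
size(c(x))   # (28, 28, 8, 1)
```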
Dhairya Gandhi 49ea43e711 ZeroType => Zeros 2019-10-08 20:02:04 +05:30
Dhairya Gandhi b596faaffa tests bias switch 2019-10-08 17:18:39 +05:30
Dhairya Gandhi 55ef7c1aba add weight and bias kwargs 2019-10-06 04:25:23 +05:30
Dhairya Gandhi dced8c04e5 use ZeroType 2019-10-01 21:25:07 +05:30
Dhairya Gandhi 5ea6a33f44 make bias optional 2019-09-27 11:48:12 +05:30
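The bias commits above (`make bias optional`, `add weight and bias kwargs`, `ZeroType => Zeros`) let a layer opt out of its bias via a zero sentinel that carries no gradients. A minimal sketch; note that current Flux spells the sentinel `bias=false` rather than the `Zeros` type these commits introduce:

```julia
using Flux

# no bias parameters are created, so there is nothing to train and no bias gradient
c = Conv((3, 3), 1 => 2; bias=false)
x = rand(Float32, 8, 8, 1, 1)
size(c(x))   # (6, 6, 2, 1)
```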
Mike Innes 9590aa63e3 rm last uses of param/data 2019-08-19 15:14:42 +01:00
thebhatman c7c0ee2cbc Resolving Merge Conflicts 2019-06-12 21:34:42 +05:30
thebhatman f465665c73 Corrected test for asymmetric padding 2019-06-11 20:20:00 +05:30
ayush-1506 98a027a505 typo 2019-05-14 02:56:12 -07:00
ayush-1506 bfc5bb0079 rebase 2019-05-14 02:53:48 -07:00
ayush-1506 0a2e288c3f another small test 2019-05-14 02:53:06 -07:00
ayush-1506 2161163a82 added crosscor 2019-05-14 02:52:28 -07:00
ayush-1506 7c28f7f883 Merge branch 'crosscor' of https://github.com/ayush-1506/Flux.jl into crosscor 2019-05-14 02:47:28 -07:00
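The crosscor commits above add a `CrossCor` layer, which applies cross-correlation (the kernel is not flipped, unlike true convolution). A sketch assuming the current Flux constructor, which mirrors `Conv`:

```julia
using Flux

c = CrossCor((3, 3), 1 => 2)      # same constructor shape as Conv
x = rand(Float32, 10, 10, 1, 1)
size(c(x))   # (8, 8, 2, 1) — zero padding shrinks each spatial dim by 2
```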
Elliot Saba 2e6561bb6a Change `DepthwiseConv()` to use `in=>out` instead of `in=>mult`.
This is an API change, but I think it makes more sense, and is more consistent with our `Conv()` API.
2019-05-12 11:20:24 -07:00
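With the `in=>out` form from this commit, the channel multiplier is implied: `out` must be divisible by `in`, and each input channel gets `out ÷ in` filters. A hedged example of the resulting API:

```julia
using Flux

# 3 input channels, 6 output channels ⇒ multiplier of 2 per input channel
d = DepthwiseConv((3, 3), 3 => 6)
x = rand(Float32, 10, 10, 3, 1)
size(d(x))   # (8, 8, 6, 1)
```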
ayush-1506 99d07e67db another small test 2019-05-09 16:43:28 +05:30
ayush-1506 9a3aa18c17 conflicts 2019-05-08 11:56:46 +05:30
Jan Weidner e96a9d7eaf Switch broken #700 test to pass 2019-05-03 22:36:32 +02:00
Jan Weidner 73c5d9f25c fix 2019-05-03 22:22:52 +02:00
Jan Weidner 27a9a7b9cf add broken test for #700 2019-05-03 22:22:52 +02:00
ayush-1506 20b79e0bdf added crosscor 2019-05-01 22:29:00 +05:30
Elliot Saba 6e22cd4931 Add asymmetric padding to convolutional layers 2019-04-25 09:55:23 -07:00
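Asymmetric padding from this commit allows a different amount of padding on each edge, given as a tuple with two values per spatial dimension. A sketch assuming the current keyword form (the exact edge ordering is a detail of the NNlib backend):

```julia
using Flux

# 2-D conv: pad gives per-edge padding, two values per spatial dimension
c = Conv((3, 3), 1 => 1, pad=(0, 2, 1, 1))
x = rand(Float32, 10, 10, 1, 1)
size(c(x))   # (10, 10, 1, 1): each dim gains 2 total padding, loses 2 to the 3×3 kernel
```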
Elliot Saba 113ddc8760 Update `Flux` code for new NNlib branch 2019-04-25 09:55:23 -07:00
Manjunath Bhat d4a1d33a31 Added Float64 tests for DepthwiseConv 2019-03-09 20:17:22 +05:30
Avik Pal 2f3ad56166 Add test for Depthwise Conv 2019-01-24 18:53:04 +05:30
Mike J Innes 903db70673 float32 param initialisers 2018-11-12 20:10:47 +00:00
Mike J Innes 1e90226077 actually run tests 2018-09-04 14:35:20 +01:00
Mike J Innes 1e0fd07b09 use `expand` 2018-09-04 14:30:02 +01:00
Yueh-Hua Tu 634d34686e Add new constructors and test 2018-08-24 10:31:13 +08:00