Mike J Innes
b98075817c
Merge branch 'master' into DenseBlock
2019-06-05 14:27:47 +01:00
ayush-1506
2161163a82
added crosscor
2019-05-14 02:52:28 -07:00
Bruno Hebling Vieira
796a2957c9
Added news and removed type annotation from SkipConnection structure
2019-05-13 16:33:31 -03:00
Bruno Hebling Vieira
e7d76b8423
Added the SkipConnection layer and constructor
...
Added missing export
Corrected channel placement
Dimension 4 cannot be assumed to always be the Channel dimension
Deprecation of `treelike`
Code now makes use of the `@treelike` macro instead of the deprecated `treelike` function (it worked on my end because I'm on Julia 0.7, while Julia 1.0 deprecated it)
Update basic.jl
Renaming to SkipConnection
* Update Flux.jl
* Update basic.jl
Updated `SkipConnection` with a `connection` field
I'm pretty sure I broke something now, but this PR should follow along these lines: `cat` needs special treatment (the user can declare their own `concatenate` connection, but I foresee it will be used often, so we can simply define special treatment for it)
Forgot to remove some rebasing text
Forgot to remove some more rebasing text
Removed local copy and default cat method from the function calls
Adjusted some more types for inference, could improve on this as well
Re-placed some left-over spaces
2019-05-13 16:32:00 -03:00
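For reference, a minimal sketch of how the merged `SkipConnection` layer is used, assuming the two-argument `SkipConnection(layer, connection)` constructor described in the commit messages above (sizes are illustrative):

```julia
using Flux

# SkipConnection wraps an inner layer and combines its output with the
# original input via the user-supplied `connection` function, here `+`
# as in a residual block.
block = SkipConnection(Dense(10, 10, relu), +)

x = rand(Float32, 10, 4)   # 10 features, batch of 4
y = block(x)               # computes connection(inner(x), x), i.e. Dense output + x
size(y)                    # (10, 4)
```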
bors[bot]
68ba6e4e2f
Merge #563
...
563: noise shape for dropout r=MikeInnes a=chengchingwen
I added noise shape for dropout, similar to the `noise_shape` argument in [`tf.nn.dropout`](https://www.tensorflow.org/api_docs/python/tf/nn/dropout).
Co-authored-by: chengchingwen <adgjl5645@hotmail.com>
Co-authored-by: Peter <adgjl5645@hotmail.com>
2019-05-13 17:16:10 +00:00
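A minimal sketch of the noise-shape idea from #563, assuming `dims` ends up as a keyword argument on `Dropout` as described in the related commits below (the exact constructor form is an assumption):

```julia
using Flux

# Restricting the dropout noise shape: with dims = 1 a single mask is drawn
# per feature (row) and shared across the batch, analogous to noise_shape
# in tf.nn.dropout.
drop = Dropout(0.5, dims = 1)

x = rand(Float32, 5, 8)   # 5 features, batch of 8
y = drop(x)               # whole rows are zeroed together while the layer is active
```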
chengchingwen
2fc2a5282c
Merge remote-tracking branch 'upstream/master' into drop_shape
2019-05-14 00:50:59 +08:00
Elliot Saba
2e6561bb6a
Change DepthwiseConv() to use in=>out instead of in=>mult
...
This is an API change, but I think it makes more sense, and is more consistent with our `Conv()` API.
2019-05-12 11:20:24 -07:00
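A minimal sketch of the changed `DepthwiseConv` convention, with illustrative channel counts:

```julia
using Flux

# Old convention: DepthwiseConv((3, 3), 3 => 2) meant "3 input channels,
# channel multiplier 2". New convention: 3 => 6 means "3 input channels,
# 6 output channels", matching Conv((3, 3), 3 => 6). The output channel
# count must remain a multiple of the input channel count.
layer = DepthwiseConv((3, 3), 3 => 6, relu)

x = rand(Float32, 28, 28, 3, 1)   # WHCN: 28x28 image, 3 channels, batch of 1
size(layer(x), 3)                 # 6 output channels
```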
chengchingwen
5c5140683c
make dims a field of Dropout
2019-05-10 23:45:50 +08:00
Avik Pal
a0be6fa837
Add missing activation function for batchnorm
2019-05-01 19:47:54 +05:30
Dhairya Gandhi
221670a2b1
Merge pull request #733 from thebhatman/expdecay-fix
...
Fixed ExpDecay
2019-05-01 18:58:37 +05:30
Dhairya Gandhi
9bbbd17e4b
Merge branch 'master' into onecold
2019-04-30 19:09:36 +05:30
Roger-luo
d63338c242
fix doctest
2019-04-26 18:12:14 +08:00
Mike J Innes
6c3a939133
Update src/onehot.jl
...
Co-Authored-By: Roger-luo <hiroger@qq.com>
2019-04-26 18:09:14 +08:00
Roger-luo
fabcd05ff2
add examples
2019-04-26 18:05:03 +08:00
Elliot Saba
732f97fe16
Split out conv_transpose_dims() so that Zygote can ignore it
2019-04-25 10:24:19 -07:00
Elliot Saba
6e22cd4931
Add asymmetric padding to convolutional layers
2019-04-25 09:55:23 -07:00
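A minimal sketch of asymmetric padding on a convolutional layer, assuming a 2N-length `pad` tuple gives per-side padding amounts (the per-side ordering here is an assumption, not confirmed by the commit):

```julia
using Flux

# With a 4-tuple pad for a 2-D convolution, each side of each spatial
# dimension can receive a different amount of padding.
c = Conv((3, 3), 1 => 4, relu, pad = (1, 0, 2, 0))

x = rand(Float32, 10, 10, 1, 1)   # WHCN input
size(c(x))                        # spatial dims shrink less on the padded sides
```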
Elliot Saba
113ddc8760
Update Flux code for new NNlib branch
2019-04-25 09:55:23 -07:00
Hossein Pourbozorg
7f06b15f67
use https instead of http for web links
2019-04-25 11:04:03 +00:00
Jake Topping
ff7adda74b
Swap comma for full stop
...
"ERROR: LoadError: UndefVarError: G not defined" caused by "gn,G" rather than "gn.G" in line 386. Swapping for full stop should fix this
2019-04-22 17:08:36 +01:00
Zachary P Christensen
83eb5a1df6
Fix typo in Maxout
2019-04-19 17:02:26 -04:00
thebhatman
31a50ab16a
Fixed ExpDecay
2019-04-11 17:28:06 +05:30
Mike J Innes
54d9229be9
Merge pull request #710 from johnnychen94/master
...
naive implementation of activations
2019-04-05 15:33:31 +01:00
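A minimal sketch of what the merged `activations` helper from #710 provides, assuming it collects the intermediate output of each layer of a `Chain` (the exact return container is an assumption):

```julia
using Flux

m = Chain(Dense(4, 3, relu), Dense(3, 2), softmax)
x = rand(Float32, 4)

# Assumed behaviour: acts[i] is the output after the i-th layer of the chain.
acts = Flux.activations(m, x)
length(acts)        # 3, one entry per layer
acts[end] ≈ m(x)    # the last activation matches the full forward pass
```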
Johnny Chen
a300376f71
fix a typo in comment
...
`inplementation` --> `implementation`
2019-04-05 19:19:30 +08:00
JohnnyChen
3cafbbad02
simplify the implementation
2019-04-05 18:44:00 +08:00
JohnnyChen
de7a5f4024
correct the function behavior; support Any type
2019-04-05 18:16:44 +08:00
bors[bot]
bd9d73a941
Merge #655
...
655: Added support for Float64 for DepthwiseConv r=dhairyagandhi96 a=thebhatman
DepthwiseConv was giving errors for Float64. This fixes the issue.
Co-authored-by: Manjunath Bhat <manjunathbhat9920@gmail.com>
2019-04-04 17:25:52 +00:00
chengchingwen
261235311c
change dims to be unbroadcasted dims and a keyword argument
2019-04-05 01:19:20 +08:00
Dhairya Gandhi
1963f30911
Merge pull request #726 from dhairyagandhi96/iris
...
use cached iris dataset
2019-04-04 22:46:21 +05:30
Dhairya Gandhi
9c8175b1c0
fixes
2019-04-04 22:32:01 +05:30
Dhairya Gandhi
4f754d33cb
switch to http link
2019-04-04 22:18:38 +05:30
Dhairya Gandhi
38cc216a4b
switch to azure
2019-04-04 22:03:01 +05:30
Dhairya Gandhi
77274b4af7
change iris link
2019-04-04 21:07:46 +05:30
Dhairya Gandhi
2952bcdab1
fixes
2019-04-04 19:28:40 +05:30
Dhairya Gandhi
5b9c53439b
recreate OHV
2019-04-04 19:19:47 +05:30
Dhairya Gandhi
4f1336905f
fix colon indexing
2019-04-04 19:16:14 +05:30
Shreyas
4cb7b9278b
Minor changes to docstring according to guidelines
2019-03-30 00:28:23 +05:30
JohnnyChen
82595648e2
change 4-spaces tab to 2-spaces tab
2019-03-28 22:40:24 +08:00
Shreyas
b6fcd1d837
Added export to Maxout
2019-03-28 19:15:16 +05:30
JohnnyChen
13c58494ec
add x into results
2019-03-28 19:28:59 +08:00
JohnnyChen
5c2a071713
add support for 0-element Chain
2019-03-28 17:20:41 +08:00
JohnnyChen
ccfe0f8720
naive implementation of activations
2019-03-28 17:07:04 +08:00
Shreyas
61c1fbd013
Made Requested Changes
2019-03-28 01:33:04 +05:30
Shreyas
671aed963e
Made a few fixes. Added tests
2019-03-28 00:51:50 +05:30
Shreyas
595f1cf6eb
Made Requested Changes
2019-03-26 21:42:49 +05:30
Lyndon White
f0cc4a328d
make Maxout trainable
2019-03-25 16:02:46 +00:00
Tim Besard
0734eeb50e
Check CuArrays major version.
2019-03-22 14:15:26 +01:00
Dhairya Gandhi
bc06861320
fix indirect import
2019-03-22 14:15:26 +01:00
Tim Besard
959dd247bf
Import CUDAdrv stuff through CuArrays.
2019-03-22 14:15:26 +01:00
Tim Besard
df509ce9f0
Adapt to the new CUDAdrv.CuPtr pointer type.
2019-03-22 14:15:26 +01:00
Mike J Innes
b637311642
Merge pull request #647 from oxinabox/ox/maxout
...
Add MaxOut layer
2019-03-22 12:18:53 +00:00
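A minimal sketch of the Maxout layer added in #647, assuming the constructor that takes a zero-argument layer builder and a count, then keeps the element-wise maximum over the inner layers' outputs:

```julia
using Flux

# Maxout applies several inner layers to the same input and returns the
# element-wise maximum of their outputs.
m = Maxout(() -> Dense(5, 7), 4)   # 4 Dense(5, 7) layers, max over their outputs

x = rand(Float32, 5, 3)            # 5 features, batch of 3
size(m(x))                         # (7, 3)
```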