thebhatman
8d78b437ff
Merge branch 'sf/zygote_updated' of https://github.com/thebhatman/Flux.jl
2019-07-08 18:47:17 +05:30
thebhatman
812541f8d6
zeros replaced by fill to avoid nothing grad
2019-07-06 19:41:03 +05:30
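Concretely, the substitution named in the message above looks something like this (the call site and size are made up for illustration; the commit reports that the `fill` form avoids a `nothing` gradient under Zygote):

```julia
out = 5                          # hypothetical size
h_before = zeros(Float32, out)   # the original call
h_after  = fill(0.0f0, out)      # the replacement: an equal array built with fill
h_before == h_after              # true
```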
thebhatman
3ee2a76f61
Removed .data from LSTMCell
2019-07-02 17:38:30 +05:30
thebhatman
b194e7e3a8
Callback being called now
2019-06-20 00:37:54 +05:30
Dhairya Gandhi
dd9cdbef14
remove unnecessary call to beta
2019-06-16 19:09:50 +05:30
Dhairya Gandhi
67f18663d9
pick beta from state in NADAM
2019-06-16 19:06:59 +05:30
thebhatman
7ab9d8ed3d
Minor update
2019-06-13 18:59:03 +05:30
thebhatman
ce11804dc1
CrossCor test passing, hopefully.
2019-06-13 01:21:58 +05:30
thebhatman
48ed93cdaa
Silly error in Dropout corrected.
2019-06-12 23:16:15 +05:30
thebhatman
e9797408ec
DepthwiseConv corrected again.
2019-06-12 23:01:51 +05:30
thebhatman
00a4f4c26d
Correcting Dropout
2019-06-12 22:39:30 +05:30
thebhatman
bd7e3b1f41
Dropout with dims test passing.
2019-06-12 22:16:11 +05:30
thebhatman
c7c0ee2cbc
Resolving Merge Conflicts
2019-06-12 21:34:42 +05:30
thebhatman
dfd2965e85
GroupNorm tests corrected
2019-06-11 22:32:54 +05:30
thebhatman
11073dcd25
GroupNorm made to use istraining()
2019-06-11 22:04:33 +05:30
thebhatman
ef63f80644
No ops defined for param and data
2019-06-10 18:24:18 +05:30
Mike J Innes
b98075817c
Merge branch 'master' into DenseBlock
2019-06-05 14:27:47 +01:00
ayush-1506
2161163a82
added crosscor
2019-05-14 02:52:28 -07:00
Bruno Hebling Vieira
796a2957c9
Added news and removed type annotation from SkipConnection structure
2019-05-13 16:33:31 -03:00
Bruno Hebling Vieira
e7d76b8423
Added the SkipConnection layer and constructor
...
Added missing export
Corrected channel placement
Dimension 4 cannot be assumed to always be the Channel dimension
Deprecation of `treelike`
Code now makes use of `@treelike` macro instead of the deprecated `treelike` function (it worked on my end because I'm on Julia 0.7, while Julia 1.0 deprecated stuff)
Update basic.jl
Renaming to SkipConnection
* Update Flux.jl
* Update basic.jl
Updated `SkipConnection` with a `connection` field
I'm pretty sure I broke something now, but this PR should follow along these lines: `cat` needs special treatment (the user can declare their own `concatenate` connection, but I foresee it's going to be used often, so we can simply define special treatment)
Forgot to remove some rebasing text
Forgot to remove some more rebasing text
Removed local copy and default cat method from the function calls
Adjusted some more types for inference, could improve on this as well
Re-placed some left-over spaces
2019-05-13 16:32:00 -03:00
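A minimal sketch of the layer described in the entry above, assuming `connection` is a two-argument combiner applied to the wrapped layers' output and the original input (names and field order here are my reading of the description, not necessarily the merged code):

```julia
using Flux

# "Skip" mirrors the SkipConnection described above
struct Skip
  layers        # the wrapped sub-model
  connection    # two-argument combiner, e.g. + or (a, b) -> cat(a, b, dims = 3)
end

# forward pass: combine the sub-model's output with the untouched input
(s::Skip)(x) = s.connection(s.layers(x), x)

# usage: a simple residual-style block
block = Skip(Chain(Dense(10, 10, relu), Dense(10, 10)), +)
y = block(rand(Float32, 10))
```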
bors[bot]
68ba6e4e2f
Merge #563
...
563: noise shape for dropout r=MikeInnes a=chengchingwen
I added the noise shape for dropout, similar to the `noise_shape` argument in [`tf.nn.dropout`](https://www.tensorflow.org/api_docs/python/tf/nn/dropout)
Co-authored-by: chengchingwen <adgjl5645@hotmail.com>
Co-authored-by: Peter <adgjl5645@hotmail.com>
2019-05-13 17:16:10 +00:00
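Roughly, the `dims` feature merged in #563 draws the dropout mask only along the requested dimensions and broadcasts it over the rest, so whole slices are kept or dropped together. A hedged sketch, not the merged implementation:

```julia
# mask size: keep the full extent along `dims`, collapse the rest to 1,
# so broadcasting drops the same entries across the collapsed dimensions
function mask_size(x, dims)
  dims isa Colon && return size(x)
  return ntuple(i -> i in dims ? size(x, i) : 1, ndims(x))
end

function my_dropout(x, p; dims = :)
  mask = rand(Float32, mask_size(x, dims)) .> p
  return x .* mask ./ (1 - p)    # rescale kept entries, as usual for dropout
end

x = rand(Float32, 4, 3)
my_dropout(x, 0.5)               # independent mask per element
my_dropout(x, 0.5, dims = 1)     # one 4×1 mask shared by every column
```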
chengchingwen
2fc2a5282c
Merge remote-tracking branch 'upstream/master' into drop_shape
2019-05-14 00:50:59 +08:00
Elliot Saba
2e6561bb6a
Change DepthwiseConv() to use in=>out instead of in=>mult.
...
This is an API change, but I think it makes more sense, and is more
consistent with our `Conv()` api.
2019-05-12 11:20:24 -07:00
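Under the new convention described above, a depthwise layer is written in terms of input and output channels; the channel multiplier is implied as out ÷ in. A usage sketch:

```julia
using Flux

# 3 input channels, 6 output channels (an implicit channel multiplier of 2)
layer = DepthwiseConv((3, 3), 3 => 6, relu)

x = rand(Float32, 32, 32, 3, 1)   # W × H × C × N
size(layer(x))                    # (30, 30, 6, 1) with default padding and stride
```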
chengchingwen
5c5140683c
make dims a field of Dropout
2019-05-10 23:45:50 +08:00
Mike J Innes
92ddc618f8
update for arrays
2019-05-02 18:57:52 -07:00
Mike J Innes
c70276ddfe
rm more deprecations
2019-05-02 18:57:52 -07:00
Mike J Innes
256695262c
rm optimiser deprecations
2019-05-02 18:54:01 -07:00
Mike J Innes
82ee61f5be
implement #643
2019-05-02 18:52:09 -07:00
Mike J Innes
c313be8e95
rm data/param
2019-05-02 18:52:09 -07:00
Mike J Innes
aa4d221f8c
break all the things
2019-05-02 18:50:52 -07:00
Avik Pal
a0be6fa837
Add missing activation function for batchnorm
2019-05-01 19:47:54 +05:30
Dhairya Gandhi
221670a2b1
Merge pull request #733 from thebhatman/expdecay-fix
...
Fixed ExpDecay
2019-05-01 18:58:37 +05:30
Dhairya Gandhi
9bbbd17e4b
Merge branch 'master' into onecold
2019-04-30 19:09:36 +05:30
Roger-luo
d63338c242
fix doctest
2019-04-26 18:12:14 +08:00
Mike J Innes
6c3a939133
Update src/onehot.jl
...
Co-Authored-By: Roger-luo <hiroger@qq.com>
2019-04-26 18:09:14 +08:00
Roger-luo
fabcd05ff2
add examples
2019-04-26 18:05:03 +08:00
Elliot Saba
732f97fe16
Split out conv_transpose_dims() so that Zygote can ignore it
2019-04-25 10:24:19 -07:00
Elliot Saba
6e22cd4931
Add asymmetric padding to convolutional layers
2019-04-25 09:55:23 -07:00
Elliot Saba
113ddc8760
Update Flux code for new NNlib branch
2019-04-25 09:55:23 -07:00
Hossein Pourbozorg
7f06b15f67
use https instead of http for web links
2019-04-25 11:04:03 +00:00
Jake Topping
ff7adda74b
Swap comma for full stop
...
"ERROR: LoadError: UndefVarError: G not defined" caused by "gn,G" rather than "gn.G" in line 386. Swapping for full stop should fix this
2019-04-22 17:08:36 +01:00
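For the record, the one-character difference the fix above is about (the struct here is an illustrative stand-in, not the actual GroupNorm code):

```julia
struct GN        # stand-in type with a field named G
  G::Int
end

gn = GN(4)

gn.G      # field access: returns 4
# gn, G   # parsed as the tuple (gn, G); throws UndefVarError because G is not defined
```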
Zachary P Christensen
83eb5a1df6
Fix typo in Maxout
2019-04-19 17:02:26 -04:00
thebhatman
31a50ab16a
Fixed ExpDecay
2019-04-11 17:28:06 +05:30
Mike J Innes
54d9229be9
Merge pull request #710 from johnnychen94/master
...
naive implementation of activations
2019-04-05 15:33:31 +01:00
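A rough sketch of what "activations" means here: collect every intermediate output as the input flows through a Chain (a naive version under that assumption, not necessarily the merged code; the name `chain_activations` is hypothetical):

```julia
using Flux

# return the intermediate output of every layer in a Chain
function chain_activations(c::Chain, x)
  outs = Any[]
  for layer in c.layers
    x = layer(x)
    push!(outs, x)
  end
  return outs
end

m = Chain(Dense(10, 5, relu), Dense(5, 2), softmax)
chain_activations(m, rand(Float32, 10))   # three intermediate results
```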
Johnny Chen
a300376f71
fix a typo in comment
...
`inplementation` --> `implementation`
2019-04-05 19:19:30 +08:00
JohnnyChen
3cafbbad02
simplify the implementation
2019-04-05 18:44:00 +08:00
JohnnyChen
de7a5f4024
correct the function behavior; support Any type
2019-04-05 18:16:44 +08:00
bors[bot]
bd9d73a941
Merge #655
...
655: Added support for Float64 for DepthwiseConv r=dhairyagandhi96 a=thebhatman
DepthwiseConv was giving errors for Float64. This fixes the issue.
Co-authored-by: Manjunath Bhat <manjunathbhat9920@gmail.com>
2019-04-04 17:25:52 +00:00
chengchingwen
261235311c
change dims to be the unbroadcasted dims and a keyword argument
2019-04-05 01:19:20 +08:00
Dhairya Gandhi
1963f30911
Merge pull request #726 from dhairyagandhi96/iris
...
use cached iris dataset
2019-04-04 22:46:21 +05:30