Commit Graph

2031 Commits

Author SHA1 Message Date
Viral B. Shah 5689b39538
Create FUNDING.yml 2019-06-26 17:51:54 -04:00
Mike J Innes e88440974b
Merge pull request #796 from dhairyagandhi96/nadam
Pick beta from the state - NADAM
2019-06-19 22:18:56 +01:00
Dhairya Gandhi dd9cdbef14 remove unnecessary call to beta 2019-06-16 19:09:50 +05:30
Dhairya Gandhi 67f18663d9 pick beta from state in NADAM 2019-06-16 19:06:59 +05:30
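
A minimal sketch of what "pick beta from the state" amounts to, assuming Flux's mutating optimiser interface; the struct layout and update rule here are illustrative, not a copy of the merged code. The point of the fix is that the running products β₁ᵗ and β₂ᵗ live in the per-parameter state and are advanced on every step, rather than being re-read from the optimiser's fixed fields.

```julia
mutable struct NADAM
  eta::Float64
  beta::Tuple{Float64,Float64}
  state::IdDict
end

function apply!(o::NADAM, x, Δ)
  η = o.eta
  β1, β2 = o.beta
  # Moment estimates plus the running β products, picked from the state;
  # this line is the illustrative heart of the fix.
  mt, vt, (β1p, β2p) = get!(o.state, x, (zero(x), zero(x), o.beta))
  @. mt = β1 * mt + (1 - β1) * Δ
  @. vt = β2 * vt + (1 - β2) * Δ^2
  @. Δ = (β1 * mt / (1 - β1 * β1p) + (1 - β1) * Δ / (1 - β1p)) /
         (√(vt * β2 / (1 - β2p)) + 1e-8) * η
  o.state[x] = (mt, vt, (β1p * β1, β2p * β2))  # advance β1^t, β2^t
  return Δ
end

w, g = rand(3), rand(3)
o = NADAM(0.001, (0.9, 0.999), IdDict())
apply!(o, w, copy(g))  # returns the scaled step for parameter w
```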
Dhairya Gandhi b47238eb74
Merge pull request #793 from amellnik/typos
Two minor typos in docs
2019-06-12 11:31:06 +05:30
Alex Mellnik e17999f19b Two minor typos 2019-06-11 22:09:59 -07:00
bors[bot] 1902c0e7c5 Merge #446
446: Added the SkipConnection layer and constructor r=MikeInnes a=bhvieira

I added a DenseBlock constructor, which allows one to train DenseNets (you can train ResNets and MixNets with this as well; you only need to change the connection, which is concatenation for DenseNets).

Disclaimer: I created the block for a 3D U-Net, so the assumption here is that whatever layer is inside the block, its output has the same spatial dimensions (i.e. all array dimensions excluding the channel and minibatch dimensions) as the input; otherwise the connection wouldn't match. I'm not sure this matches the topology of every DenseNet out there, but I suppose this is a good starting point.

No tests yet; I will add them as the PR evolves.

I'm open to suggestions! :)


Co-authored-by: Bruno Hebling Vieira <bruno.hebling.vieira@usp.br>
Co-authored-by: Mike J Innes <mike.j.innes@gmail.com>
2019-06-05 13:28:41 +00:00
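
A minimal usage sketch of the layer merged here, assuming the `SkipConnection(layer, connection)` form described in the thread, where `connection` is called on the layer's output and the original input; the sizes and the `relu` are illustrative.

```julia
using Flux

# DenseNet-style block: concatenate the block's output with its input along
# the channel dimension. pad = 1 keeps the spatial dims equal, as the
# disclaimer above requires.
block = SkipConnection(
  Conv((3, 3), 16 => 16, relu, pad = 1),
  (mx, x) -> cat(mx, x, dims = 3),
)

x = rand(Float32, 32, 32, 16, 1)  # W × H × C × N
size(block(x))                    # (32, 32, 32, 1): channels doubled by the concat
```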
Mike J Innes b98075817c
Merge branch 'master' into DenseBlock 2019-06-05 14:27:47 +01:00
bors[bot] 8ee6af1bee Merge #762
762: CrossCor layer r=avik-pal a=ayush-1506

Same as #423 (which could not be edited, since I lost access to that GitHub account).

Co-authored-by: ayush-1506 <ayush.shridhar1506@gmail.com>
2019-05-14 10:36:22 +00:00
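
A minimal usage sketch, assuming CrossCor mirrors the `Conv` constructor (the point of the PR); the sizes are illustrative.

```julia
using Flux

# Same argument shape as Conv, but computes true cross-correlation
# (no kernel flipping).
layer = CrossCor((3, 3), 1 => 8, relu)
x = rand(Float32, 28, 28, 1, 4)  # W × H × C × N
size(layer(x))                   # (26, 26, 8, 4): no padding, stride 1
```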
ayush-1506 98a027a505 typo 2019-05-14 02:56:12 -07:00
ayush-1506 bfc5bb0079 rebase 2019-05-14 02:53:48 -07:00
ayush-1506 f263f0c8ed add to layer docs 2019-05-14 02:53:06 -07:00
ayush-1506 0a2e288c3f another small test 2019-05-14 02:53:06 -07:00
ayush-1506 2161163a82 added crosscor 2019-05-14 02:52:28 -07:00
ayush-1506 451b80da3d add to layer docs 2019-05-14 02:50:18 -07:00
ayush-1506 7c28f7f883 Merge branch 'crosscor' of https://github.com/ayush-1506/Flux.jl into crosscor 2019-05-14 02:47:28 -07:00
Bruno Hebling Vieira 6b3cd825b9 Added SkipConnection to docs tentatively in Other General Purpose Layers 2019-05-13 16:43:14 -03:00
Bruno Hebling Vieira 796a2957c9 Added news and removed type annotation from SkipConnection structure 2019-05-13 16:33:31 -03:00
Bruno Hebling Vieira c5fc2fb9a3 Added tests 2019-05-13 16:32:00 -03:00
Bruno Hebling Vieira e7d76b8423 Added the SkipConnection layer and constructor
Added missing export

Corrected channel placement

Dimension 4 cannot be assumed to always be the Channel dimension

Deprecation of `treelike`

Code now makes use of the `@treelike` macro instead of the deprecated `treelike` function (it worked on my end because I'm on Julia 0.7, while Julia 1.0 removed the deprecated form)

Update basic.jl

Renaming to SkipConnection

* Update Flux.jl

* Update basic.jl

Updated `SkipConnection` with a `connection` field

I'm pretty sure I broke something now, but this PR should follow along these lines: `cat` needs special treatment (the user can declare their own `concatenate` connection, but I foresee it's going to be used often, so we can simply define special treatment for it)

Forgot to remove some rebasing text

Forgot to remove some more rebasing text

Removed local copy and default cat method from the function calls

Adjusted some more types for inference, could improve on this as well

Re-placed some left-over spaces
2019-05-13 16:32:00 -03:00
Dhairya Gandhi 308b199bd0
Merge pull request #774 from zsz00/patch-1
typo of comvolutional in NEWS.md
2019-05-14 00:37:17 +05:30
zy a27be0f9ec
typo of comvolutional
comvolutional  -> convolutional
2019-05-14 01:24:45 +08:00
bors[bot] 68ba6e4e2f Merge #563
563: noise shape for dropout r=MikeInnes a=chengchingwen

I added the noise shape for dropout, similar to the `noise_shape` argument in [`tf.nn.dropout`](https://www.tensorflow.org/api_docs/python/tf/nn/dropout).

Co-authored-by: chengchingwen <adgjl5645@hotmail.com>
Co-authored-by: Peter <adgjl5645@hotmail.com>
2019-05-13 17:16:10 +00:00
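
A minimal sketch of the feature, assuming the `dims` argument that the follow-up commit "make dims a field of Dropout" adds; whether `dims` is positional or a keyword has varied across Flux versions, so treat the call below as illustrative.

```julia
using Flux

# With dims = 3, one keep/drop draw is made per channel index and broadcast
# over width, height and batch — whole feature maps are kept or zeroed,
# analogous to passing a singleton noise_shape to tf.nn.dropout.
d = Dropout(0.5, dims = 3)
x = rand(Float32, 8, 8, 4, 2)
d(x)  # in train mode, each of the 4 channels is either scaled or all-zero
```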
Peter 9c1bb93aa3
Update NEWS.md
Co-Authored-By: Mike J Innes <mike.j.innes@gmail.com>
2019-05-14 01:12:59 +08:00
chengchingwen bdf74fe342 update NEWS 2019-05-14 00:57:42 +08:00
chengchingwen 2fc2a5282c Merge remote-tracking branch 'upstream/master' into drop_shape 2019-05-14 00:50:59 +08:00
bors[bot] 16fc41cd00 Merge #756
756: Change `DepthwiseConv()` to use `in=>out` instead of `in=>mult`. r=MikeInnes a=staticfloat

This is an API change, but I think it makes more sense and is more consistent with our `Conv()` API. This also dumps the `DepthwiseConv((3,3), C_in)` API, as I'm not sure why you would want to specify only the input channel count and default the output to a channel multiplier of 1; if anything, I would think you'd want to specify the output channels and leave the input to the default. In any case, I think consistency with `Conv()` is the best thing to chase after here.

Co-authored-by: Elliot Saba <staticfloat@gmail.com>
2019-05-13 16:37:57 +00:00
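
A minimal before/after sketch of the API change, with illustrative sizes: `out` must now be a multiple of `in`, giving `out ÷ in` filters per input channel.

```julia
using Flux

# Old form (removed by this PR): DepthwiseConv((3, 3), 3 => 2) meant
# 3 input channels with a channel multiplier of 2.
# New form, consistent with Conv: in => out directly.
layer = DepthwiseConv((3, 3), 3 => 6, relu)  # 3 inputs, 6 outputs (2 per channel)
x = rand(Float32, 32, 32, 3, 1)
size(layer(x))                               # (30, 30, 6, 1)
```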
Mike J Innes 5931b93e09
Merge pull request #772 from johnnychen94/patch-1
delete redundant section
2019-05-13 17:33:01 +01:00
Elliot Saba 06da965301 Add `NEWS.md` entry for https://github.com/FluxML/Flux.jl/pull/756 2019-05-12 11:20:41 -07:00
Elliot Saba 48fcc66094 Remove vestigial testing `println()` 2019-05-12 11:20:24 -07:00
Elliot Saba 2e6561bb6a Change `DepthwiseConv()` to use `in=>out` instead of `in=>mult`.
This is an API change, but I think it makes more sense, and is more
consistent with our `Conv()` API.
2019-05-12 11:20:24 -07:00
Johnny Chen 7103a61a1f
delete redundant section 2019-05-11 12:40:01 +08:00
chengchingwen 5c5140683c make dims a field of Dropout 2019-05-10 23:45:50 +08:00
ayush-1506 99d07e67db another small test 2019-05-09 16:43:28 +05:30
ayush-1506 9a3aa18c17 conflicts 2019-05-08 11:56:46 +05:30
Tejan Karmali 79534caca1
Merge pull request #701 from jw3126/test700
Add tests for quadratic Conv (#700)
2019-05-08 11:09:38 +05:30
Viral B. Shah 7c897394dd
Create CITATION.bib 2019-05-04 18:49:19 -04:00
Jan Weidner e96a9d7eaf Switch broken #700 test to pass 2019-05-03 22:36:32 +02:00
Jan Weidner 73c5d9f25c fix 2019-05-03 22:22:52 +02:00
Jan Weidner 27a9a7b9cf add broken test for #700 2019-05-03 22:22:52 +02:00
ayush-1506 20b79e0bdf added crosscor 2019-05-01 22:29:00 +05:30
bors[bot] e991228047 Merge #761
761: Fixes #760 r=MikeInnes a=avik-pal



Co-authored-by: Avik Pal <avikpal@iitk.ac.in>
2019-05-01 14:23:08 +00:00
Avik Pal a0be6fa837
Add missing activation function for batchnorm 2019-05-01 19:47:54 +05:30
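
A minimal sketch of the fixed behaviour, assuming the `BatchNorm(channels, activation)` constructor from Flux's docs; the fix makes sure the activation is actually applied to the normalised output.

```julia
using Flux

bn = BatchNorm(8, relu)         # normalise 8 channels, then apply relu
x = randn(Float32, 4, 4, 8, 2)
all(bn(x) .>= 0)                # true: the relu is no longer skipped
```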
Dhairya Gandhi 8355d57c79
Merge pull request #759 from dhairyagandhi96/tag_083
bump version to v0.8.3
2019-05-01 18:59:36 +05:30
Dhairya Gandhi 221670a2b1
Merge pull request #733 from thebhatman/expdecay-fix
Fixed ExpDecay
2019-05-01 18:58:37 +05:30
thebhatman 5ffc3b2d40 Comparing decay steps with expected true decay steps 2019-05-02 00:12:14 +05:30
thebhatman 5e06d8bb76 Test for decay_step 2019-05-01 23:10:00 +05:30
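
A minimal sketch of the semantics these tests pin down, assuming Flux's documented `ExpDecay(η, decay, decay_step, clip)` constructor: the rate is scaled by `decay` every `decay_step` steps until it reaches `clip`.

```julia
using Flux: ExpDecay

o = ExpDecay(0.1, 0.5, 5, 1e-4)  # η, decay, decay_step, clip
# the step scale starts at 0.1, halves every 5 steps (0.05, 0.025, ...),
# and never drops below the 1e-4 clip
```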
Dhairya Gandhi eff600642a
Merge pull request #612 from dhairyagandhi96/onecold
Fixes OneHotMatrix/Vector GPU Performance
2019-04-30 19:40:19 +05:30
Dhairya Gandhi 9bbbd17e4b
Merge branch 'master' into onecold 2019-04-30 19:09:36 +05:30
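
For reference, a sketch of the `onehot`/`onecold` API whose GPU path this PR speeds up; the CPU calls below only illustrate the semantics.

```julia
using Flux: onehot, onehotbatch, onecold

v = onehot(:b, [:a, :b, :c])             # OneHotVector, true at index 2
onecold(v, [:a, :b, :c])                 # :b — onecold inverts onehot
M = onehotbatch([:a, :c], [:a, :b, :c])  # 3×2 OneHotMatrix
onecold(M, [:a, :b, :c])                 # [:a, :c]
```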
Dhairya Gandhi 3d5b76c0df bump version to v0.8.3 2019-04-29 22:01:46 +05:30