Commit Graph

462 Commits

Author SHA1 Message Date
janEbert
2f955a33cd src/layers/stateless.jl: Add missing docstrings 2020-04-04 17:36:23 +02:00
AzamatB
85a9493722 Fix typo in the docstrings of AlphaDropout 2020-03-14 15:42:00 +06:00
AzamatB
f0d866b2fd fix typo in the Dropout docs 2020-03-10 12:44:19 +06:00
Garben Tanghe
fc3af681ec updated documentation 2020-03-08 14:22:09 +01:00
Garben Tanghe
746e3310f1 removed Flatten struct; updated documentation 2020-03-08 14:22:03 +01:00
Garben Tanghe
82e16a5b29 split up Flatten layer to use the flatten function 2020-03-08 14:21:59 +01:00
Garben Tanghe
3e14bd878c added GlobalMaxPool, GlobalMeanPool, and Flatten layers 2020-03-08 14:18:48 +01:00
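Taken together, these commits suggest a pipeline like the following (a minimal sketch, assuming the `GlobalMeanPool` and `flatten` API introduced here):
```julia
using Flux

m = Chain(
    Conv((3, 3), 3 => 16),  # expects WHCN input, e.g. 32×32×3×8
    GlobalMeanPool(),       # collapses spatial dims -> 1×1×16×8
    flatten,                # reshapes to (features, batch) -> 16×8
    Dense(16, 10))

x = rand(Float32, 32, 32, 3, 8)
size(m(x))  # (10, 8)
```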
Dhairya Gandhi
5a4f1932a6 closes #1071 2020-03-04 17:22:45 +05:30
bors[bot]
94ba1e8ede Merge #1028 #1070
1028: Common questions answered in docs r=CarloLucibello a=dhairyagandhi96

cc @MikeInnes 

1070: Prevent breakage due to new `active` field in normalise layers r=CarloLucibello a=ianshmean

Prevents breakage where normalise structs such as `BatchNorm` have been defined directly but are missing the new `active` field

cc @darsnack 

Co-authored-by: Dhairya Gandhi <dhairya@juliacopmuting.com>
Co-authored-by: Dhairya Gandhi <dhairya@juliacomputing.com>
Co-authored-by: Ian <i.r.butterworth@gmail.com>
2020-03-04 00:10:39 +00:00
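A rough sketch of the pattern #1070 guards against breaking (field and helper names here are illustrative assumptions, not Flux's exact internals): normalisation layers gained an `active` field that defaults to `nothing`, so structs built with older constructors keep working and a `testmode!`-style call can force either mode.
```julia
# Illustrative only: `MyNorm`, `set_testmode!`, and this `istraining`
# stub are made-up stand-ins, not Flux's exact internals.
istraining() = false  # Flux would infer this from the AD context

mutable struct MyNorm
    γ
    β
    active::Union{Bool,Nothing}  # nothing = decide from training context
end
MyNorm(γ, β) = MyNorm(γ, β, nothing)  # older two-argument construction keeps working

# testmode!-style switch; `:auto` restores context-dependent behaviour.
function set_testmode!(m::MyNorm, mode = true)
    m.active = mode === :auto ? nothing : !mode
    return m  # returning the model (cf. 5cbd2cecf2 below) makes calls chainable
end

isactive(m::MyNorm) = m.active === nothing ? istraining() : m.active
```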
bors[bot]
af23a5756c Merge #1053
1053: Added Some Loss functions with some doc improvements r=CarloLucibello a=AdarshKumar712

Added the following loss functions with tests:
1. mae
2. mean squared logarithmic error
3. huber loss
4. squared hinge loss
5. dice coeff loss
6. tversky loss 

Also added some documentation improvements for a few other functions. 

Co-authored-by: Adarsh Kumar <45385384+AdarshKumar712@users.noreply.github.com>
2020-03-03 23:56:21 +00:00
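A hedged usage sketch for the losses merged here, assuming they are reachable under the `Flux` module with these names:
```julia
using Flux

ŷ = [0.9, 0.2, 0.7]
y = [1.0, 0.0, 1.0]

Flux.mae(ŷ, y)              # mean absolute error
Flux.msle(ŷ, y)             # mean squared logarithmic error
Flux.huber_loss(ŷ, y)       # quadratic near zero, linear in the tails
Flux.dice_coeff_loss(ŷ, y)  # 1 - Dice coefficient, common in segmentation
Flux.tversky_loss(ŷ, y)     # Dice-like loss with a tunable FP/FN trade-off
Flux.squared_hinge(ŷ, [1.0, -1.0, 1.0])  # hinge-style losses expect ±1 targets
```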
Ian
61f66e3dcd remove unnecessary helper for AlphaDropout 2020-03-03 13:20:02 -05:00
Ian
d63fcf2cb4 add deprecation reminder 2020-03-03 13:05:03 -05:00
Ian
d9ea5fba76 add active helpers for other normalise layers 2020-03-03 11:55:39 -05:00
Ian
0def352383 Prevent breakage due to new active field in BatchNorm 2020-03-03 11:49:34 -05:00
Adarsh Kumar
6e5c18bddf Updated loss functions 2020-03-03 16:02:57 +05:30
Adarsh Kumar
2f05094068 Added consistency with ŷ and unicode chars 2020-03-02 20:00:47 +05:30
Adarsh Kumar
f9e31a020c Updated huber_loss with other minute changes 2020-03-02 13:25:23 +05:30
Kyle Daruwalla
4cebf36361 Merge branch 'master' into feature/istraining 2020-03-01 12:32:15 -06:00
Adarsh Kumar
08dabce57e Updated loss function docs 2020-03-01 12:00:11 +05:30
Adarsh Kumar
57c1b67d08 Merge branch 'master' into patch-1 2020-03-01 11:49:33 +05:30
Kyle Daruwalla
5cbd2cecf2 Changed testmode! to return model 2020-02-29 16:09:59 -06:00
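Since `testmode!` now returns the model, it can be used inline; a small sketch, assuming it applies recursively to a `Chain`:
```julia
using Flux

# The model comes back from the call, so it composes with other expressions.
m = testmode!(Chain(Dense(4, 4), BatchNorm(4)))
```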
Carlo Lucibello
425fcdbe69 NNlib docs + misc docs improvements 2020-02-29 11:14:48 +01:00
Adarsh Kumar
8afed01345 Apply suggestions from code review
Co-Authored-By: David Lung <lungd@users.noreply.github.com>
2020-02-27 23:23:53 +05:30
Adarsh Kumar
9dce623214 Updated MSLE loss 2020-02-27 16:26:17 +05:30
Adarsh Kumar
980ce72914 Added tversky and dice loss 2020-02-27 02:00:28 +05:30
bors[bot]
55616afc11 Merge #960
960: Added utility function outdims to compute output dimensions of a layer r=dhairyagandhi96 a=darsnack

Based on Slack chatter, I added a utility function, `outdims`, that computes the output dimensions for given input dimensions.

Example
```julia
layer = Conv((3, 3), 3 => 16)
outdims(layer, (10, 10)) # returns (8, 8)
```

Co-authored-by: Kyle Daruwalla <daruwalla@wisc.edu>
2020-02-25 17:40:05 +00:00
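Per the `outdims` commits further down this log (b4ed16ad9c through 0cdd11c0dc), the helper presumably covers basic and pooling layers as well as `Conv`; a hedged usage sketch, assuming `outdims` is exported and those methods exist:
```julia
using Flux

outdims(Dense(10, 5), (10,))              # (5,)
outdims(Conv((3, 3), 3 => 16), (10, 10))  # (8, 8), as in the PR example
outdims(MaxPool((2, 2)), (8, 8))          # (4, 4)
```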
Bulat Suleymanov
db4eaf254b Edit description of convolutional layer 2020-02-24 13:16:51 +05:00
Kyle Daruwalla
924b8f49ec Updated to place function definitions in the appropriate places. 2020-02-21 15:10:28 -06:00
Kyle Daruwalla
7c12af065a Added testmode! functionality back to normalization layers. 2020-02-21 14:35:10 -06:00
Marco
ae0455517a Remove outdated reference to truncate! 2020-02-10 00:03:11 -08:00
Adarsh Kumar
7710bb0b4b Removed spurious promotions 2020-02-06 01:06:41 +05:30
Adarsh Kumar
b5184553d4 Error correction in mae 2020-02-05 23:32:55 +05:30
Adarsh Kumar
643086c8db Updated squared_hinge 2020-02-05 22:40:07 +05:30
Adarsh Kumar
7ac647a7ac Added loss functions 2020-02-05 22:29:15 +05:30
bors[bot]
d1edd9b16d Merge #680
680: Added new loss functions. r=thebhatman a=thebhatman

I have added the KL divergence, Poisson, log-cosh, and hinge loss functions.

Co-authored-by: Manjunath Bhat <manjunathbhat9920@gmail.com>
Co-authored-by: thebhatman <manjunathbhat9920@gmail.com>
2020-01-13 15:46:25 +00:00
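A hedged usage sketch (assuming the merged functions are reachable as `Flux.kldivergence`, `Flux.poisson`, and `Flux.hinge`; the log-cosh variant's final name is not visible in this log):
```julia
using Flux

ŷ = [0.6, 0.3, 0.1]  # predicted distribution
y = [0.7, 0.2, 0.1]  # target distribution

Flux.kldivergence(ŷ, y)           # KL divergence between distributions
Flux.poisson(ŷ, y)                # Poisson loss for count-like targets
Flux.hinge([0.8, -0.6], [1, -1])  # hinge loss, targets in {-1, 1}
```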
Kyle Daruwalla
0cdd11c0dc Added tests for varying padding, stride, and dilation with outdims. 2019-12-07 14:05:50 -06:00
Kyle Daruwalla
a64378b112 Switched to using NNlib for conv.jl outdims. 2019-12-07 13:21:26 -06:00
Kyle Daruwalla
6265b1fa39 Added tests for outdims 2019-12-05 22:54:25 -06:00
Kyle Daruwalla
31dda0ce6c Updated with all basic and conv layers outdims 2019-12-05 21:57:10 -06:00
Kyle Daruwalla
b4ed16ad9c Added outdims for some basic layers 2019-12-03 22:48:48 -06:00
bors[bot]
fb4a48f970 Merge #943
943: Fixes #900 r=MikeInnes a=dhairyagandhi96

Thoughts on the test?

cc @MikeInnes

Co-authored-by: Dhairya Gandhi <dhairya@juliacopmuting.com>
2019-11-26 15:09:27 +00:00
Dhairya Gandhi
59bb0d81b0 add TODO 2019-11-26 16:23:09 +05:30
Dhairya Gandhi
5f21238d1a no grad dims helper 2019-11-24 13:25:02 +05:30
matsueushi
a0314ce682 Fix logitbinarycrossentropy on CuArrays 2019-11-22 05:23:24 +00:00
dsweber2
dea29532ef Merge branch 'master' into activations 2019-11-15 17:19:43 -08:00
dsweber2
20eb840882 keeping activations separate 2019-11-15 12:03:08 -08:00
dsweber2
58c794702d simpler test 2019-11-14 14:05:53 -08:00
dsweber2
0fe3ac4e77 bring activations into function call 2019-11-14 13:40:52 -08:00
dsweber2
6475f6a43e recursive way of doing activations 2019-11-14 13:40:52 -08:00
dsweber2
99679f7e16 deal with empty Chain 2019-11-14 13:40:52 -08:00
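The activations commits above build up a helper that records every intermediate output of a `Chain`. A minimal usage sketch, assuming the helper is reachable as `Flux.activations`:
```julia
using Flux

c = Chain(Dense(3, 5, relu), Dense(5, 2), softmax)
x = rand(Float32, 3)

acts = Flux.activations(c, x)  # one entry per layer
length(acts)       # 3
acts[end] == c(x)  # true: the last activation is the chain's output
```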