Commit Graph

1687 Commits

Author SHA1 Message Date
Garben Tanghe
82e16a5b29 split up Flatten layer to use the flatten function 2020-03-08 14:21:59 +01:00
Garben Tanghe
3e14bd878c added GlobalMaxPool, GlobalMeanPool, and Flatten layers 2020-03-08 14:18:48 +01:00
Dhairya Gandhi
d8e44fcc1c correct broadcasting for addition 2020-03-04 18:22:45 +05:30
Dhairya Gandhi
7e308e77fd rm unnecessary fns 2020-03-04 17:57:16 +05:30
Dhairya Gandhi
5a4f1932a6 closes #1071 2020-03-04 17:22:45 +05:30
bors[bot]
94ba1e8ede
Merge #1028 #1070
1028: Common questions answered in docs r=CarloLucibello a=dhairyagandhi96

cc @MikeInnes 

1070: Prevent breakage due to new `active` field in normalise layers r=CarloLucibello a=ianshmean

Prevents breakage where normalise structs such as `BatchNorm` have been directly constructed without the new `active` field
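
A minimal sketch of the backward-compatibility idea, using illustrative names rather than the PR's actual code:

```julia
istraining() = false  # stand-in for Flux's training-mode check

struct BatchNormLike            # illustrative; real layers carry more fields
    active::Union{Bool,Nothing}
end

# `nothing` means "decide from training mode", so layers constructed
# without an explicit `active` value keep working.
_isactive(m) = m.active === nothing ? istraining() : m.active

@assert !_isactive(BatchNormLike(nothing))
@assert _isactive(BatchNormLike(true))
```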

cc. @darsnack 

Co-authored-by: Dhairya Gandhi <dhairya@juliacopmuting.com>
Co-authored-by: Dhairya Gandhi <dhairya@juliacomputing.com>
Co-authored-by: Ian <i.r.butterworth@gmail.com>
2020-03-04 00:10:39 +00:00
bors[bot]
af23a5756c
Merge #1053
1053: Added Some Loss functions with some doc improvements r=CarloLucibello a=AdarshKumar712

Added the following loss functions with tests:
1. mae
2. mean squared logarithmic error
3. huber loss
4. squared hinge loss
5. dice coeff loss
6. tversky loss 

Also added some documentation improvements for a few other functions.
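
For illustration, hedged sketches of two of the listed losses (the merged definitions may differ in argument order and edge-case handling):

```julia
using Statistics: mean

mae(ŷ, y) = mean(abs.(ŷ .- y))

function huber_loss(ŷ, y; δ = 1.0)
    d = abs.(ŷ .- y)
    # quadratic near zero, linear beyond δ
    mean(ifelse.(d .<= δ, 0.5 .* d .^ 2, δ .* (d .- 0.5δ)))
end
```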

Co-authored-by: Adarsh Kumar <45385384+AdarshKumar712@users.noreply.github.com>
2020-03-03 23:56:21 +00:00
Ian
61f66e3dcd remove unnecessary helper for AlphaDropout 2020-03-03 13:20:02 -05:00
Ian
d63fcf2cb4 add deprecation reminder 2020-03-03 13:05:03 -05:00
Ian
d9ea5fba76 add active helpers for other normalise layers 2020-03-03 11:55:39 -05:00
Ian
0def352383 Prevent breakage due to new active field in BatchNorm 2020-03-03 11:49:34 -05:00
Adarsh Kumar
6e5c18bddf
Updated loss functions 2020-03-03 16:02:57 +05:30
bors[bot]
4acc907723
Merge #1065
1065: update documenter r=CarloLucibello a=CarloLucibello



Co-authored-by: CarloLucibello <carlo.lucibello@gmail.com>
2020-03-03 07:20:03 +00:00
bors[bot]
df73b8b8fb
Merge #1064
1064: Include cuda/cuda.jl during precompilation? r=CarloLucibello a=ianshmean

Loading `cuda/cuda.jl` at run time during `__init__()` seems to be causing issues with PackageCompiler (see the error at the bottom).

I'm wondering whether the cost of loading `cuda/cuda.jl` is negligible enough to just do it in all cases and get it precompiled. Setting `Flux.use_cuda[]` would still be used for switching cuda on or off.

Load times on Julia 1.3.1 on my Mac (without cuda):

This PR:
```
julia> @time using Flux
[ Info: Precompiling Flux [587475ba-b771-5e3f-ad9e-33799f191a9c]
[ Info: CUDAdrv.jl failed to initialize, GPU functionality unavailable (set JULIA_CUDA_SILENT or JULIA_CUDA_VERBOSE to silence or expand this message)
 37.313982 seconds (56.30 M allocations: 2.822 GiB, 2.52% gc time)
...
julia> @time using Flux
[ Info: CUDAdrv.jl failed to initialize, GPU functionality unavailable (set JULIA_CUDA_SILENT or JULIA_CUDA_VERBOSE to silence or expand this message)
 22.111054 seconds (52.93 M allocations: 2.663 GiB, 3.99% gc time)
```
Master:
```
julia> @time using Flux
[ Info: Precompiling Flux [587475ba-b771-5e3f-ad9e-33799f191a9c]
[ Info: CUDAdrv.jl failed to initialize, GPU functionality unavailable (set JULIA_CUDA_SILENT or JULIA_CUDA_VERBOSE to silence or expand this message)
 35.750143 seconds (53.73 M allocations: 2.698 GiB, 2.51% gc time)
...
julia> @time using Flux
[ Info: CUDAdrv.jl failed to initialize, GPU functionality unavailable (set JULIA_CUDA_SILENT or JULIA_CUDA_VERBOSE to silence or expand this message)
 26.267999 seconds (52.92 M allocations: 2.660 GiB, 3.67% gc time)
```


I didn't make `include("cuda/cuda.jl")` dependent on `CuArrays.functional()` because there could be a case where, say, a user doesn't have cuda installed, loads Flux, installs cuda, and reloads Flux, and the second time the package isn't re-precompiled.
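
For reference, a stripped-down sketch of the strategy taken here, with an illustrative module rather than the real Flux source:

```julia
module FluxLike

const use_cuda = Ref(false)

# This PR: include the CUDA code unconditionally at top level, so it is
# compiled into the precompile image (commented out so the sketch runs):
# include("cuda/cuda.jl")

function __init__()
    # ...and only toggle GPU usage at run time, instead of `include`-ing a
    # file here (the master behaviour that trips up PackageCompiler).
    use_cuda[] = false  # would be set from a CUDAdrv availability check
end

end # module
```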

The PackageCompiler error, which doesn't happen every time, suggests that the run-time loading of cuda.jl may be introducing dependency-tracking issues (?):
```
┌ Warning: Package Zygote does not have InteractiveUtils in its dependencies:
│ - If you have Zygote checked out for development and have
│   added InteractiveUtils as a dependency but haven't updated your primary
│   environment's manifest file, try `Pkg.resolve()`.
│ - Otherwise you may need to report an issue with Zygote
└ Loading InteractiveUtils into Zygote from project dependency, future warnings for Zygote are suppressed.
fatal: error thrown and no exception handler available.
#<null>
require at ./loading.jl:905
_jl_invoke at /home/ian/Documents/julia-kf-31156/src/gf.c:2161 [inlined]
jl_apply_generic at /home/ian/Documents/julia-kf-31156/src/gf.c:2328
jl_apply at /home/ian/Documents/julia-kf-31156/src/julia.h:1695 [inlined]
call_require at /home/ian/Documents/julia-kf-31156/src/toplevel.c:399 [inlined]
eval_import_path at /home/ian/Documents/julia-kf-31156/src/toplevel.c:436
eval_import_from at /home/ian/Documents/julia-kf-31156/src/toplevel.c:557
jl_toplevel_eval_flex at /home/ian/Documents/julia-kf-31156/src/toplevel.c:646
jl_eval_module_expr at /home/ian/Documents/julia-kf-31156/src/toplevel.c:181
jl_toplevel_eval_flex at /home/ian/Documents/julia-kf-31156/src/toplevel.c:640
jl_parse_eval_all at /home/ian/Documents/julia-kf-31156/src/ast.c:907
jl_load_rewrite at /home/ian/Documents/julia-kf-31156/src/toplevel.c:872
include at ./Base.jl:380
include at ./Base.jl:368 [inlined]
include at /home/ian/.julia/packages/Flux/p8ZLv/src/Flux.jl:1 [inlined]
__init__ at /home/ian/.julia/packages/Flux/p8ZLv/src/Flux.jl:56
jfptr___init___22072 at /home/ian/Documents/MyPackage.jl/dev/compilation/MyPackageSysImage.so (unknown line)
_jl_invoke at /home/ian/Documents/julia-kf-31156/src/gf.c:2161 [inlined]
jl_apply_generic at /home/ian/Documents/julia-kf-31156/src/gf.c:2328
jl_apply at /home/ian/Documents/julia-kf-31156/src/julia.h:1695 [inlined]
jl_module_run_initializer at /home/ian/Documents/julia-kf-31156/src/toplevel.c:74
_julia_init at /home/ian/Documents/julia-kf-31156/src/init.c:788
unknown function (ip: 0x5594b1667f)
__libc_start_main at /lib/aarch64-linux-gnu/libc.so.6 (unknown line)
unknown function (ip: 0x5594b16733)
unknown function (ip: 0x5594b16733)
```

Co-authored-by: Ian <i.r.butterworth@gmail.com>
2020-03-03 07:07:54 +00:00
CarloLucibello
af99ca27ee docs update 2020-03-03 07:52:20 +01:00
Adarsh Kumar
2f05094068
Added consistency with ŷ and unicode chars 2020-03-02 20:00:47 +05:30
Adarsh Kumar
f9e31a020c
Updated huber_loss with other minute changes 2020-03-02 13:25:23 +05:30
bors[bot]
be38146ee9
Merge #1061
1061: fix a few typos in docstrings r=CarloLucibello a=visr



Co-authored-by: Martijn Visser <mgvisser@gmail.com>
2020-03-02 01:03:58 +00:00
Ian
7555e488c6 tweaks 2020-03-01 19:40:03 -05:00
Ian
9b2f4919ee include cuda/cuda.jl during precompile, even if cuda isn't detected 2020-03-01 19:33:23 -05:00
bors[bot]
3cf131b8de
Merge #1062
1062: docstring ensure signature code formatting r=CarloLucibello a=visr

by using a four space indent instead of two

Fixes issues seen here:

![image](https://user-images.githubusercontent.com/4471859/75627427-54aa6600-5bd0-11ea-93d3-92901d44db59.png)

Where the type signature has no code formatting, and a code block is introduced that throws off the rest of the formatting.
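
For illustration, a docstring following the convention this PR enforces (the function itself is made up):

```julia
"""
    myfunc(x, y)

Add `x` and `y`. The signature line above is indented four spaces, so
Documenter renders it as a code block.
"""
myfunc(x, y) = x + y
```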

Co-authored-by: Martijn Visser <mgvisser@gmail.com>
2020-03-01 22:28:10 +00:00
Kyle Daruwalla
23f791e32b Add "during X phase" phrasing to testmode!/trainmode! docstring. 2020-03-01 12:49:30 -06:00
Kyle Daruwalla
35e460b044 Fixed broken @ref in docstring 2020-03-01 12:44:36 -06:00
Kyle Daruwalla
4cebf36361
Merge branch 'master' into feature/istraining 2020-03-01 12:32:15 -06:00
Kyle Daruwalla
c001d0f3c5 Added trainmode! and updated docs with warning 2020-03-01 12:30:41 -06:00
Martijn Visser
d67a2e40b3 remove stray code block start from docstring 2020-03-01 15:20:40 +01:00
Martijn Visser
f4365dab94 fix docstring example indentation as well 2020-03-01 15:19:22 +01:00
Martijn Visser
32e0aa9fcb docstring ensure signature code formatting
by using a four space indent instead of two
2020-03-01 15:15:39 +01:00
Martijn Visser
6076847a45 fix a few typos in docstrings 2020-03-01 15:07:12 +01:00
Adarsh Kumar
08dabce57e
Updated loss function docs 2020-03-01 12:00:11 +05:30
Adarsh Kumar
57c1b67d08
Merge branch 'master' into patch-1 2020-03-01 11:49:33 +05:30
Kyle Daruwalla
5cbd2cecf2 Changed testmode! to return model 2020-02-29 16:09:59 -06:00
CarloLucibello
a72258ea2a fix rebase 2020-02-29 18:55:49 +01:00
CarloLucibello
97141e8c98 improve docstring 2020-02-29 18:51:00 +01:00
CarloLucibello
487002878e restrict train! special casing 2020-02-29 18:51:00 +01:00
CarloLucibello
b6c79b38b4 add DataLoader
special case train! for the unsupervised data iterator
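
A usage sketch of the new type, assuming the original positional `DataLoader(X, y; ...)` constructor:

```julia
using Flux.Data: DataLoader

X = rand(Float32, 10, 100)  # 100 observations, 10 features each
y = rand(Float32, 1, 100)

# minibatches of 16 observations, reshuffled every epoch
for (xb, yb) in DataLoader(X, y, batchsize = 16, shuffle = true)
    @assert size(xb, 2) == size(yb, 2)
end
```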
2020-02-29 18:50:59 +01:00
bors[bot]
37af9fb15c
Merge #1023
1023: Feature: Added Boston Housing Dataset r=CarloLucibello a=pranjaldatta

The [Boston Housing Dataset](https://archive.ics.uci.edu/ml/machine-learning-databases/housing/) is one of the most common beginner datasets, as popular as Iris and the like, so it feels only natural for it to be part of Flux.

- Added `src/data/housing.jl`: code for downloading and loading the dataset
- Edited `src/data/Data.jl`: to include and export housing.jl
- Edited `test/data.jl`: added a test for the dataset

*All tests in test/data.jl are passing*
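
A hypothetical usage sketch; the accessor names are assumptions about the API in `src/data/housing.jl`, not confirmed signatures:

```julia
using Flux

X = Flux.Data.Housing.features()  # assumed: 506×13 feature matrix
y = Flux.Data.Housing.targets()   # assumed: median home values
@assert size(X, 1) == length(y)
```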

Co-authored-by: pranjaldatta <pranjaldatta99@gmail.com>
Co-authored-by: Pranjal  Datta <pranjaldatta99@gmail.com>
2020-02-29 15:54:34 +00:00
Carlo Lucibello
425fcdbe69 NNlib docs + misc docs improvements 2020-02-29 11:14:48 +01:00
Adarsh Kumar
8afed01345
Apply suggestions from code review
Co-Authored-By: David Lung <lungd@users.noreply.github.com>
2020-02-27 23:23:53 +05:30
Adarsh Kumar
9dce623214
Updated Msle loss 2020-02-27 16:26:17 +05:30
Adarsh Kumar
980ce72914
Added tversky and dice loss 2020-02-27 02:00:28 +05:30
CarloLucibello
759fe9df2f update docs and export update! 2020-02-26 20:27:39 +01:00
Dhairya Gandhi
20e78e274e docs fix 2020-02-26 22:41:45 +05:30
Dhairya Gandhi
cf82393ae8 type signatures 2020-02-26 22:36:25 +05:30
Dhairya Gandhi
cd931793ef more docs and constructors 2020-02-26 22:29:14 +05:30
Dhairya Gandhi
58211e31bd docs improve 2020-02-26 22:22:11 +05:30
Dhairya Gandhi
f889d0c4d4 add kwarg constructors 2020-02-26 22:19:17 +05:30
pranjaldatta
569021a9f1 added newlines at end of file 2020-02-26 15:05:23 +05:30
bors[bot]
55616afc11
Merge #960
960: Added utility function outdims to compute output dimensions of a layer r=dhairyagandhi96 a=darsnack

Based on Slack chatter, I added a utility function, `outdims`, that computes the output dimensions for given input dimensions.

Example
```julia
layer = Conv((3, 3), 3 => 16)
outdims(layer, (10, 10)) # returns (8, 8)
```

Co-authored-by: Kyle Daruwalla <daruwalla@wisc.edu>
2020-02-25 17:40:05 +00:00
Tim Besard
4ed7d984db Adapt to CuArrays ArrayStyle changes. 2020-02-25 14:09:03 +01:00
Bulat Suleymanov
db4eaf254b
Edit description of convolutional layer 2020-02-24 13:16:51 +05:00
Kyle Daruwalla
924b8f49ec Updated to place function definitions in the appropriate places. 2020-02-21 15:10:28 -06:00
Kyle Daruwalla
7c12af065a Added testmode! functionality back to normalization layers. 2020-02-21 14:35:10 -06:00
Dhairya Gandhi
88b0c65d72
Merge pull request #1035 from matsueushi/remove_get_macro
Remove get! macro
2020-02-20 12:58:16 +05:30
bors[bot]
e4a84c120f
Merge #1021
1021: nograd for onecold, onehot, onehotbatch r=MikeInnes a=CarloLucibello

fixes #1020 
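
Presumably the standard Zygote idiom; a sketch of what the fix amounts to:

```julia
using Flux: onecold, onehot, onehotbatch
using Zygote: @nograd

# treat the one-hot helpers as non-differentiable constants
@nograd onecold, onehot, onehotbatch
```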

Co-authored-by: CarloLucibello <carlo.lucibello@gmail.com>
2020-02-17 14:12:48 +00:00
matsueushi
6ea7b95384 Remove unused using 2020-02-15 20:06:15 -05:00
Marco
ae0455517a Remove outdated reference to truncate! 2020-02-10 00:03:11 -08:00
pranjaldatta
197a1a70c0 added BostonHousing dataset and testing 2020-02-07 03:47:19 +05:30
CarloLucibello
6499344af3 nograd for onecold, onehot, onehotbatch 2020-02-06 15:41:46 +01:00
Adarsh Kumar
7710bb0b4b
Removed spurious promotions 2020-02-06 01:06:41 +05:30
Adarsh Kumar
b5184553d4
Error correction in mae 2020-02-05 23:32:55 +05:30
Adarsh Kumar
643086c8db
Updated squared_hinge 2020-02-05 22:40:07 +05:30
Adarsh Kumar
7ac647a7ac
Added loss functions 2020-02-05 22:29:15 +05:30
Dhairya Gandhi
bc20103ea6 no-op copy 2020-01-31 13:23:33 +05:30
Dhairya Gandhi
b9fbee1ff0 ::typeof(op) -> op 2020-01-31 12:24:36 +05:30
Tim Besard
d88f63adb4 Remove unused imports. 2020-01-29 12:15:41 +01:00
bors[bot]
d1edd9b16d
Merge #680
680: Added new loss functions. r=thebhatman a=thebhatman

I have added the KL divergence, Poisson, logcosh, and hinge loss functions.
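
Hedged one-line sketches of three of these losses (the merged versions may normalise or clamp differently):

```julia
using Statistics: mean

hinge(ŷ, y)   = mean(max.(0, 1 .- ŷ .* y))  # assumes y ∈ {-1, 1}
logcosh(ŷ, y) = mean(log.(cosh.(ŷ .- y)))
poisson(ŷ, y) = mean(ŷ .- y .* log.(ŷ))
```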

Co-authored-by: Manjunath Bhat <manjunathbhat9920@gmail.com>
Co-authored-by: thebhatman <manjunathbhat9920@gmail.com>
2020-01-13 15:46:25 +00:00
Mike J Innes
17732e7023 restructure; closes #747 2020-01-06 11:53:47 +00:00
Dhairya Gandhi
a72ca2b05d fix args 2019-12-09 23:18:01 +05:30
Dhairya Gandhi
894c075b6d rm Zeros setindex 2019-12-09 21:40:58 +05:30
Dhairya Gandhi
f39e184814 rm Zeros warning 2019-12-09 21:07:30 +05:30
Kyle Daruwalla
0cdd11c0dc Added tests for varying padding, stride, and dilation with outdims. 2019-12-07 14:05:50 -06:00
Kyle Daruwalla
a64378b112 Switched to using NNlib for conv.jl outdims. 2019-12-07 13:21:26 -06:00
Kyle Daruwalla
6265b1fa39 Added tests for outdims 2019-12-05 22:54:25 -06:00
Kyle Daruwalla
31dda0ce6c Updated with all basic and conv layers outdims 2019-12-05 21:57:10 -06:00
DrChainsaw
755536bf5e Merge remote-tracking branch 'upstream/master' into samepad 2019-12-04 23:45:03 +01:00
Kyle Daruwalla
b4ed16ad9c Added outdims for some basic layers 2019-12-03 22:48:48 -06:00
Fredrik Bagge Carlson
e67f09c06d Correct some comments in decay docs 2019-12-03 15:32:23 +08:00
Fredrik Bagge Carlson
6e94e59afd Improve docs for decay optimisers 2019-12-03 15:27:44 +08:00
Dhairya Gandhi
245563077b cleaner API 2019-11-27 19:40:58 +05:30
bors[bot]
90a38a3201
Merge #937
937: Fix Glorot initialization, add He initialization r=MikeInnes a=Sleort

Should fix #442.
Adds He weight initialization as a bonus :-)
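
A hedged sketch of the corrected Glorot scaling plus the He variant, assuming a 2-d weight of size (fan_out, fan_in); the real code also handles conv kernels:

```julia
fan_in(dims)  = dims[end]
fan_out(dims) = dims[1]

# uniform on ±sqrt(6 / (fan_in + fan_out))
glorot_uniform(dims...) =
    (rand(Float32, dims...) .- 0.5f0) .* sqrt(24.0f0 / (fan_in(dims) + fan_out(dims)))

# He initialization: normal with variance 2 / fan_in
he_normal(dims...) = randn(Float32, dims...) .* sqrt(2.0f0 / fan_in(dims))
```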

Co-authored-by: Troels Arnfred Bojesen <tr-ab@online.no>
2019-11-26 16:17:06 +00:00
bors[bot]
fb4a48f970
Merge #943
943: Fixes #900 r=MikeInnes a=dhairyagandhi96

Thoughts on the test?

cc @MikeInnes

Co-authored-by: Dhairya Gandhi <dhairya@juliacopmuting.com>
2019-11-26 15:09:27 +00:00
Dhairya Gandhi
59bb0d81b0 add TODO 2019-11-26 16:23:09 +05:30
Mike J Innes
4c69b44a7c
Merge pull request #940 from matsueushi/feature/cuda-logitbc
Fix logitbinarycrossentropy on CuArrays
2019-11-26 10:18:07 +00:00
Tim Besard
fbb377a7b4
Merge pull request #941 from FluxML/tb/include_during_precompile
Don't include the CUDA module during precompilation.
2019-11-24 08:55:43 +01:00
Dhairya Gandhi
5f21238d1a no grad dims helper 2019-11-24 13:25:02 +05:30
Tim Besard
4ece13c649 Don't include the CUDA module during precompilation.
If we do, we could end up replacing it at runtime.
2019-11-22 18:03:51 +01:00
matsueushi
a0314ce682 Fix logitbinarycrossentropy on CuArrays 2019-11-22 05:23:24 +00:00
Troels Arnfred Bojesen
af96a197c1 Fix Glorot initialization
Should fix #442
2019-11-20 13:20:42 +09:00
Mike J Innes
5839e166f6
Merge pull request #860 from dsweber2/activations
Activations
2019-11-19 16:44:25 +00:00
Tim Besard
2fa3e5673e
Merge pull request #924 from FluxML/tb/cuda_init
CUDA package initialization improvements
2019-11-19 16:48:45 +01:00
Tim Besard
c45cec4cba Simplify warning. 2019-11-19 16:05:41 +01:00
Tim Besard
69bf84278f Remove wrong warning. 2019-11-19 15:53:43 +01:00
Mike J Innes
4f73e434a4
Merge pull request #935 from baggepinnen/patch-4
Fix AMSGrad on GPU
2019-11-19 12:58:37 +00:00
Troels Arnfred Bojesen
2b80573248 Fix Glorot initialization, add He initialization
Should fix #442.
Adds He weight initialization as a bonus :-)
2019-11-19 18:16:29 +09:00
Fredrik Bagge Carlson
2da22f31f0
Avoid unnecessary conversion
This initialization works for both cpu and gpu
2019-11-19 16:31:04 +08:00
Fredrik Bagge Carlson
df7ffb0ef8
Fix AMSGrad on GPU
The previous initialization created a CPU array. Now, the same type of array as `x` is created.
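
The general pattern behind this kind of fix, as a sketch (the helper names are illustrative, not the optimiser's actual code):

```julia
init_state_cpu_only(x) = zeros(size(x)...)     # old: always a plain CPU Array
init_state_like(x)     = fill!(similar(x), 0)  # new: matches typeof(x), CuArray included
```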
2019-11-19 16:27:44 +08:00
Dhairya Gandhi
eb41715d26 define manual rules 2019-11-19 13:30:33 +05:30
Troels Arnfred Bojesen
4530ac65c7 Fix Glorot initialization, add He initialization
Should fix the issue reported at https://github.com/FluxML/Flux.jl/issues/442.
Adds He weight initialization as a bonus :-)
2019-11-19 16:50:40 +09:00
dsweber2
dea29532ef Merge branch 'master' into activations 2019-11-15 17:19:43 -08:00