Commit Graph

289 Commits

Author SHA1 Message Date
bors[bot]
be38146ee9
Merge #1061
1061: fix a few typos in docstrings r=CarloLucibello a=visr



Co-authored-by: Martijn Visser <mgvisser@gmail.com>
2020-03-02 01:03:58 +00:00
bors[bot]
6575fb8f48
Merge #1057
1057: add Julia ecosystem doc section r=CarloLucibello a=CarloLucibello

Partially fixes #251; related to the discussion in #1051.

What I wrote here is not exactly poetry, so maybe someone could suggest a better phrasing.
Suggestions for additional packages to add to the list are also welcome.

Co-authored-by: CarloLucibello <carlo.lucibello@gmail.com>
2020-03-02 00:52:22 +00:00
Kyle Daruwalla
4cebf36361
Merge branch 'master' into feature/istraining 2020-03-01 12:32:15 -06:00
Kyle Daruwalla
c001d0f3c5 Added trainmode! and updated docs with warning 2020-03-01 12:30:41 -06:00
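A minimal sketch of how the `trainmode!` and `testmode!` toggles referenced in this commit are typically used around layers such as `Dropout`; the model here is hypothetical and the exact behaviour is an assumption based on the Flux API of this period.

```julia
using Flux

# Hypothetical model with a stochastic layer whose behaviour differs
# between training and evaluation.
m = Chain(Dense(10, 5, relu), Dropout(0.5), Dense(5, 2))

Flux.testmode!(m)   # put the model in test mode, e.g. disable dropout
Flux.trainmode!(m)  # switch back to training behaviour
```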
Martijn Visser
6076847a45 fix a few typos in docstrings 2020-03-01 15:07:12 +01:00
CarloLucibello
b6c79b38b4 add DataLoader
special case train! for the unsupervised data iterator
2020-02-29 18:50:59 +01:00
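A minimal sketch of the `DataLoader` added in this commit, fed directly to `train!`; the positional `(X, Y)` constructor, the keyword names, and the toy data are assumptions based on the 0.10-era API rather than the exact code merged here.

```julia
using Flux
using Flux.Data: DataLoader

# Hypothetical toy data: 10 samples with 4 features each, plus scalar targets.
X = rand(Float32, 4, 10)
Y = rand(Float32, 1, 10)

# Mini-batch iterator over the data (keyword names are an assumption).
loader = DataLoader(X, Y, batchsize=2, shuffle=true)

model = Dense(4, 1)
loss(x, y) = Flux.mse(model(x), y)

# train! iterates the loader, calling loss(x, y) on each mini-batch.
Flux.train!(loss, Flux.params(model), loader, Descent(0.1))
```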
CarloLucibello
4f693e02cb add model zoo reference 2020-02-29 13:50:23 +01:00
CarloLucibello
4109f2e0d7 cleanup 2020-02-29 13:45:17 +01:00
CarloLucibello
169ed6eb25 add ecosystem 2020-02-29 13:43:03 +01:00
Carlo Lucibello
425fcdbe69 NNlib docs + misc docs improvements 2020-02-29 11:14:48 +01:00
CarloLucibello
759fe9df2f update docs and export update! 2020-02-26 20:27:39 +01:00
Kyle Daruwalla
ba5259a269 Added docs on testmode! 2020-02-25 13:53:49 -06:00
bors[bot]
55616afc11
Merge #960
960: Added utility function outdims to compute output dimensions of a layer r=dhairyagandhi96 a=darsnack

Based on Slack chatter, I added a utility function, `outdims`, that computes the output dimensions for given input dimensions.

Example
```julia
using Flux
using Flux: outdims  # bring outdims into scope explicitly

layer = Conv((3, 3), 3 => 16)
outdims(layer, (10, 10)) # returns (8, 8)
```

Co-authored-by: Kyle Daruwalla <daruwalla@wisc.edu>
2020-02-25 17:40:05 +00:00
Kyle Daruwalla
f5b9cf659c Updated docs to specify exactly what layers support outdims 2020-02-20 23:38:56 -06:00
Lyndon White
7797e31b44
Add custom training loops to docs 2020-01-16 21:57:59 +00:00
bors[bot]
d1edd9b16d
Merge #680
680: Added new loss functions. r=thebhatman a=thebhatman

I have added the KL divergence loss, Poisson loss, log-cosh loss, and hinge loss functions.

Co-authored-by: Manjunath Bhat <manjunathbhat9920@gmail.com>
Co-authored-by: thebhatman <manjunathbhat9920@gmail.com>
2020-01-13 15:46:25 +00:00
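For reference, the losses named in this PR have standard textbook definitions along the following lines; this is a sketch of the formulas, not the exact code merged in #680, and the names and reduction conventions are assumptions.

```julia
using Statistics: mean

# Standard forms, averaged over observations (stored as columns).
kldivergence(ŷ, y) = sum(y .* log.(y ./ ŷ)) / size(y, 2)
poisson(ŷ, y)      = sum(ŷ .- y .* log.(ŷ)) / size(y, 2)
logcosh(ŷ, y)      = mean(log.(cosh.(ŷ .- y)))
hinge(ŷ, y)        = mean(max.(0, 1 .- ŷ .* y))   # targets y ∈ {-1, +1}
```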
Manjunath Bhat
8a93be8c6c
Change loss to cost 2019-12-09 20:39:46 +05:30
Kyle Daruwalla
04991d3261 Added entry to docs for outdims 2019-12-07 14:06:11 -06:00
Helios De Rosario
a0e3729679
Update docs/src/training/training.md
Co-Authored-By: Mike J Innes <mike.j.innes@gmail.com>
2019-11-15 21:17:45 +01:00
Helios De Rosario
ba4e3be0d3
explanations about params in train! 2019-11-14 16:22:31 +01:00
Helios De Rosario
074eb47246
Update training.md 2019-11-12 23:29:38 +01:00
Helios De Rosario
7e1ffd6507
Extend docs about train!
Related to #921: explain why the model does not need to be passed as an argument.
2019-11-08 21:39:00 +01:00
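The point being documented here, that `train!` never receives the model itself, can be illustrated with a short sketch: the loss closes over the model, and only its parameters are passed explicitly. The model, data, and hyperparameters below are hypothetical.

```julia
using Flux

m = Chain(Dense(10, 5, relu), Dense(5, 2))
loss(x, y) = Flux.mse(m(x), y)          # the loss captures `m` by closure

# One (x, y) pair of toy data; `m` itself never appears in the call below.
data = [(rand(Float32, 10), rand(Float32, 2))]
Flux.train!(loss, Flux.params(m), data, Descent(0.01))
```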
Dhairya Gandhi
776023ddad fixes 2019-10-10 20:35:28 +05:30
Dhairya Gandhi
4477dd8d54 reviews 2019-10-10 20:27:11 +05:30
Dhairya Gandhi
a55878453c
typo
Co-Authored-By: Mike J Innes <mike.j.innes@gmail.com>
2019-10-10 20:16:29 +05:30
Dhairya Gandhi
623ee2c29c
typo
Co-Authored-By: Mike J Innes <mike.j.innes@gmail.com>
2019-10-10 20:16:00 +05:30
thebhatman
d591b2b59e Removed colon and capitalised 2019-10-09 21:36:40 +05:30
thebhatman
96a23c295c Changes to docs 2019-10-09 14:53:03 +05:30
Manjunath Bhat
2b30319a55
Merge branch 'master' into patch-6 2019-09-30 21:05:02 +05:30
thebhatman
ec35e9cbaa Loss functions docs added in layers.md 2019-09-30 21:02:13 +05:30
thebhatman
6e289ef939 Merge branch 'patch-6' of https://github.com/thebhatman/Flux.jl into patch-6 2019-09-30 20:55:44 +05:30
Dhairya Gandhi
32ac71734d optimiser interface docs 2019-09-27 21:43:59 +05:30
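As a rough illustration of the optimiser interface documented in this commit, a custom optimiser only needs an `apply!` method that rescales the gradient in place; this sketch assumes the `Flux.Optimise.apply!(opt, x, Δ)` convention of that period, and `MyDescent` is a hypothetical name.

```julia
using Flux

# A bare-bones gradient-descent optimiser implementing the interface.
mutable struct MyDescent
  eta::Float64
end

# apply! receives the parameter `x` and its gradient `Δ`, and returns the
# update to be subtracted from `x` (here simply the scaled gradient).
Flux.Optimise.apply!(o::MyDescent, x, Δ) = Δ .*= o.eta
```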
Dhairya Gandhi
a98a1b8bb5 fixes 2019-09-27 21:43:39 +05:30
Mike Innes
cabb81e30b internal rename 2019-09-19 15:53:31 +01:00
Naba7
a600a9ceed removed extra parenthesis 2019-09-14 10:56:17 +05:30
Dhairya Gandhi
e0276139e1
Update docs/src/training/optimisers.md
Co-Authored-By: Mike J Innes <mike.j.innes@gmail.com>
2019-09-11 19:21:15 +05:30
Dhairya Gandhi
b6926f07a5 cleanup 2019-09-11 19:18:50 +05:30
Dhairya Gandhi
b08c949b99 fixes to saving 2019-09-11 14:25:46 +05:30
Dhairya Gandhi
a9d1cbf07c added decays 2019-09-10 21:20:05 +05:30
Dhairya Gandhi
b6c8312796 optimiser docs 2019-09-10 20:49:15 +05:30
Mike Innes
de2049450b docs mostly fixed 2019-09-10 15:17:07 +01:00
Mike Innes
ddf06af0b9 remove tracker docs 2019-09-10 15:03:08 +01:00
Mike Innes
c8d460ff84 doctests passing 2019-09-10 15:02:43 +01:00
Mike J Innes
67c38b3099 Merge branch 'master' into zygote 2019-09-06 15:18:58 +01:00
Mike Innes
62ec01a6f5 doc build changes 2019-08-19 15:49:50 +01:00
Mike J Innes
bab618d168
Merge pull request #767 from oxinabox/patch-6
Some cleanup on performance tips docs
2019-07-11 16:11:44 +01:00
Mike J Innes
27904d349c
Update performance.md 2019-07-11 16:11:32 +01:00
Jason Wu
b24e05bb20
Fix lack of x 2019-07-02 13:15:54 -04:00
Alex Mellnik
e17999f19b Two minor typos 2019-06-11 22:09:59 -07:00
Mike J Innes
b98075817c
Merge branch 'master' into DenseBlock 2019-06-05 14:27:47 +01:00