Commit Graph

326 Commits

Author SHA1 Message Date
Dhairya Gandhi 37d58e16dd common questions answered in docs 2020-02-08 16:33:18 +05:30
Lyndon White 7797e31b44 Add custom training loops to docs 2020-01-16 21:57:59 +00:00
bors[bot] d1edd9b16d Merge #680
680: Added new loss functions. r=thebhatman a=thebhatman

I have added the KL Divergence Loss function, Poisson loss function, Logcosh loss, and Hinge loss function.

Co-authored-by: Manjunath Bhat <manjunathbhat9920@gmail.com>
Co-authored-by: thebhatman <manjunathbhat9920@gmail.com>
2020-01-13 15:46:25 +00:00
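For reference, the four losses named in this merge follow standard formulas. A minimal standalone sketch in Julia (not the exact Flux implementations; the function names and edge-case handling here are assumptions):

```julia
using Statistics: mean

# Standalone sketches of the losses PR #680 added; the exact Flux
# implementations may differ in naming and numerical details.
kldivergence_loss(ŷ, y) = sum(y .* log.(y ./ ŷ))      # assumes y, ŷ are probability vectors
poisson_loss(ŷ, y)      = sum(ŷ .- y .* log.(ŷ))      # Poisson NLL, up to a constant in y
logcosh_loss(ŷ, y)      = mean(log.(cosh.(ŷ .- y)))
hinge_loss(ŷ, y)        = mean(max.(0, 1 .- ŷ .* y))  # expects labels y ∈ {-1, 1}
```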
Manjunath Bhat 8a93be8c6c Change loss to cost 2019-12-09 20:39:46 +05:30
Kyle Daruwalla 04991d3261 Added entry to docs for outdims 2019-12-07 14:06:11 -06:00
Helios De Rosario a0e3729679 Update docs/src/training/training.md
Co-Authored-By: Mike J Innes <mike.j.innes@gmail.com>
2019-11-15 21:17:45 +01:00
Helios De Rosario ba4e3be0d3 explanations about params in `train!` 2019-11-14 16:22:31 +01:00
Helios De Rosario 074eb47246 Update training.md 2019-11-12 23:29:38 +01:00
Helios De Rosario 7e1ffd6507 Extend docs about `train!`
Related to #921: explain why it is not necessary to pass the model as an argument.
2019-11-08 21:39:00 +01:00
Dhairya Gandhi 776023ddad fixes 2019-10-10 20:35:28 +05:30
Dhairya Gandhi 4477dd8d54 reviews 2019-10-10 20:27:11 +05:30
Dhairya Gandhi a55878453c typo
Co-Authored-By: Mike J Innes <mike.j.innes@gmail.com>
2019-10-10 20:16:29 +05:30
Dhairya Gandhi 623ee2c29c typo
Co-Authored-By: Mike J Innes <mike.j.innes@gmail.com>
2019-10-10 20:16:00 +05:30
thebhatman d591b2b59e Removed colon and capitalised 2019-10-09 21:36:40 +05:30
thebhatman 96a23c295c Changes to docs 2019-10-09 14:53:03 +05:30
Manjunath Bhat 2b30319a55 Merge branch 'master' into patch-6 2019-09-30 21:05:02 +05:30
thebhatman ec35e9cbaa Loss functions docs added in layers.md 2019-09-30 21:02:13 +05:30
thebhatman 6e289ef939 Merge branch 'patch-6' of https://github.com/thebhatman/Flux.jl into patch-6 2019-09-30 20:55:44 +05:30
Dhairya Gandhi 32ac71734d optimiser interface docs 2019-09-27 21:43:59 +05:30
Dhairya Gandhi a98a1b8bb5 fixes 2019-09-27 21:43:39 +05:30
Mike Innes cabb81e30b internal rename 2019-09-19 15:53:31 +01:00
Naba7 a600a9ceed removed extra parenthesis 2019-09-14 10:56:17 +05:30
Dhairya Gandhi e0276139e1 Update docs/src/training/optimisers.md
Co-Authored-By: Mike J Innes <mike.j.innes@gmail.com>
2019-09-11 19:21:15 +05:30
Dhairya Gandhi b6926f07a5 cleanup 2019-09-11 19:18:50 +05:30
Dhairya Gandhi b08c949b99 fixes to saving 2019-09-11 14:25:46 +05:30
Dhairya Gandhi a9d1cbf07c added decays 2019-09-10 21:20:05 +05:30
Dhairya Gandhi b6c8312796 optimiser docs 2019-09-10 20:49:15 +05:30
Mike Innes de2049450b docs mostly fixed 2019-09-10 15:17:07 +01:00
Mike Innes ddf06af0b9 remove tracker docs 2019-09-10 15:03:08 +01:00
Mike Innes c8d460ff84 doctests passing 2019-09-10 15:02:43 +01:00
Mike J Innes 67c38b3099 Merge branch 'master' into zygote 2019-09-06 15:18:58 +01:00
Mike Innes 62ec01a6f5 doc build changes 2019-08-19 15:49:50 +01:00
Mike J Innes bab618d168 Merge pull request #767 from oxinabox/patch-6
Some cleanup on performance tips docs
2019-07-11 16:11:44 +01:00
Mike J Innes 27904d349c Update performance.md 2019-07-11 16:11:32 +01:00
Jason Wu b24e05bb20 Fix lack of x 2019-07-02 13:15:54 -04:00
Alex Mellnik e17999f19b Two minor typos 2019-06-11 22:09:59 -07:00
Mike J Innes b98075817c Merge branch 'master' into DenseBlock 2019-06-05 14:27:47 +01:00
Lyndon White fe759ac43c Update docs/src/performance.md
Co-Authored-By: Kristoffer Carlsson <kristoffer.carlsson@chalmers.se>
2019-05-28 14:19:56 +01:00
ayush-1506 f263f0c8ed add to layer docs 2019-05-14 02:53:06 -07:00
Bruno Hebling Vieira 6b3cd825b9 Added SkipConnection to docs tentatively in Other General Purpose Layers 2019-05-13 16:43:14 -03:00
Johnny Chen 7103a61a1f delete redundant section 2019-05-11 12:40:01 +08:00
Lyndon White fc4827c48f Some cleanup on performance tips 2019-05-07 16:38:21 +01:00
Elliot Saba c9148194cf Update `docs/` Manifest 2019-04-25 10:22:29 -07:00
Dhairya Gandhi 77e3ff7a8c fixed docs 2019-04-24 21:16:31 +05:30
Michael Green 1eca23e113 Merge branch 'master' of https://github.com/FluxML/Flux.jl 2019-04-20 11:26:24 +02:00
Michael Green 934f7f932d Updated docs again. 2019-04-20 11:22:48 +02:00
thebhatman 710084ffbf Loss functions added to docs 2019-04-05 23:50:16 +05:30
Shreyas 2a6eb35a71 Added GroupNorm to docs and News.md 2019-04-05 23:16:46 +05:30
Hossein Pourbozorg cad2df2c41 add other optimizers to documentation 2019-04-05 01:25:21 +04:30
Michael Green a5c34e8325 Fixed merging with upstream Flux. 2019-03-27 20:30:31 +01:00
Michael Green d68866a238 Fixed documentation error. 2019-03-27 20:22:01 +01:00
Mike J Innes ab46da11c7 Merge pull request #685 from jpsamaroo/jps/recur-docs-reset
Add note on reset! usage in recurrence docs
2019-03-27 12:47:01 +00:00
Julian P Samaroo 1930f40dec Add note on reset! usage in recurrence docs 2019-03-26 00:00:00 -05:00
Mike J Innes eeed8b24c3 Merge pull request #681 from dellison/stopdoc
add Flux.stop to training docs
2019-03-25 15:07:07 +00:00
Nick Robinson 025d9b678d Update docs/src/models/layers.md
Co-Authored-By: oxinabox <oxinabox@ucc.asn.au>
2019-03-18 12:20:46 +00:00
Nick Robinson 2bc4b8d1a4 Update docs/src/models/layers.md
Co-Authored-By: oxinabox <oxinabox@ucc.asn.au>
2019-03-18 12:20:46 +00:00
Lyndon White c76b9c7e2c fix docs 2019-03-18 12:20:46 +00:00
Lyndon White 838047f708 fix docs 2019-03-18 12:19:44 +00:00
Lyndon White c1a33c556f do things to docs 2019-03-18 12:19:44 +00:00
David Ellison 263a3248f6 add Flux.stop to training docs 2019-03-11 19:52:05 -07:00
Mike J Innes b348e31f07 Merge pull request #667 from FluxML/donottrack
rm Tracker
2019-03-08 11:38:37 +00:00
Mike J Innes 194e2ecd50 update docs manifest 2019-03-08 11:20:39 +00:00
Manjunath Bhat 922e9c9bc2 Updated docs with AlphaDropout 2019-03-04 01:10:12 +05:30
Lyndon White ebf50f4e1c Create performance tips docs section (#615)
* Create performance_tips.jl

* Rename performance_tips.jl to performance_tips.md

* add perf tips

* Update docs/src/performance_tips.md

Co-Authored-By: oxinabox <oxinabox@ucc.asn.au>

* Update docs/src/performance_tips.md

Co-Authored-By: oxinabox <oxinabox@ucc.asn.au>

* Update make.jl

* Update and rename performance_tips.md to performance.md

* spelling

* Update docs/src/performance.md

Co-Authored-By: oxinabox <oxinabox@ucc.asn.au>

* Update docs/src/performance.md

Co-Authored-By: oxinabox <oxinabox@ucc.asn.au>

* Update performance.md

* Update performance.md

* Update docs/src/performance.md

Co-Authored-By: oxinabox <oxinabox@ucc.asn.au>

* Update docs/src/performance.md

Co-Authored-By: oxinabox <oxinabox@ucc.asn.au>
2019-02-19 15:03:41 +00:00
Ayan Banerjee 08b87e0bce Transition to doctests (#616)
* basics.md: Initial doctest to an example

Related to https://github.com/FluxML/Flux.jl/issues/561

* make.jl: Allow doctest to run

* Fix comments in order to pass doctests

* basic.md: Add doctests to examples
2019-02-14 18:29:27 +00:00
Avik Pal c093d089a6 Add conv_transpose to docs 2019-02-06 21:11:41 +05:30
Mike J Innes 8386a49bf9 Merge pull request #575 from FluxML/mji/update
Clean up parameter update API
2019-01-28 15:26:57 +00:00
Mike J Innes e1cac76a34 params update 2019-01-28 14:14:41 +00:00
Mike J Innes 0f8a4a48c6 extend update! with an optimiser 2019-01-28 14:10:09 +00:00
Mike J Innes bb2210f552 Merge pull request #553 from xiaodaigh/patch-2
Updated with more detailed instructions for installing CuArrays
2019-01-28 10:36:27 +00:00
susabi 5930ac1730 simplified instructions 2019-01-26 12:26:48 +11:00
Arnaud Amzallag 3cc3c463a3 Adding `nest = true` option in `Tracker.gradient`
Otherwise it fails and returns an error. Note that the option has to be added in both `df` and `d2f`.
2019-01-24 19:29:29 -05:00
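The pattern this commit message describes — nesting Tracker gradients to get higher-order derivatives — looked roughly like this in Tracker-era Flux; a sketch assuming the pre-Zygote `Tracker.gradient` API:

```julia
using Flux.Tracker  # pre-Zygote AD; later Flux versions replace Tracker with Zygote

f(x) = 3x^2 + 2x + 1

# `nest = true` must be passed to every gradient call that will itself be
# differentiated; without it the nested call errors, as the commit notes.
df(x)  = Tracker.gradient(f, x; nest = true)[1]   # 6x + 2
d2f(x) = Tracker.gradient(df, x; nest = true)[1]  # 6
```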
Ayan Banerjee bc68dfbd75 docs/basics.md: Add `using Flux`
In order to import the sigmoid function.
2019-01-23 19:20:10 +05:30
Ayan Banerjee 236b103b73 docs/basics.md: Add `tracked` after 1.0 2019-01-22 23:37:34 +05:30
susabi 3f62bc30b9 Update gpu.md 2019-01-11 15:57:54 +11:00
susabi e13c6c1125 updated gpu.md with installation instructions 2019-01-11 15:55:39 +11:00
Kristoffer Carlsson 2298e4fea1 modernize documentation 2019-01-10 15:06:11 +01:00
Mike J Innes f0d5624ed2 Merge pull request #493 from dhairyagandhi96/master
[WIP] New Optimiser Docs
2019-01-10 11:10:38 +00:00
Mike J Innes 81e5551256 tweaks 2019-01-10 11:01:57 +00:00
Kristoffer Carlsson 202424d1b1 Docs: fix link to CuArrays 2019-01-03 01:25:25 +01:00
Robert Hönig 0f243dba29 Correct CuArrays requirements.
According to the CuArrays README, "CuArrays should work out-of-the-box on Julia 1.0."
Correct the outdated Julia 0.6 requirement. Also, update the instructions link to point to the
CuArrays.jl README, which has setup instructions (CUDAnative.jl's README doesn't).
2018-12-19 09:23:26 +01:00
Dhairya Gandhi eb287ae9a0 fixed optimisers syntax 2018-12-04 16:08:03 +05:30
Dhairya Gandhi d412845192 added training api changes 2018-12-01 16:59:27 +05:30
Dhairya Gandhi 1ea8c5a293 [WIP] add docstrings and doc improvements 2018-11-12 19:17:10 +05:30
Dhairya Gandhi 07397bc950 [WIP] add links to sgd 2018-11-12 17:53:53 +05:30
Dhairya Gandhi 4562682528 [WIP] add optimiser docs 2018-11-12 17:42:52 +05:30
Mike J Innes bbccdb3eec Merge pull request #279 from avik-pal/depthwiseconv
Adds support for Depthwise Convolutions
2018-10-23 17:22:15 +01:00
harryscholes 61c14afee4 Add usage example of custom gradients 2018-10-09 13:05:38 +01:00
Harry 179a1e8407 Correct Custom Gradients docs
* Fixed a type signature that was incorrect.
* Also, replaced `data(a)` with `a.data`. Don't know if the syntax has changed (recently). This may also need to be corrected in line 121.

MWE:

```julia
using Flux
using Flux.Tracker
using Flux.Tracker: forward, TrackedReal, track, @grad

minus(a, b) = a - b
minus(a::TrackedReal, b::TrackedReal) = Tracker.track(minus, a, b)
@grad function minus(a, b)
    return minus(a.data, b.data), Δ -> (Δ, -Δ)
end

a, b = param(2), param(4)
c = minus(a, b)  # -2.0 (tracked)
Tracker.back!(c)

Tracker.grad(a)  # 1.00
Tracker.grad(b)  # -1.00
```
2018-09-21 16:57:54 +01:00
Harry 079614adb2 Fix typo 2018-09-19 16:45:11 +01:00
Avik Pal eb9b408c0f Merge branch 'master' into depthwiseconv 2018-09-15 10:21:31 +05:30
Sambit Kumar Dash 8b9a98ed01 The sample gradient should not use the softdash
While the softdash is a natural, mathematical notation, it is easily confused with the apostrophe used for the LinAlg adjoint. That confusion is not worth it in a first code example.
2018-09-11 18:58:07 +05:30
Mike J Innes 395a35d137 better headings 2018-09-05 17:03:41 +01:00
Mike J Innes 193c4ded19 make docs on 1.0 2018-09-05 16:52:50 +01:00
Mike J Innes b7eaf393fc docs updates 2018-09-05 16:01:57 +01:00
Mike J Innes 41cf1f2a84 Merge pull request #381 from piever/pv/docs
fix julia 1 changes in tutorial
2018-09-04 16:00:58 +01:00
Mike J Innes e6be639436 Merge branch 'master' into HEAD 2018-09-04 14:03:46 +01:00
Pietro Vertechi a012d0bd51 fix vecnorm in docs 2018-08-29 23:39:43 +01:00
Pietro Vertechi abcefb8ae3 fix foldl in tutorial 2018-08-29 18:36:24 +01:00
Yueh-Hua Tu 634d34686e Add new constructors and test 2018-08-24 10:31:13 +08:00