author | commit | message | date
thebhatman | b84ab7ac95 | Removed logcosh | 2019-04-05 03:16:54 +05:30
thebhatman | 4efcc69ba5 | logcosh averaged | 2019-03-26 23:23:02 +05:30
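For context on the two logcosh commits above: the log-cosh loss they average and later remove is the standard log(cosh(ŷ − y)) penalty taken as a mean over all elements. A minimal sketch of that definition (an assumption about the shape of the code, not Flux's exact implementation):

```julia
using Statistics  # mean

# Log-cosh loss: roughly quadratic for small errors, linear for large ones.
# "logcosh averaged" corresponds to taking the mean over all elements.
logcosh(ŷ, y) = mean(log.(cosh.(ŷ .- y)))

logcosh([0.5f0, 2.0f0], [0.0f0, 0.0f0])  # ≈ 0.72
```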
Manjunath Bhat | 930adb122d | Avoided promotion to Float64 in hinge. | 2019-03-25 23:43:06 +05:30
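The hinge loss this commit touches is, in its standard form, mean(max(0, 1 − ŷ·y)) for targets y ∈ {−1, +1}. A hedged sketch of that definition; writing the constants via zero/one of the element type is one way to keep Float32 inputs from drifting to Float64, though whether the commit did exactly this is an assumption:

```julia
using Statistics

# Averaged hinge loss for targets y ∈ {-1, +1}. Using zero(eltype(ŷ)) and
# one(eltype(ŷ)) keeps the computation in the input's element type.
hinge(ŷ, y) = mean(max.(zero(eltype(ŷ)), one(eltype(ŷ)) .- ŷ .* y))

hinge(Float32[0.3, -1.2], Float32[1, -1])  # 0.35f0 — stays Float32
```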
thebhatman | 6f078857be | Added reference links to loss functions | 2019-03-26 03:15:28 +05:30
thebhatman | c4d12e57fe | Loss function names in lowercase | 2019-03-26 03:09:48 +05:30
Manjunath Bhat | 633f0df01f | Added new loss functions. | 2019-03-12 02:31:42 +05:30
pshashk | b0a5844afb | Remove dims=1 from normalise (#619) | 2019-02-11 16:11:47 +00:00
  * remove `dims=1`
  * add dims arg
  * fix test
  * remove dims=1 only from deprecated version
pshashk | b074b2491a | fix docstring | 2019-02-08 21:49:53 +03:00
pshashk | c3e04392d8 | drop dims type restriction | 2019-02-08 16:15:37 +03:00
pshashk | 911c901294 | dims kwarg | 2019-02-08 16:00:32 +03:00
Moksh Jain | 046f7b4eae | fix std arguments in normalise | 2019-02-05 18:36:04 +05:30
Moksh Jain | c6409d7686 | add support for n-dimensional input to normalise layer | 2019-02-05 17:09:22 +05:30
Kristoffer Carlsson | fd0f1c7a82 | use uncorrected standard deviation in normalise | 2019-01-30 17:42:19 +01:00
  fixes https://github.com/FluxML/Flux.jl/issues/529
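Taken together, the run of normalise commits above (a dims keyword, n-dimensional input, and an uncorrected standard deviation per fd0f1c7a82) points at a definition roughly like the sketch below; the exact signature and defaults Flux settled on are an assumption here:

```julia
using Statistics

# Normalise x to zero mean and unit standard deviation along `dims`.
# corrected=false gives the population (uncorrected) std; the dims keyword
# replaces the old hard-coded dims=1.
function normalise(x::AbstractArray; dims=1)
    μ = mean(x, dims=dims)
    σ = std(x, dims=dims, mean=μ, corrected=false)
    return (x .- μ) ./ σ
end

x = randn(Float32, 3, 4)
mean(normalise(x; dims=1), dims=1)  # ≈ 0 for every column
```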
Mike J Innes | 1cf37ab9eb | rm some old deprecations | 2019-01-25 09:54:32 +00:00
Kristoffer Carlsson | c74aa67c5d | fix promotion by avoiding integer division in mse and crossentropy | 2019-01-15 14:15:05 +01:00
  oops
  add tests
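c74aa67c5d's subject is concrete enough to illustrate: in Julia, dividing by an integer uses float division (Int / Int is always Float64), so scaling a sum by 1 / length(y) can promote the result; multiplying by the Rational 1 // length(y) avoids the division entirely. A hedged sketch of mse written in that style (assumed, not copied from the commit):

```julia
# Multiplying by the Rational 1 // length(y) avoids a literal division, so
# integer inputs stay exact and Float32 inputs are not promoted to Float64.
mse(ŷ, y) = sum((ŷ .- y) .^ 2) * 1 // length(y)

mse(Float32[1, 2], Float32[0, 0])   # 2.5f0 — still Float32
typeof(5 / 2), typeof(5 * 1 // 2)   # (Float64, Rational{Int64})
```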
James Bradbury | e7783ace12 | 1.0 compat for normalise | 2018-09-06 18:38:11 -07:00
Mike Innes | 5a023a9ccc | WIP 1.0 support | 2018-08-20 13:08:04 +01:00
  closes #353
Matthew Kelley | 864d72eef5 | Overload Base.eps() for TrackedReal | 2018-06-26 23:55:43 -06:00
Matthew Kelley | 0e95be3326 | Call Flux.Tracker.data() on ŷ for bce | 2018-06-26 14:48:51 -06:00
Matthew Kelley | ed032cdb1e | Change epsilon value to eps(ŷ) | 2018-06-26 12:29:06 -06:00
Matthew Kelley | e08fd7a6d2 | Added epsilon term to binarycrossentropy | 2018-06-26 11:43:16 -06:00
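The binarycrossentropy commits above add a small ϵ inside each log so that predictions of exactly 0 or 1 do not produce -Inf, with ϵ eventually taken as eps(ŷ). A sketch of that element-wise shape (the Tracker-related handling from the other two commits is omitted, and the exact expression is an assumption):

```julia
# Element-wise binary cross-entropy with an ϵ guard inside each log, so that
# ŷ == 0 or ŷ == 1 gives a large but finite loss. ϵ defaults to eps of ŷ.
binarycrossentropy(ŷ, y; ϵ=eps(ŷ)) = -y * log(ŷ + ϵ) - (1 - y) * log(1 - ŷ + ϵ)

binarycrossentropy(0.0f0, 1)   # finite, instead of Inf
binarycrossentropy(0.9f0, 1)   # ≈ 0.105f0
```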
Mike J Innes | 8f73dc6e14 | fix gpu cross entropy | 2018-04-17 17:56:47 +01:00
Mike J Innes | 8019f789f8 | use normal log | 2018-03-01 16:35:49 +00:00
Mike J Innes | ac57fc3c26 | use @fix in a few places | 2018-03-01 16:31:20 +00:00
boathit | 6e65789828 | Register back! for logsigmoid and implement (logit)binarycrossentropy | 2018-02-06 19:32:46 +08:00
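6e65789828's logitbinarycrossentropy follows from a little algebra: with ŷ = σ(z), the loss −y·log σ(z) − (1−y)·log(1−σ(z)) simplifies to (1−y)·z − logσ(z), which is numerically stable because σ(z) is never formed explicitly. A sketch along those lines, using NNlib's logsigmoid (the function body is an assumption, not the commit's exact code):

```julia
using NNlib: logsigmoid, sigmoid

# Binary cross-entropy computed from the logit z rather than the probability:
#   -y*log(σ(z)) - (1 - y)*log(1 - σ(z))  ==  (1 - y)*z - logσ(z)
logitbinarycrossentropy(z, y) = (1 - y) * z - logsigmoid(z)

z, y = 3.0, 1
logitbinarycrossentropy(z, y) ≈ -y * log(sigmoid(z)) - (1 - y) * log(1 - sigmoid(z))  # true
```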
Mike J Innes | e3a688e706 | use kwarg | 2017-12-13 15:27:15 +00:00
Elliot Saba | 41446d547f | Add weighted_crossentropy for imbalanced classification problems | 2017-12-05 17:09:05 -08:00
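41446d547f's weighted cross-entropy scales each class's contribution before averaging over the batch, the usual remedy for class imbalance; the adjacent "use kwarg" commit suggests the weight later moved into a keyword argument. A hedged sketch, assuming one-hot targets stored one column per sample (the name and signature are illustrative):

```julia
# Cross-entropy over a batch (one column per sample), with a per-class weight
# vector that up-weights rare classes; weight = 1 means no reweighting.
weighted_crossentropy(ŷ, y; weight = 1) = -sum(y .* log.(ŷ) .* weight) / size(y, 2)

ŷ = [0.7 0.2; 0.3 0.8]   # predicted probabilities, one column per sample
y = [1.0 0.0; 0.0 1.0]   # one-hot targets
weighted_crossentropy(ŷ, y; weight = [2.0, 1.0])  # class 1 errors count double
```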
Mike J Innes | dc1f08a709 | Merge pull request #98 from FluxML/log | 2017-11-23 17:17:39 +00:00
  GPU-ready log function
Mike J Innes | b06884b912 | LayerNorm tweaks | 2017-11-21 16:32:36 +01:00
skariel | 11d53781b2 | adding layer normalization | 2017-11-21 16:30:24 +01:00
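The layer-normalisation pair above (11d53781b2 plus the LayerNorm tweaks) normalises each sample across its features and then applies a learnable element-wise scale and shift. A self-contained sketch of that idea; the struct name, fields, and Float32 defaults are all illustrative rather than Flux's actual LayerNorm:

```julia
using Statistics

# Layer normalisation: normalise each sample (column) across its features,
# then apply a learnable element-wise scale g and shift b.
struct LayerNormSketch{T}
    g::Vector{T}   # scale
    b::Vector{T}   # shift
end
LayerNormSketch(n::Integer) = LayerNormSketch(ones(Float32, n), zeros(Float32, n))

function (l::LayerNormSketch)(x)
    μ = mean(x, dims=1)
    σ = std(x, dims=1, mean=μ, corrected=false)
    return l.g .* ((x .- μ) ./ σ) .+ l.b
end

LayerNormSketch(4)(randn(Float32, 4, 8))  # 4 features × 8-sample batch
```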
Mike J Innes | e0657d93ec | mv numeric.jl to nnlib | 2017-11-09 15:06:29 +00:00
Mike J Innes | 2cb94981a0 | gpu-ready log | 2017-11-09 15:04:01 +00:00
Mike J Innes | 23674b2555 | logitcrossentropy tweaks | 2017-10-17 17:58:32 +01:00
pevnak | 4aa7741ba9 | logit cross entropy | 2017-10-17 17:57:46 +01:00
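The logitcrossentropy pair above computes multi-class cross-entropy directly from raw scores via log-softmax, avoiding an explicit softmax followed by log. A hedged sketch, assuming one sample per column and NNlib's logsoftmax (the body is illustrative, not the commits' exact code):

```julia
using NNlib: logsoftmax, softmax

# Cross-entropy from logits: -sum(y .* logsoftmax(z)), averaged over the
# batch. Mathematically equal to crossentropy(softmax(z), y), but stabler.
logitcrossentropy(z, y) = -sum(y .* logsoftmax(z)) / size(y, 2)

z = [2.0 -1.0; 0.5 3.0]   # raw scores, one column per sample
y = [1.0 0.0; 0.0 1.0]    # one-hot targets
logitcrossentropy(z, y) ≈ -sum(y .* log.(softmax(z))) / size(y, 2)  # true
```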
Mike J Innes | 6dff8ca8d3 | rename crossentropy loss | 2017-10-17 17:36:18 +01:00
Mike J Innes | 949fd9ba97 | loss function tweaks | 2017-10-17 17:30:11 +01:00
Mike J Innes | f2052739c1 | tweaks | 2017-09-12 14:11:03 +01:00
Mike J Innes | 9ce0439943 | better mse | 2017-08-24 11:40:51 +01:00
Mike J Innes | e4e9794f5e | loss function gradients | 2017-08-23 17:50:43 +01:00
Mike J Innes | ef681f16ea | use nnlib for activations | 2017-08-21 17:53:04 +01:00
Mike J Innes | 18e69b33c9 | forwarddiff does these | 2017-08-19 22:05:50 +01:00
Mike J Innes | ad0e0ea5a7 | explicitly broadcast sigmoid | 2017-08-19 22:04:47 +01:00
Mike J Innes | 4a9dc40e7c | simplify organisation | 2017-08-19 20:52:29 +01:00