1023: Feature: Added Boston Housing Dataset r=CarloLucibello a=pranjaldatta
[Boston Housing Dataset](https://archive.ics.uci.edu/ml/machine-learning-databases/housing/) is one of the most common datasets used by beginners, roughly as popular as Iris. Hence it feels natural for it to be part of Flux.
Added src/data/housing.jl: code for downloading and loading the dataset.
Edited src/data/Data.jl: to include and export the Housing module.
Edited test/data.jl: added tests for the dataset.
*All tests in test/data.jl are passing*
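A minimal sketch of the expected usage, assuming the new module follows the accessor conventions of the other `Flux.Data` loaders (the function names below illustrate that convention and are not confirmed from the diff):

```julia
using Flux

# Hypothetical accessors, mirroring other Flux.Data loaders; the actual
# names are defined in src/data/housing.jl.
x = Flux.Data.Housing.features()  # feature matrix (506 samples × 13 features)
y = Flux.Data.Housing.targets()   # median house price for each sample
```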
Co-authored-by: pranjaldatta <pranjaldatta99@gmail.com>
Co-authored-by: Pranjal Datta <pranjaldatta99@gmail.com>
998: test restructure on the GPU r=CarloLucibello a=ChrisRackauckas
Requires https://github.com/FluxML/Zygote.jl/pull/474 to pass
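For reference, a minimal sketch of the scenario under test, assuming the usual `Flux.destructure` rebuilder API and a CUDA-capable setup:

```julia
using Flux, CUDA

# Flatten a model's parameters, rebuild the model from the flat vector,
# and differentiate through the rebuilt model on the GPU.
m = Dense(10, 5, relu) |> gpu
p, re = Flux.destructure(m)            # flat parameter vector + rebuilder
x = CUDA.rand(Float32, 10)
gs = gradient(p -> sum(re(p)(x)), p)   # gradient w.r.t. the flat parameters
```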
Co-authored-by: Chris Rackauckas <accounts@chrisrackauckas.com>
960: Added utility function outdims to compute output dimensions of a layer r=dhairyagandhi96 a=darsnack
Based on Slack chatter, I added a utility function, `outdims`, that computes the output dimensions for given input dimensions.
Example
```julia
layer = Conv((3, 3), 3 => 16)
outdims(layer, (10, 10)) # returns (8, 8)
```
Co-authored-by: Kyle Daruwalla <daruwalla@wisc.edu>
680: Added new loss functions. r=thebhatman a=thebhatman
I have added the KL divergence, Poisson, log-cosh, and hinge loss functions.
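A rough sketch of calling the new losses; the names follow the PR description and may differ between Flux versions:

```julia
using Flux

# Two discrete distributions, one per column (illustrative data).
ŷ = softmax(rand(Float32, 3, 5))   # predictions
y = softmax(rand(Float32, 3, 5))   # targets

Flux.kldivergence(ŷ, y)            # KL divergence between the distributions
Flux.poisson(ŷ, y)                 # Poisson loss, for count-valued targets
Flux.logcosh(ŷ, y)                 # log-cosh, a smooth L1-like loss

# Hinge loss expects ±1-valued targets.
Flux.hinge(randn(Float32, 1, 5), sign.(randn(Float32, 1, 5)))
```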
Co-authored-by: Manjunath Bhat <manjunathbhat9920@gmail.com>
Co-authored-by: thebhatman <manjunathbhat9920@gmail.com>
937: Fix Glorot initialization, add He initialization r=MikeInnes a=Sleort
Should fix #442.
Adds He weight initialization as a bonus :-)
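For context, a minimal sketch of the two schemes under their textbook definitions (the names and the fan computation here are simplified illustrations, not the PR's exact code):

```julia
# Simplified fan computation for 2-D weight matrices of size (out, in).
fan(dims...) = (dims[end], dims[1])  # (fan_in, fan_out)

# Glorot/Xavier: uniform on ±sqrt(6 / (fan_in + fan_out)).
glorot(dims...) = (rand(Float32, dims...) .- 0.5f0) .* sqrt(24f0 / sum(fan(dims...)))

# He/Kaiming: normal with standard deviation sqrt(2 / fan_in), suited to ReLU nets.
he(dims...) = randn(Float32, dims...) .* sqrt(2f0 / fan(dims...)[1])
```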
Co-authored-by: Troels Arnfred Bojesen <tr-ab@online.no>
932: Travis: test on 1.0 r=MikeInnes a=MikeInnes
Co-authored-by: Mike J Innes <mike.j.innes@gmail.com>
Co-authored-by: Mike Innes <mike.j.innes@gmail.com>
865: Functor r=MikeInnes a=MikeInnes
This refactors our current `@treelike` infrastructure. It somewhat formalises what we're doing around the idea of a Flux model as a functor, i.e. something that can be mapped over.
This is much more flexible than what we had before, and avoids some issues. It allows layers to have state that isn't mappable; it allows for dispatch when walking the tree, which means layers like `BatchNorm` can have non-trainable parameters; and it also allows for zipped mapping like `fmap(+, xs, ys)`, which isn't implemented yet but will be useful for the new optimisers work.
The main downside is that the term `functor` has previously been used in the Julia community as a malapropism for "thing that behaves like a function"; hopefully this change can start to reduce that usage.
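A small sketch of the resulting API, using `@functor` and `fmap` (the custom layer here is illustrative):

```julia
using Flux
using Flux: @functor, fmap

struct Affine
  W
  b
end
@functor Affine   # declare Affine a functor: its fields can be mapped over

(a::Affine)(x) = a.W * x .+ a.b

m = Affine(randn(3, 3), randn(3))
m32 = fmap(x -> Float32.(x), m)   # walk the model tree, converting every array
```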
Co-authored-by: Mike Innes <mike.j.innes@gmail.com>