Flux.jl/src (last commit 2016-10-28 16:06:56 +01:00)

backend/        better handling for reused params          2016-10-28 16:06:56 +01:00
compiler/       fix for matmul                             2016-10-28 15:02:48 +01:00
dims/           BatchSeq convenience alias                 2016-10-26 12:37:48 +01:00
layers/         refactor input model                       2016-10-25 23:10:35 +01:00
activation.jl   basic convnet example working              2016-09-06 18:11:15 +01:00
cost.jl         revive basic train code                    2016-09-06 18:10:20 +01:00
Flux.jl         fix unrolling                              2016-10-26 00:49:32 +01:00
model.jl        use param object rather than named input   2016-10-25 17:57:20 +01:00
utils.jl        use Float32 here                           2016-10-25 16:23:04 +01:00