Flux.jl/src
Name            Last commit message                         Date
backend/        handle state in training                    2016-10-30 00:24:29 +01:00
compiler/       consistent naming                           2016-10-30 00:19:57 +01:00
dims/           BatchSeq convenience alias                  2016-10-26 12:37:48 +01:00
layers/         this is no longer test code                 2016-10-29 00:13:32 +01:00
activation.jl   basic convnet example working               2016-09-06 18:11:15 +01:00
cost.jl         revive basic train code                     2016-09-06 18:10:20 +01:00
data.jl         length for iterators                        2016-10-30 01:18:20 +01:00
Flux.jl         batched training for char-rnn               2016-10-29 23:36:39 +01:00
model.jl        use param object rather than named input    2016-10-25 17:57:20 +01:00
utils.jl        batched training for char-rnn               2016-10-29 23:36:39 +01:00