Update docs/src/performance.md
Co-Authored-By: Kristoffer Carlsson <kristoffer.carlsson@chalmers.se>
parent fc4827c48f
commit fe759ac43c
@@ -26,7 +26,7 @@ A very artificial example using an activation function like
 will result in performance on `Float32` input orders of magnitude slower than the normal `tanh` would,
 because it results in having to use slow mixed type multiplication in the dense layers.
-Similar can occur in the loss function during backpropagation.
+Similar situations can occur in the loss function during backpropagation.
 
 Which means if you change your data say from `Float64` to `Float32` (which should give a speedup: see above),
 you will see a large slow-down
 
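The hunk above is about mixed-type promotion: a `Float64` value leaking into an otherwise `Float32` computation silently upcasts everything downstream, forcing the slow mixed-type multiplications the docs warn about. A minimal sketch of the same promotion rule, written in NumPy as an analogy to the Julia/Flux behaviour (the arrays and the `0.01` constant here are hypothetical, not from the docs):

```python
import numpy as np

# Hypothetical float32 activations, standing in for Float32 layer outputs.
x = np.ones(3, dtype=np.float32)

# A float64 constant leaking in, e.g. a literal inside a custom activation
# or loss function.
scale = np.full(3, 0.01, dtype=np.float64)

# Mixing the two promotes the whole result to float64.
print((x * scale).dtype)  # float64 -- silent promotion

# Keeping every operand float32 preserves the narrow type.
print((x * x).dtype)      # float32
```

The fix in both ecosystems is the same in spirit: make constants match the element type of the data (in Julia, e.g. `0.01f0` or `oftype(x, 0.01)`), so no promotion occurs.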