Dithered backprop: A sparse and quantized backpropagation algorithm for more efficient deep neural network training

Apr 16, 2020
Figures 1–4 from the paper.
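As a rough illustration of the idea named in the title, the sketch below shows non-subtractive dithered quantization applied to a small gradient tensor: uniform noise is added before rounding, which makes the quantizer unbiased in expectation and rounds most small entries to exactly zero, yielding a sparse, low-precision signal. This is only a minimal sketch under assumptions; the function name dithered_quantize, the step size delta, and the use of NumPy are illustrative choices and not the authors' implementation.

import numpy as np

def dithered_quantize(x, delta):
    # Add uniform dither in (-delta/2, delta/2) before rounding to a grid of
    # step `delta`. The dither makes the rounding unbiased in expectation, and
    # entries with magnitude below delta/2 round to exactly zero most of the
    # time, so the output is both quantized and sparse.
    dither = np.random.uniform(-delta / 2.0, delta / 2.0, size=x.shape)
    return delta * np.round((x + dither) / delta)

# Toy usage on a small "gradient" tensor (values and step size are arbitrary).
grad = np.random.randn(4, 4) * 0.01
q_grad = dithered_quantize(grad, delta=0.05)
print("sparsity:", float(np.mean(q_grad == 0.0)))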


View paper on arXiv
