
Alberto Delmas

DPRed: Making Typical Activation Values Matter In Deep Learning Computing

May 15, 2018

Bit-Tactical: Exploiting Ineffectual Computations in Convolutional Neural Networks: Which, Why, and How

Mar 09, 2018

Tartan: Accelerating Fully-Connected and Convolutional Layers in Deep Learning Networks by Exploiting Numerical Precision Variability

Jul 27, 2017

Dynamic Stripes: Exploiting the Dynamic Precision Requirements of Activation Values in Neural Networks

Jun 01, 2017

Cnvlutin2: Ineffectual-Activation-and-Weight-Free Deep Neural Network Computing

Apr 29, 2017