Jingbo Jiang

Tight Compression: Compressing CNN Through Fine-Grained Pruning and Weight Permutation for Efficient Implementation

Apr 03, 2021

A Reconfigurable Winograd CNN Accelerator with Nesting Decomposition Algorithm for Computing Convolution with Large Filters

Feb 26, 2021

A Comparison of the Taguchi Method and Evolutionary Optimization in Multivariate Testing

Aug 25, 2018

SparseNN: An Energy-Efficient Neural Network Accelerator Exploiting Input and Output Sparsity

Nov 03, 2017