
Boqian Wu

Dynamic Sparse Training versus Dense Training: The Unexpected Winner in Image Corruption Robustness

Oct 03, 2024

Are Sparse Neural Networks Better Hard Sample Learners?

Sep 13, 2024

Dynamic Data Pruning for Automatic Speech Recognition

Jun 26, 2024

E2ENet: Dynamic Sparse Feature Fusion for Accurate and Efficient 3D Medical Image Segmentation

Dec 07, 2023

Dynamic Sparse Network for Time Series Classification: Learning What to "See"

Dec 19, 2022

More ConvNets in the 2020s: Scaling up Kernels Beyond 51x51 using Sparsity

Jul 07, 2022