
Xuran Meng

Initialization Matters: On the Benign Overfitting of Two-Layer ReLU CNN with Fully Trainable Layers

Oct 24, 2024

Benign Overfitting in Two-Layer ReLU Convolutional Neural Networks for XOR Data

Oct 03, 2023

Per-Example Gradient Regularization Improves Learning Signals from Noisy Data

Mar 31, 2023

Multiple Descent in the Multiple Random Feature Model

Aug 21, 2022

Implicit Data-Driven Regularization in Deep Neural Networks under SGD

Nov 26, 2021