Namhoon Lee

Rethinking Pruning Large Language Models: Benefits and Pitfalls of Reconstruction Error Minimization

Jun 21, 2024

The Effects of Overparameterization on Sharpness-aware Minimization: An Empirical and Theoretical Analysis

Nov 29, 2023

FedFwd: Federated Learning without Backpropagation

Sep 03, 2023

JaxPruner: A concise library for sparsity research

May 02, 2023

A Closer Look at the Intervention Procedure of Concept Bottleneck Models

Feb 28, 2023

MaskedKD: Efficient Distillation of Vision Transformers with Masked Images

Feb 21, 2023

SpReME: Sparse Regression for Multi-Environment Dynamic Systems

Feb 12, 2023

Meta-Learning Sparse Implicit Neural Representations

Nov 07, 2021

Data Parallelism in Training Sparse Neural Networks

Mar 25, 2020

A Signal Propagation Perspective for Pruning Neural Networks at Initialization

Jun 14, 2019