Yingyan Celine

From Inference Efficiency to Embodied Efficiency: Revisiting Efficiency Metrics for Vision-Language-Action Models

Mar 19, 2026

Report for NSF Workshop on AI for Electronic Design Automation

Jan 20, 2026

Fewer Denoising Steps or Cheaper Per-Step Inference: Towards Compute-Optimal Diffusion Model Deployment

Aug 08, 2025

CLIMB: CLustering-based Iterative Data Mixture Bootstrapping for Language Model Pre-training

Apr 17, 2025

Early-Bird Diffusion: Investigating and Leveraging Timestep-Aware Early-Bird Tickets in Diffusion Models for Efficient Training

Apr 13, 2025

MG-Verilog: Multi-grained Dataset Towards Enhanced LLM-assisted Verilog Generation

Jul 02, 2024

ShiftAddLLM: Accelerating Pretrained LLMs via Post-Training Multiplication-Less Reparameterization

Jun 11, 2024

When Linear Attention Meets Autoregressive Decoding: Towards More Effective and Efficient Linearized Large Language Models

Jun 11, 2024

MixRT: Mixed Neural Representations For Real-Time NeRF Rendering

Dec 20, 2023

NetDistiller: Empowering Tiny Deep Learning via In-Situ Distillation

Oct 24, 2023