Yuhang Zhou

DivIL: Unveiling and Addressing Over-Invariance for Out-of-Distribution Generalization

Feb 18, 2025

MergeME: Model Merging Techniques for Homogeneous and Heterogeneous MoEs

Feb 04, 2025

Continual Task Learning through Adaptive Policy Self-Composition

Nov 18, 2024

Task-Aware Harmony Multi-Task Decision Transformer for Offline Reinforcement Learning

Nov 02, 2024

Revisiting SLO and Goodput Metrics in LLM Serving

Oct 18, 2024

LoRKD: Low-Rank Knowledge Decomposition for Medical Foundation Models

Sep 29, 2024

Reprogramming Distillation for Medical Foundation Models

Jul 09, 2024

Multimodal Graph Benchmark

Jun 24, 2024

Multi-Stage Balanced Distillation: Addressing Long-Tail Challenges in Sequence-Level Knowledge Distillation

Jun 19, 2024

Exploring Training on Heterogeneous Data with Mixture of Low-rank Adapters

Jun 14, 2024