
Jie Peng

Optimal Complexity in Byzantine-Robust Distributed Stochastic Optimization with Data Heterogeneity

Mar 20, 2025

GRNFormer: A Biologically-Guided Framework for Integrating Gene Regulatory Networks into RNA Foundation Models

Mar 03, 2025

Symbiotic Cooperation for Web Agents: Harnessing Complementary Strengths of Large and Small LLMs

Feb 11, 2025

TGB-Seq Benchmark: Challenging Temporal GNNs with Complex Sequential Dynamics

Feb 05, 2025

Continually Evolved Multimodal Foundation Models for Cancer Prognosis

Jan 30, 2025

Dialogue is Better Than Monologue: Instructing Medical LLMs via Strategical Conversations

Jan 29, 2025

Layer-Level Self-Exposure and Patch: Affirmative Token Mitigation for Jailbreak Attack Defense

Jan 05, 2025

Harnessing Your DRAM and SSD for Sustainable and Accessible LLM Inference with Mixed-Precision and Multi-level Caching

Oct 23, 2024

Flex-MoE: Modeling Arbitrary Modality Combination via the Flexible Mixture-of-Experts

Oct 10, 2024

Glider: Global and Local Instruction-Driven Expert Router

Oct 09, 2024