Yongliang Shen

InftyThink: Breaking the Length Limits of Long-Context Reasoning in Large Language Models

Mar 09, 2025

Think Twice, Click Once: Enhancing GUI Grounding via Fast and Slow Systems

Mar 09, 2025

DB-Explore: Automated Database Exploration and Instruction Synthesis for Text-to-SQL

Mar 06, 2025

AskToAct: Enhancing LLMs Tool Use via Self-Correcting Clarification

Mar 03, 2025

STaR-SQL: Self-Taught Reasoner for Text-to-SQL

Feb 19, 2025

MathFimer: Enhancing Mathematical Reasoning by Expanding Reasoning Steps through Fill-in-the-Middle Task

Feb 17, 2025

2.5 Years in Class: A Multimodal Textbook for Vision-Language Pretraining

Jan 03, 2025

MAKIMA: Tuning-free Multi-Attribute Open-domain Video Editing via Mask-Guided Attention Modulation

Dec 28, 2024

GaVaMoE: Gaussian-Variational Gated Mixture of Experts for Explainable Recommendation

Oct 15, 2024

Entering Real Social World! Benchmarking the Theory of Mind and Socialization Capabilities of LLMs from a First-person Perspective

Oct 08, 2024