Joey Tianyi Zhou

DiffPO: Diffusion-styled Preference Optimization for Efficient Inference-Time Alignment of Large Language Models

Mar 06, 2025

LLM Knows Geometry Better than Algebra: Numerical Understanding of LLM-Based Agents in A Trading Arena

Feb 25, 2025

Dark Distillation: Backdooring Distilled Datasets without Accessing Raw Data

Feb 06, 2025

KPL: Training-Free Medical Knowledge Mining of Vision-Language Models

Jan 20, 2025

MedCoT: Medical Chain of Thought via Hierarchical Expert

Dec 18, 2024

PVP: Polar Representation Boost for 3D Semantic Occupancy Prediction

Dec 10, 2024

Video Set Distillation: Information Diversification and Temporal Densification

Nov 28, 2024

The Best of Both Worlds: On the Dilemma of Out-of-distribution Detection

Oct 12, 2024

Diversity-Driven Synthesis: Enhancing Dataset Distillation through Directed Weight Adjustment

Sep 26, 2024

Breaking Class Barriers: Efficient Dataset Distillation via Inter-Class Feature Compensator

Aug 13, 2024