
Caiming Xiong

Salesforce AI Research

Spider 2.0: Evaluating Language Models on Real-World Enterprise Text-to-SQL Workflows

Nov 12, 2024

BLIP3-KALE: Knowledge Augmented Large-Scale Dense Captions

Nov 12, 2024

CodeTree: Agent-guided Tree Search for Code Generation with Large Language Models

Nov 07, 2024

Language Models are Hidden Reasoners: Unlocking Latent Reasoning Capabilities via Self-Rewarding

Nov 06, 2024

CRMArena: Understanding the Capacity of LLM Agents to Perform Professional CRM Tasks in Realistic Environments

Nov 04, 2024

JudgeRank: Leveraging Large Language Models for Reasoning-Intensive Reranking

Oct 31, 2024

Asynchronous Tool Usage for Real-Time Agents

Oct 28, 2024

PRACT: Optimizing Principled Reasoning and Acting of LLM Agent

Oct 24, 2024

Distill-SynthKG: Distilling Knowledge Graph Synthesis Workflow for Improved Coverage and Efficiency

Oct 22, 2024

xGen-MM-Vid (BLIP-3-Video): You Only Need 32 Tokens to Represent a Video Even in VLMs

Oct 21, 2024