
Thibault Formal

Provence: efficient and robust context pruning for retrieval-augmented generation

Jan 27, 2025

Retrieval-augmented generation in multilingual settings

Jul 01, 2024

BERGEN: A Benchmarking Library for Retrieval-Augmented Generation

Jul 01, 2024

SPLATE: Sparse Late Interaction Retrieval

Apr 22, 2024

A Thorough Comparison of Cross-Encoders and LLMs for Reranking SPLADE

Mar 15, 2024

SPLADE-v3: New baselines for SPLADE

Mar 11, 2024

Benchmarking Middle-Trained Language Models for Neural Search

Jun 05, 2023

Query Performance Prediction for Neural IR: Are We There Yet?

Feb 20, 2023

CoSPLADE: Contextualizing SPLADE for Conversational Information Retrieval

Jan 11, 2023

From Distillation to Hard Negative Sampling: Making Sparse Neural IR Models More Effective

May 12, 2022