Pierre Perrault

Limitations of (Procrustes) Alignment in Assessing Multi-Person Human Pose and Shape Estimation

Sep 25, 2024

Model-free Posterior Sampling via Learning Rate Randomization

Oct 27, 2023

Demonstration-Regularized RL

Oct 26, 2023

Fast Rates for Maximum Entropy Exploration

Mar 14, 2023

When Combinatorial Thompson Sampling meets Approximation Regret

Feb 22, 2023

Statistical Efficiency of Thompson Sampling for Combinatorial Semi-Bandits

Jun 11, 2020

Active Linear Regression

Jun 20, 2019

Exploiting Structure of Uncertainty for Efficient Combinatorial Semi-Bandits

Feb 11, 2019

Finding the Bandit in a Graph: Sequential Search-and-Stop

Oct 10, 2018