
Vashist Avadhanula

Fully Dynamic Online Selection through Online Contention Resolution Schemes

Jan 08, 2023

Bandits for Online Calibration: An Application to Content Moderation on Social Media Platforms

Nov 11, 2022

Top $K$ Ranking for Multi-Armed Bandit with Noisy Evaluations

Dec 14, 2021

QUEST: Queue Simulation for Content Moderation at Scale

Mar 31, 2021

Stochastic Bandits for Multi-platform Budget Optimization in Online Advertising

Mar 25, 2021

Improved Optimistic Algorithm For The Multinomial Logit Contextual Bandit

Nov 28, 2020
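The title above refers to an optimistic (upper-confidence-bound) algorithm. As general background only, and not the paper's MNL contextual method, a minimal textbook UCB1 sketch for the basic stochastic Bernoulli bandit looks like this (the function name and arm probabilities are illustrative):

```python
import math
import random

def ucb1(true_probs, horizon, seed=0):
    """Textbook UCB1 for Bernoulli arms: try each arm once, then repeatedly
    play the arm maximizing empirical mean + exploration bonus.
    This is classic background, not the MNL contextual algorithm."""
    rng = random.Random(seed)
    k = len(true_probs)
    counts = [0] * k      # pulls per arm
    sums = [0.0] * k      # cumulative reward per arm
    total = 0
    for t in range(1, horizon + 1):
        if t <= k:
            arm = t - 1   # initialization round: pull every arm once
        else:
            # Optimism: empirical mean plus a confidence-width bonus
            arm = max(range(k), key=lambda i: sums[i] / counts[i]
                      + math.sqrt(2.0 * math.log(t) / counts[i]))
        reward = 1 if rng.random() < true_probs[arm] else 0
        counts[arm] += 1
        sums[arm] += reward
        total += reward
    return total
```

Over a long horizon the pull counts concentrate on the highest-mean arm, which is the sense in which optimism drives exploration.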

Multi-armed Bandits with Cost Subsidy

Nov 13, 2020

Thompson Sampling for Contextual Bandit Problems with Auxiliary Safety Constraints

Nov 02, 2019

Thompson Sampling for the MNL-Bandit

Oct 31, 2018
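The entry above applies Thompson sampling to the MNL-bandit. As general background only, and not the paper's MNL algorithm, the classic Beta-Bernoulli Thompson sampling routine can be sketched as follows (function name and arm probabilities are illustrative):

```python
import random

def thompson_sampling(true_probs, horizon, seed=0):
    """Classic Beta-Bernoulli Thompson sampling (textbook background,
    not the MNL-Bandit variant from the paper above)."""
    rng = random.Random(seed)
    k = len(true_probs)
    successes = [1] * k   # Beta(1, 1) uniform prior per arm
    failures = [1] * k
    total = 0
    for _ in range(horizon):
        # Draw one plausible mean per arm from its Beta posterior,
        # then play the arm whose sample is largest.
        samples = [rng.betavariate(successes[i], failures[i]) for i in range(k)]
        arm = max(range(k), key=lambda i: samples[i])
        reward = 1 if rng.random() < true_probs[arm] else 0
        total += reward
        if reward:
            successes[arm] += 1
        else:
            failures[arm] += 1
    return total
```

As the posteriors sharpen, the sampled means of suboptimal arms rarely exceed that of the best arm, so play concentrates on it without any explicit exploration schedule.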

MNL-Bandit: A Dynamic Learning Approach to Assortment Selection

Jun 29, 2018