Esmaeil Keyvanshokooh

HR-Bandit: Human-AI Collaborated Linear Recourse Bandit

Oct 18, 2024

Online Uniform Risk Times Sampling: First Approximation Algorithms, Learning Augmentation with Full Confidence Interval Integration

Feb 07, 2024

Contextual Bandits with Budgeted Information Reveal

May 29, 2023