
Arindam Khan

Mitigating Disparity while Maximizing Reward: Tight Anytime Guarantee for Improving Bandits

Aug 19, 2022

Fairness and Welfare Quantification for Regret in Multi-Armed Bandits

May 27, 2022

Approximation Algorithms for ROUND-UFP and ROUND-SAP

Feb 07, 2022

Streaming Algorithms for Stochastic Multi-armed Bandits

Dec 09, 2020