Ganesh Ghalme

Simultaneously Achieving Group Exposure Fairness and Within-Group Meritocracy in Stochastic Bandits

Feb 08, 2024

Mitigating Disparity while Maximizing Reward: Tight Anytime Guarantee for Improving Bandits

Aug 19, 2022

Strategic Representation

Jun 17, 2022

Efficient Algorithms For Fair Clustering with a New Fairness Notion

Sep 03, 2021

Sleeping Combinatorial Bandits

Jun 03, 2021

Strategic Classification in the Dark

Mar 06, 2021

State-Visitation Fairness in Average-Reward MDPs

Mar 02, 2021

Ballooning Multi-Armed Bandits

Jan 24, 2020

Achieving Fairness in the Stochastic Multi-armed Bandit Problem

Jul 23, 2019

Stochastic Multi-armed Bandits with Arm-specific Fairness Guarantees

May 27, 2019