
Aditya Mate

A resource-constrained stochastic scheduling algorithm for homeless street outreach and gleaning edible food

Mar 15, 2024

Improved Policy Evaluation for Randomized Trials of Algorithmic Resource Allocation

Feb 06, 2023

Decision-Focused Evaluation: Analyzing Performance of Deployed Restless Multi-Arm Bandits

Jan 19, 2023

Decision-Focused Learning in Restless Multi-Armed Bandits with Application to Maternal and Child Care Domain

Feb 02, 2022

Field Study in Deploying Restless Multi-Armed Bandits: Assisting Non-Profits in Improving Maternal and Child Health

Sep 16, 2021

Selective Intervention Planning using Restless Multi-Armed Bandits to Improve Maternal and Child Health Outcomes

Apr 05, 2021

Efficient Algorithms for Finite Horizon and Streaming Restless Multi-Armed Bandit Problems

Mar 08, 2021

Collapsing Bandits and Their Application to Public Health Interventions

Jul 05, 2020

Decision-Focused Learning of Adversary Behavior in Security Games

Mar 03, 2019