Naonori Kakimura

New classes of the greedy-applicable arm feature distributions in the sparse linear bandit problem
Dec 19, 2023

Reforming an Envy-Free Matching
Jul 06, 2022

Online Task Assignment Problems with Reusable Resources
Mar 15, 2022

Near-Optimal Regret Bounds for Contextual Combinatorial Semi-Bandits with Linear Payoff Functions
Jan 20, 2021

Approximability of Monotone Submodular Function Maximization under Cardinality and Matroid Constraints in the Streaming Model
Feb 13, 2020

Causal Bandits with Propagating Inference
Jun 06, 2018