
Lin Ge

Multi-Task Combinatorial Bandits for Budget Allocation

Aug 31, 2024
Figures 1–4

Large Language Model for Causal Decision Making

Dec 29, 2023

A Reinforcement Learning Framework for Dynamic Mediation Analysis

Jan 31, 2023

Towards Scalable and Robust Structured Bandits: A Meta-Learning Framework

Feb 26, 2022
Figures 1–4

Exploratory Hidden Markov Factor Models for Longitudinal Mobile Health Data: Application to Adverse Posttraumatic Neuropsychiatric Sequelae

Feb 25, 2022
Figures 1–4

Metadata-based Multi-Task Bandits with Bayesian Hierarchical Models

Aug 13, 2021
Figures 1–3