An effective framework for estimating individualized treatment rules

Authors: Joowon Lee, Jared Huling, Guanhua Chen

NeurIPS 2024 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

| Reproducibility Variable | Result | LLM Response |
| --- | --- | --- |
| Research Type | Experimental | Extensive simulations and applications demonstrate that our framework achieves significant gains in both robustness and effectiveness for ITR learning against existing methods. |
| Researcher Affiliation | Academia | 1 University of Wisconsin-Madison, 2 University of Minnesota; joowon.lee@wisc.edu, huling@umn.edu, gchen25@wisc.edu |
| Pseudocode | Yes | Algorithm 1: Projected Gradient Descent Algorithm to Estimate Decision Function for ITR-Learning (a generic PGD sketch follows the table). |
| Open Source Code | Yes | The code supporting this study is available at https://github.com/ljw9510/effective-ITR, with plans for release as an R package soon. |
| Open Datasets | Yes | We apply the proposed methods to two datasets from AIDS Clinical Trials Group (ACTG) 175 [21] and email marketing [22]. |
| Dataset Splits | Yes | Similar to [44], we randomly split the data into a training set of {200, 400, 800, 1000, 1200} observations for the ACTG dataset and {1000, 3000, 5000} observations for the email dataset. The remaining observations were used as test data, with 10 iterations. |
| Hardware Specification | Yes | All numerical experiments were performed on a 2022 MacBook Air with an M1 chip and 16 GB of RAM. |
| Software Dependencies | No | The paper mentions the randomForest package in R but does not specify version numbers for R or the package itself, which are required for a reproducible description of software dependencies. |
| Experiment Setup | Yes | Specifically, we use the following treatment-free effect function µ(X) and interaction effect function δ(X) for each scenario. 1. Randomized Trial, linear ITR as the true optimal: µ(X) = 1 + 2X1 + 2X2; δ(X) = 0.75 + 1.5X1 + 1.5X2 + 1.5X3 + 1.5X4 for A = 1; 0.75 + 1.5X1 - 1.5X2 - 1.5X3 + 1.5X4 for A = 2; 0.75 + 1.5X1 - 1.5X2 + 1.5X3 - 1.5X4 for A = 3; 0.75 - 1.5X1 + 1.5X2 - 1.5X3 - 1.5X4 for A = 4; ... The number of iterations T of the PGD algorithm is 1000. (A simulation sketch for this scenario follows the table.) |
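
The Pseudocode row refers to a projected gradient descent (PGD) algorithm for estimating the decision function. The sketch below shows only the generic PGD structure for a linear decision function f(x) = x @ beta; the loss gradient `grad_fn`, the step size, and the Euclidean-ball projection are placeholders assumed for illustration and are not the specific choices of the paper's Algorithm 1.

```python
import numpy as np

def project_l2_ball(beta, radius=1.0):
    # Hypothetical projection onto a Euclidean ball; the constraint set used by
    # the paper's Algorithm 1 is not specified in this excerpt.
    norm = np.linalg.norm(beta)
    return beta if norm <= radius else beta * (radius / norm)

def pgd_decision_function(grad_fn, dim, T=1000, step_size=0.01, radius=1.0, seed=0):
    # Generic projected gradient descent for a linear decision function f(x) = x @ beta.
    # grad_fn(beta) must return the gradient of the chosen ITR-learning loss at beta;
    # the paper's specific loss is not reproduced here.
    rng = np.random.default_rng(seed)
    beta = rng.normal(scale=0.1, size=dim)
    for _ in range(T):  # T = 1000 matches the iteration count quoted in the table
        beta = beta - step_size * grad_fn(beta)   # gradient step
        beta = project_l2_ball(beta, radius)      # projection step
    return beta

# Usage with a placeholder quadratic loss, just to exercise the loop:
target = np.array([2.0, -1.0, 0.5, 0.0])
beta_hat = pgd_decision_function(lambda b: 2 * (b - target), dim=4)
```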
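
The Experiment Setup row quotes µ(X) and δ(X) for the randomized-trial scenario. As a rough illustration only, the sketch below generates data from that scenario; the covariate distribution, the uniform treatment randomization over {1, 2, 3, 4}, and the additive outcome model Y = µ(X) + δ(X, A) + noise are assumptions not stated in the excerpt.

```python
import numpy as np

def simulate_randomized_trial(n=1000, seed=0):
    # Assumed for illustration: X ~ Uniform(-1, 1), A uniform on {1, 2, 3, 4},
    # and Y = mu(X) + delta(X, A) + N(0, 1) noise. Only mu and delta below
    # come from the quoted experiment setup.
    rng = np.random.default_rng(seed)
    X = rng.uniform(-1, 1, size=(n, 4))
    A = rng.integers(1, 5, size=n)

    mu = 1 + 2 * X[:, 0] + 2 * X[:, 1]  # treatment-free effect mu(X)

    # Sign pattern of the interaction effect delta(X) for arms A = 1..4,
    # matching the piecewise definition quoted in the Experiment Setup row.
    signs = np.array([[ 1,  1,  1,  1],
                      [ 1, -1, -1,  1],
                      [ 1, -1,  1, -1],
                      [-1,  1, -1, -1]], dtype=float)
    delta = 0.75 + 1.5 * np.sum(signs[A - 1] * X, axis=1)

    Y = mu + delta + rng.normal(size=n)
    return X, A, Y
```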