Adaptive Conformal Inference Under Distribution Shift

Authors: Isaac Gibbs, Emmanuel Candès

NeurIPS 2021

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | We test our method, adaptive conformal inference, on two real world datasets and find that its predictions are robust to visible and significant distribution shifts.
Researcher Affiliation | Academia | Isaac Gibbs, Department of Statistics, Stanford University, igibbs@stanford.edu; Emmanuel J. Candès, Department of Statistics and Department of Mathematics, Stanford University, candes@stanford.edu
Pseudocode | No | The paper describes algorithmic steps (e.g., the update α_{t+1} := α_t + γ(α − err_t)) but does not present them within a clearly labeled 'Algorithm' or 'Pseudocode' block. (A sketch of this update follows the table.)
Open Source Code | No | The paper does not include any statement about releasing source code or provide links to a code repository for the described methodology.
Open Datasets | Yes | Daily open prices were obtained from publicly available datasets published by The Wall Street Journal.
Dataset Splits | No | The paper mentions a 'calibration set D_cal' and dynamic data usage (e.g., 'fit the model using only the last 1250 trading days') but does not specify fixed train/validation/test splits with percentages or sample counts.
Hardware Specification | No | The paper does not specify any hardware details (e.g., GPU/CPU models, memory) used for running the experiments.
Software Dependencies | No | The paper mentions statistical models and methods (e.g., 'GARCH(1,1) model', 'conformalized quantile regression (CQR)') but does not list specific software packages or libraries with version numbers used for implementation.
Experiment Setup | Yes | More precisely, for all times t > 1250 we fit the coefficients ω̂_t, α̂_t, β̂_t as well as the sequence of variances {σ̂^t_s}_{1 ≤ s ≤ t−1} using only the data {R_r}_{t−1250 ≤ r < t}. (A rolling-fit sketch follows the table.)
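
The update quoted in the Pseudocode row is the paper's online recursion α_{t+1} := α_t + γ(α − err_t), which lowers the working miscoverage level after a miss (widening the next prediction set) and raises it after a covered step. Below is a minimal sketch of that recursion on a stream of conformity scores; the function name, the trailing-window empirical quantile, the clipping of the quantile level, the step size γ = 0.005, and the warm-up length are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def aci_online(scores, alpha=0.1, gamma=0.005, warmup=100):
    """Track miscoverage online with the ACI recursion on a 1-D stream of conformity scores."""
    alpha_t = alpha                                   # current working miscoverage target
    errs = []
    for t in range(warmup, len(scores)):
        past = scores[:t]                             # conformity scores observed so far
        level = min(max(1.0 - alpha_t, 0.0), 1.0)     # clip to a valid quantile level (sketch-only simplification)
        q_t = np.quantile(past, level)                # conformal threshold at time t
        err_t = float(scores[t] > q_t)                # err_t = 1 when the new point falls outside the set
        errs.append(err_t)
        alpha_t = alpha_t + gamma * (alpha - err_t)   # alpha_{t+1} = alpha_t + gamma * (alpha - err_t)
    return np.array(errs)

# Toy usage: realized miscoverage should stay near alpha even after the score scale shifts.
rng = np.random.default_rng(0)
stream = np.abs(np.concatenate([rng.normal(0, 1, 2000), rng.normal(0, 3, 2000)]))
print("empirical miscoverage:", aci_online(stream).mean())
```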
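
The Experiment Setup row quotes a rolling scheme in which, for every t > 1250, the GARCH(1,1) coefficients ω̂_t, α̂_t, β̂_t are refit on the previous 1250 daily returns. The sketch below mirrors that rolling window; the use of the `arch` package and the normalized score |R_t| / σ̂_t are assumptions for illustration, since the quoted passage names neither the software stack nor the exact conformity score.

```python
import numpy as np
from arch import arch_model  # assumed dependency; the paper does not name its software

def rolling_garch_scores(returns, window=1250):
    """Refit a GARCH(1,1) model on the trailing `window` daily returns for each t > window."""
    scores = []
    for t in range(window, len(returns)):
        past = returns[t - window:t]                  # {R_r : t - window <= r < t}
        res = arch_model(past, vol="GARCH", p=1, q=1).fit(disp="off")
        # One-step-ahead volatility forecast sigma_hat_t from the freshly refit model.
        sigma_hat = float(np.sqrt(res.forecast(horizon=1).variance.values[-1, 0]))
        # Illustrative normalized score; not necessarily the paper's exact conformity score.
        scores.append(abs(returns[t]) / sigma_hat)
    return np.array(scores)
```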