Localized Adaptive Risk Control
Authors: Matteo Zecchin, Osvaldo Simeone
NeurIPS 2024 | Conference PDF | Archive PDF | Plain Text | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | In Section 3 we showcase the superior conditional risk control properties of L-ARC as compared to ARC for the tasks of electricity demand forecasting, tumor segmentation, and beam selection in wireless networks. |
| Researcher Affiliation | Academia | Matteo Zecchin, Osvaldo Simeone, Centre for Intelligent Information Processing Systems, Department of Engineering, King's College London, London, United Kingdom. {matteo.1.zecchin,osvaldo.simeone}@kcl.ac.uk |
| Pseudocode | No | The paper does not contain structured pseudocode or algorithm blocks. The update rules are described as formulas within the text. |
| Open Source Code | Yes | The simulation code is available at https://github.com/kclip/localized-adaptive-risk-control.git. |
| Open Datasets | Yes | Firstly, we address the task of electricity demand forecasting, utilizing data from the Elec2 dataset [Harries et al., 1999]. Next, we present an experiment focusing on tumor segmentation, where the data comprises i.i.d. samples drawn from various image datasets [Jha et al., 2020, Bernal et al., 2015, 2012, Silva et al., 2014, Vázquez et al., 2017]. We reserve 50 samples from each repository for testing the performance post-calibration, while the remaining T = 2098 samples are used for online calibration. In this section, we consider an image classification task under calibration requirements based on the fruit-360 dataset [Muresan and Oltean, 2018]. |
| Dataset Splits | No | The paper mentions training and testing data but does not explicitly describe validation splits or cross-validation setups. The term "validation" is used in the context of the prediction sets, not data splitting. |
| Hardware Specification | Yes | All the experiments are conducted on a consumer-grade Mac Mini with an M1 chip. |
| Software Dependencies | No | The paper mentions specific models like ResNet and PraNet but does not provide specific version numbers for software libraries or dependencies used for implementation or experimentation. |
| Experiment Setup | Yes | Unless stated otherwise, we instantiate L-ARC with the RBF kernel $k(x, x') = \kappa \exp(-\lVert x - x'\rVert^2/l)$ with $\kappa = 1$, length scale $l = 1$, and regularization parameter $\lambda = 10^{-4}$. Both ARC and L-ARC use the learning rate $\eta_t = t^{-1/2}$. L-ARC is instantiated with the RBF kernel $k(x, x') = \kappa \exp(-\lVert \phi(x) - \phi(x')\rVert^2/l)$, where $\phi(x)$ is a 7-dimensional feature vector corresponding to the daily average electricity demand during the past 7 days. Both ARC and L-ARC are run using the same decaying learning rate $\eta_t = 0.1\,t^{-1/2}$. L-ARC is instantiated with the RBF kernel $k(x, x') = \kappa \exp(-\lVert \phi(x) - \phi(x')\rVert^2/l)$, where $\phi(x)$ is a 5-dimensional feature vector obtained via principal component analysis (PCA) from the last hidden layer of the ResNet model used in PraNet. |
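
To make the reported defaults concrete, below is a minimal sketch, assuming NumPy, of the RBF kernel $k(x, x') = \kappa \exp(-\lVert x - x'\rVert^2/l)$ and the decaying learning rate $\eta_t = t^{-1/2}$ quoted in the Experiment Setup row. Function and constant names are illustrative and not taken from the authors' repository; the feature map $\phi(x)$ and the L-ARC update rule itself are omitted.

```python
import numpy as np

# Reported defaults: kappa = 1, length scale l = 1, regularization lambda = 1e-4.
KAPPA = 1.0         # kernel scale kappa
LENGTH_SCALE = 1.0  # kernel length scale l
LAMBDA_REG = 1e-4   # regularization parameter lambda (unused in this sketch)


def rbf_kernel(x: np.ndarray, x_prime: np.ndarray) -> float:
    """RBF kernel k(x, x') = kappa * exp(-||x - x'||^2 / l)."""
    sq_dist = np.sum((x - x_prime) ** 2)
    return KAPPA * np.exp(-sq_dist / LENGTH_SCALE)


def learning_rate(t: int, scale: float = 1.0) -> float:
    """Decaying learning rate eta_t = scale * t^{-1/2}.

    scale = 1.0 matches the default setting; scale = 0.1 matches the
    electricity-demand experiment (eta_t = 0.1 * t^{-1/2}).
    """
    return scale / np.sqrt(t)


if __name__ == "__main__":
    # Example: 7-dimensional features, as in the electricity-demand feature map phi(x).
    x, x_prime = np.random.randn(7), np.random.randn(7)
    print("k(x, x') =", rbf_kernel(x, x_prime))
    print("eta_1, eta_100 =", learning_rate(1), learning_rate(100))
```

The kernel and learning-rate schedule above only reproduce the hyperparameter settings quoted from the paper; for the actual L-ARC calibration procedure, see the linked repository.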