Aligning Model Properties via Conformal Risk Control
Authors: William Overman, Jacqueline Vallon, Mohsen Bayati
NeurIPS 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | 'We exhibit applications of our methodology on a collection of supervised learning datasets for (shape-constrained) properties such as monotonicity and concavity.'; 'We demonstrate our methodology on real-world datasets for the properties of monotonicity and concavity.'; and 'Results. Table 4.1 presents results on the test set for the Combined Cycle Power Plant dataset (Tüfekci and Kaya, 2014).' |
| Researcher Affiliation | Academia | 1 Stanford Graduate School of Business, 2 Management Science and Engineering; {wpo,bayati}@stanford.edu, jjvallon@alumni.stanford.edu |
| Pseudocode | Yes | 'Algorithm 1 POT T for property P of monotonically decreasing in dimension k' and 'Algorithm 2 POT T for property P of concavity in dimension k' (a generic illustration of a monotonicity operator appears in the first sketch below the table) |
| Open Source Code | No | The paper states in the NeurIPS checklist (Question 5: Open access to data and code): 'Justification: The paper provides links to the UCI ML repository for the datasets used, and the code used for experiments will be made available upon publication. Sufficient instructions are included for reproducing the experiments.' |
| Open Datasets | Yes | We align for monotonicity on various UCI ML repository datasets (Dua and Graff, 2023) with a 70-15-15 train-calibrate-test split, averaged over 30 random splits. |
| Dataset Splits | Yes | We align for monotonicity on various UCI ML repository datasets (Dua and Graff, 2023) with a 70-15-15 train-calibrate-test split, averaged over 30 random splits. |
| Hardware Specification | No | The NeurIPS checklist (Question 8) mentions 'The paper specifies the use of XGBoost models and mentions typical execution times and resources used for the experiments. See the extended version of the paper.' However, the main paper does not provide specific hardware details. |
| Software Dependencies | No | The paper mentions 'XGBoost regression models (Chen and Guestrin, 2016)' but does not specify any software versions for XGBoost or other dependencies. |
| Experiment Setup | Yes | 'Setup. We align for monotonicity on various UCI ML repository datasets (Dua and Graff, 2023) with a 70-15-15 train-calibrate-test split, averaged over 30 random splits. We use XGBoost regression models (Chen and Guestrin, 2016).' The table caption reads 'Table 1: Power Plant, n = 9568. Monotonically decreasing on Exhaust Vacuum. λmax = (10, 10).' The paper also specifies α values of 0.1, 0.05, and 0.01. (The second sketch below the table reconstructs this setup.) |
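
The quoted captions name the paper's algorithms but do not reproduce them. As a rough, hedged illustration of what a post-hoc operator for "monotonically decreasing in dimension k" can look like, the sketch below projects a fitted model's predictions onto a non-increasing sequence along one feature via a running minimum. This is a generic stand-in written for this summary, not the paper's POT operator; `enforce_monotone_decreasing` and the toy predictor are hypothetical names, and the projection treats feature k in isolation.

```python
import numpy as np

def enforce_monotone_decreasing(predict, X, k):
    """Generic post-hoc projection: make predictions non-increasing in feature k.

    Sorts the rows of X by feature k and replaces each prediction with the
    running minimum over all points with smaller-or-equal values of that
    feature. A simple stand-in, not the paper's POT operator, and it ignores
    interactions with the other features.
    """
    preds = np.asarray(predict(X), dtype=float)
    order = np.argsort(X[:, k])                     # ascending in feature k
    adjusted = np.minimum.accumulate(preds[order])  # non-increasing along the sort
    out = np.empty_like(preds)
    out[order] = adjusted                           # map back to the original row order
    return out

if __name__ == "__main__":
    # Toy predictor that is roughly decreasing in feature 0 but with violations.
    rng = np.random.default_rng(0)
    X = rng.uniform(size=(8, 2))
    noisy_predict = lambda X_: -X_[:, 0] + 0.3 * rng.normal(size=len(X_))
    print(enforce_monotone_decreasing(noisy_predict, X, k=0))
```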
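
To make the reported setup concrete, this second sketch reconstructs the experimental scaffolding described in the table: a 70-15-15 train-calibrate-test split repeated over random seeds, an XGBoost regressor (hyperparameters are not given in the quoted text, so defaults are used), and a generic conformal-risk-control selection of λ at levels α in {0.1, 0.05, 0.01}. The calibration rule shown is the standard one from the conformal risk control literature, not the paper's property-specific procedure; `property_violation_risk` is a placeholder loss, and the one-dimensional λ grid simplifies the paper's two-dimensional λmax = (10, 10).

```python
import numpy as np
import xgboost as xgb
from sklearn.model_selection import train_test_split

ALPHAS = (0.1, 0.05, 0.01)                 # risk levels quoted in the table
LAMBDA_GRID = np.linspace(0.0, 10.0, 101)  # 1-d simplification of lambda_max = (10, 10)

def split_70_15_15(X, y, seed):
    """70-15-15 train / calibrate / test split, as described in the paper."""
    X_tr, X_rest, y_tr, y_rest = train_test_split(X, y, test_size=0.30, random_state=seed)
    X_cal, X_te, y_cal, y_te = train_test_split(X_rest, y_rest, test_size=0.50, random_state=seed)
    return (X_tr, y_tr), (X_cal, y_cal), (X_te, y_te)

def property_violation_risk(model, X_cal, y_cal, lam):
    """Placeholder bounded risk in [0, 1], non-increasing in lam.

    The paper's risk measures violation of the target shape constraint after
    post-processing at relaxation level lam; this stand-in just thresholds
    absolute prediction error so the sketch runs end to end."""
    errs = np.abs(model.predict(X_cal) - np.asarray(y_cal))
    return float(np.mean(errs > lam))

def calibrate_lambda(model, X_cal, y_cal, alpha, bound=1.0):
    """Standard conformal risk control rule: smallest lam whose adjusted
    empirical risk (n/(n+1)) * R_hat(lam) + B/(n+1) is at most alpha."""
    n = len(y_cal)
    for lam in LAMBDA_GRID:
        risk = property_violation_risk(model, X_cal, y_cal, lam)
        if (n / (n + 1)) * risk + bound / (n + 1) <= alpha:
            return lam
    return LAMBDA_GRID[-1]

def run_one_split(X, y, seed):
    """Fit the base model on the training fold and calibrate lam for each alpha."""
    (X_tr, y_tr), (X_cal, y_cal), _ = split_70_15_15(X, y, seed)
    model = xgb.XGBRegressor(random_state=seed)
    model.fit(X_tr, y_tr)
    return {alpha: calibrate_lambda(model, X_cal, y_cal, alpha) for alpha in ALPHAS}

# The paper averages results over 30 random splits, e.g.:
# results = [run_one_split(X, y, seed) for seed in range(30)]
```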