Rashomon Capacity: A Metric for Predictive Multiplicity in Classification
Authors: Hsiang Hsu, Flavio Calmon
NeurIPS 2022
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Our numerical experiments illustrate how Rashomon Capacity captures predictive multiplicity in various datasets and learning models, including neural networks. |
| Researcher Affiliation | Academia | Hsiang Hsu and Flavio P. Calmon, John A. Paulson School of Engineering and Applied Sciences, Harvard University; hsianghsu@g.harvard.edu, flavio@seas.harvard.edu |
| Pseudocode | Yes | We describe next a method (described in detail in Algorithm SM.2) based on weight perturbation that obtains c models in the Rashomon subset for each sample. |
| Open Source Code | Yes | Code to reproduce our experiments is available at https://github.com/HsiangHsu/rashomon-capacity. |
| Open Datasets | Yes | We illustrate how to measure, report, and potentially resolve predictive multiplicity of probabilistic classifiers using Rashomon Capacity on UCI Adult [25], COMPAS [26], HSLS [27], and CIFAR-10 datasets [8]. |
| Dataset Splits | No | The paper mentions 'Each point is generated with 5 repeated splits of the dataset' and refers to 'training details' in the Supplementary Materials, but does not provide specific percentages or absolute counts for train/validation/test splits in the main text. |
| Hardware Specification | No | The paper states 'Did you include the total amount of compute and the type of resources used (e.g., type of GPUs, internal cluster, or cloud provider)? [Yes] Provided in the SM.' However, specific hardware details like GPU/CPU models are not provided in the main text. |
| Software Dependencies | No | The paper states 'Did you specify all the training details (e.g., data splits, hyperparameters, how they were chosen)? [Yes] All provided in the SM.' However, specific software dependencies with version numbers are not provided in the main text. |
| Experiment Setup | No | The paper states 'For more information on the datasets, neural network architectures, and training details, see Section SM.3.2.' It defers the detailed experimental setup, including hyperparameters, to the Supplementary Materials and does not provide them in the main text. |
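For context on the metric being reproduced: as we understand the paper, the Rashomon Capacity of a sample is 2^C, where C is the channel capacity (in bits) of the channel whose rows are the class-probability vectors that the Rashomon-set models assign to that sample, computed via the Blahut-Arimoto algorithm. Below is a minimal, hedged sketch of that computation; the function name, iteration count, and tolerance are our own choices, not taken from the paper or its released code.

```python
import numpy as np

def rashomon_capacity(W, n_iter=500, tol=1e-12):
    """Sketch of Rashomon Capacity for one sample.

    W is an (m models x k classes) matrix; each row is a model's
    predicted class distribution for the sample. Returns 2**C, where
    C is the Blahut-Arimoto estimate of channel capacity in bits,
    so the value ranges from 1 (no multiplicity) to k.
    """
    W = np.asarray(W, dtype=float)
    m, _ = W.shape
    p = np.full(m, 1.0 / m)  # prior over models, updated iteratively
    for _ in range(n_iter):
        q = np.maximum(p @ W, 1e-300)  # output marginal over classes
        # Per-model KL divergence D(W_i || q) in bits; 0*log(0) -> 0.
        d = np.sum(np.where(W > 0, W * np.log2(np.maximum(W, 1e-300) / q), 0.0),
                   axis=1)
        new_p = p * np.exp2(d)
        new_p /= new_p.sum()
        if np.max(np.abs(new_p - p)) < tol:
            p = new_p
            break
        p = new_p
    q = np.maximum(p @ W, 1e-300)
    C = np.sum(p * np.sum(np.where(W > 0,
                                   W * np.log2(np.maximum(W, 1e-300) / q),
                                   0.0), axis=1))
    return 2.0 ** C
```

Two sanity checks: models that agree exactly give capacity 1, while two models that assign the sample to different classes with certainty give capacity 2 (the maximum for binary classification).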