A Reductions Approach to Fair Classification

Authors: Alekh Agarwal, Alina Beygelzimer, Miroslav Dudík, John Langford, Hanna Wallach

ICML 2018

| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | "4. Experimental Results: We now examine how our exponentiated-gradient reduction performs at the task of binary classification subject to either demographic parity or equalized odds. We provide an evaluation of our grid-search reduction in Appendix D. We compared our reduction with the score-based postprocessing algorithm of Hardt et al. (2016)... We used four data sets, randomly splitting each one into training examples (75%) and test examples (25%)." |
| Researcher Affiliation | Industry | "¹Microsoft Research, New York; ²Yahoo! Research, New York." |
| Pseudocode | Yes | "Algorithm 1: Exp. gradient reduction for fair classification" |
| Open Source Code | Yes | https://github.com/Microsoft/fairlearn (a usage sketch follows the table below) |
| Open Datasets | Yes | "The adult income data set (Lichman, 2013)... ProPublica's COMPAS recidivism data... Law School Admissions Council's National Longitudinal Bar Passage Study (Wightman, 1998)... The Dutch census data set (Dutch Central Bureau for Statistics, 2001)" |
| Dataset Splits | Yes | "We used four data sets, randomly splitting each one into training examples (75%) and test examples (25%)." |
| Hardware Specification | No | The paper does not explicitly describe the hardware (e.g., specific GPU/CPU models, memory) used for running its experiments. |
| Software Dependencies | No | The paper mentions scikit-learn as a base-classifier implementation but does not specify a version number for it or any other software dependency. |
| Experiment Setup | Yes | "We considered ε ∈ {0.001, ..., 0.1} and for each value ran Algorithm 1 with ĉ_k = ε across all k." |
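The Open Source Code, Dataset Splits, and Experiment Setup rows above can be tied together in code. The sketch below is a minimal, hedged illustration, not the authors' exact pipeline: it assumes the current fairlearn reductions API (`ExponentiatedGradient`, `DemographicParity`, and `fairlearn.metrics.demographic_parity_difference`), which may differ from the repository snapshot linked above, and it uses synthetic stand-in data in place of the four real datasets while mirroring the 75%/25% split and the sweep over the slack parameter ε.

```python
# Minimal sketch (assumed fairlearn >= 0.7 API, scikit-learn base learner):
# 75/25 train/test split, exponentiated-gradient reduction under a
# demographic-parity constraint, and a sweep over the slack parameter eps.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from fairlearn.reductions import ExponentiatedGradient, DemographicParity
from fairlearn.metrics import demographic_parity_difference

rng = np.random.default_rng(0)

# Synthetic stand-in for a dataset such as "adult": features X, binary labels y,
# and a binary sensitive attribute A (e.g., sex).
n = 2000
X = rng.normal(size=(n, 5))
A = rng.integers(0, 2, size=n)
y = (X[:, 0] + 0.5 * A + rng.normal(scale=0.5, size=n) > 0).astype(int)

# 75% / 25% train/test split, as in the paper.
X_tr, X_te, y_tr, y_te, A_tr, A_te = train_test_split(
    X, y, A, test_size=0.25, random_state=0
)

# Sweep the constraint slack, mirroring the grid over eps in {0.001, ..., 0.1}.
for eps in [0.001, 0.01, 0.1]:
    mitigator = ExponentiatedGradient(
        estimator=LogisticRegression(solver="liblinear"),
        constraints=DemographicParity(),
        eps=eps,  # allowed violation of the demographic-parity constraint
    )
    mitigator.fit(X_tr, y_tr, sensitive_features=A_tr)
    y_pred = mitigator.predict(X_te)

    error = np.mean(y_pred != y_te)
    dp_gap = demographic_parity_difference(y_te, y_pred, sensitive_features=A_te)
    print(f"eps={eps:<6} test error={error:.3f}  demographic-parity gap={dp_gap:.3f}")
```

Running this prints, for each ε, the test error and the demographic-parity gap of the learned randomized classifier; varying ε over a grid is how the paper traces out the trade-off between accuracy and constraint violation on each dataset.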