Fair Classification with Adversarial Perturbations
Authors: L. Elisa Celis, Anay Mehrotra, Nisheeth Vishnoi
NeurIPS 2021
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Empirically, we evaluate the classifiers produced by our framework for statistical rate on real-world and synthetic datasets for a family of adversaries. |
| Researcher Affiliation | Academia | L. Elisa Celis Yale University Anay Mehrotra Yale University Nisheeth K. Vishnoi Yale University |
| Pseudocode | No | The paper presents mathematical programs (Err-Tolerant and Err-Tolerant+) but does not include any structured pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not provide an explicit statement or link for the open-source code of the described methodology. |
| Open Datasets | Yes | We implement our framework for logistic loss function with linear classifiers and evaluate its performance on COMPAS [3], Adult [23], and a synthetic dataset (Section 5). |
| Dataset Splits | Yes | We use a randomly generated 70-30 train (S) test (T) split of the data, and generate the perturbed data Ŝ from S for a (known) perturbation rate η. |
| Hardware Specification | No | The paper does not provide specific hardware details (e.g., CPU/GPU models, memory specifications) used for running experiments. |
| Software Dependencies | Yes | In our simulations, we use the standard solver SLSQP in SciPy [57] to heuristically find f^ET; see Supplementary Material E.1. |
| Experiment Setup | Yes | We implement our framework for the logistic loss function with linear classifiers and evaluate its performance on real-world and synthetic data... We use a randomly generated 70-30 train (S) test (T) split of the data, and generate the perturbed data Ŝ from S for a (known) perturbation rate η... We take gender (coded as binary) as the protected attribute, and set the fairness constraint on the statistical rate to be τ = 0.9 for Err-Tol and all baselines. |
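The setup described above (linear logistic classifier, 70-30 split, statistical-rate constraint τ = 0.9, solved heuristically with SciPy's SLSQP) can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the synthetic data, the smooth surrogate used for the statistical-rate constraint, and all variable names are assumptions made for the example.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Hypothetical synthetic data: features X, binary labels y,
# binary protected attribute z (standing in for gender in the paper).
n = 400
z = rng.integers(0, 2, size=n)
X = rng.normal(size=(n, 2)) + 0.5 * z[:, None]
y = (X[:, 0] + 0.3 * rng.normal(size=n) > 0).astype(int)

# Randomly generated 70-30 train/test split, as in the paper.
idx = rng.permutation(n)
tr, te = idx[: int(0.7 * n)], idx[int(0.7 * n):]

def logistic_loss(w, X, y):
    # Linear classifier with bias term; numerically stable logistic loss.
    m = X @ w[:-1] + w[-1]
    return np.mean(np.logaddexp(0.0, -(2 * y - 1) * m))

def statistical_rate(w, X, z):
    # Ratio of positive-prediction rates between protected groups (in [0, 1]).
    pred = (X @ w[:-1] + w[-1] > 0).astype(float)
    r = [pred[z == g].mean() for g in (0, 1)]
    return min(r) / max(r) if max(r) > 0 else 0.0

def soft_rate(w, X, z):
    # Smooth sigmoid surrogate of the statistical rate, so SLSQP's
    # finite-difference gradients are informative (an assumption of
    # this sketch; the paper's heuristic may differ).
    p = 1.0 / (1.0 + np.exp(-(X @ w[:-1] + w[-1])))
    r = [p[z == g].mean() for g in (0, 1)]
    return min(r) / max(r)

tau = 0.9  # fairness threshold on the statistical rate
cons = {"type": "ineq",
        "fun": lambda w: soft_rate(w, X[tr], z[tr]) - tau}

res = minimize(logistic_loss, x0=np.zeros(3), args=(X[tr], y[tr]),
               method="SLSQP", constraints=cons)

print("train loss:", logistic_loss(res.x, X[tr], y[tr]))
print("test statistical rate:", statistical_rate(res.x, X[te], z[te]))
```

Starting from w = 0 keeps the initial point feasible (all predicted probabilities are 0.5, so the group rate ratio is 1 ≥ τ), which helps SLSQP; the solver is still only a heuristic here, matching the paper's own framing.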