Understanding Instance-Level Impact of Fairness Constraints
Authors: Jialu Wang, Xin Eric Wang, Yang Liu
ICML 2022
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We demonstrate with extensive experiments that training on a subset of weighty data examples leads to lower fairness violations with a trade-off of accuracy. (Abstract) and In this section, we examine the influence score subject to parity constraints on three different application domains: tabular data, images and natural language. (Section 6) |
| Researcher Affiliation | Academia | Department of Computer Science and Engineering, University of California, Santa Cruz, CA, USA. |
| Pseudocode | No | The paper does not contain any structured pseudocode or algorithm blocks. |
| Open Source Code | Yes | We publish the source code at https://github.com/UCSC-REAL/FairInfl. |
| Open Datasets | Yes | Firstly, we work with multi-layer perceptron (MLP) trained on the Adult dataset (Dua & Graff, 2017). Next, we train a ResNet-18 network (He et al., 2015) on the CelebA face attribute dataset (Liu et al., 2015). Lastly, we consider Jigsaw Comment Toxicity Classification (Jigsaw, 2018) with text data. |
| Dataset Splits | No | The paper describes training and test splits for the datasets but does not explicitly mention a validation dataset split ratio or strategy. |
| Hardware Specification | Yes | For all the experiments, we use a GPU cluster with four NVIDIA RTX A6000 GPUs for training and evaluation. |
| Software Dependencies | No | The paper mentions optimizers (Adam) and pre-trained models (BERT) but does not provide specific version numbers for any software dependencies or libraries. |
| Experiment Setup | Yes | We used the Adam optimizer with a learning rate of 0.001 to train all the models. We used γ = 1 for models requiring the regularizer parameter of fairness constraints. (Section 6.1) and The MLP model is a two-layer ReLU network with hidden size 64. (Section 6.2) |
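To make the quoted setup concrete, below is a minimal sketch of the tabular-data model the paper describes: a two-layer ReLU network with hidden size 64 (Section 6.2), trained in the paper with Adam at learning rate 0.001 (Section 6.1). The input dimension, output dimension, initialization scale, and batch size here are illustrative assumptions, not values stated in the paper; only the forward pass is shown.

```python
import numpy as np

# Sketch of the paper's MLP: two-layer ReLU network with hidden size 64.
# INPUT_DIM and NUM_CLASSES are assumptions for illustration; the paper
# does not specify its feature encoding of the Adult dataset here.
rng = np.random.default_rng(0)

INPUT_DIM, HIDDEN_DIM, NUM_CLASSES = 108, 64, 2

# Randomly initialized parameters (illustrative scale, not from the paper).
W1 = rng.normal(scale=0.01, size=(INPUT_DIM, HIDDEN_DIM))
b1 = np.zeros(HIDDEN_DIM)
W2 = rng.normal(scale=0.01, size=(HIDDEN_DIM, NUM_CLASSES))
b2 = np.zeros(NUM_CLASSES)

def forward(x):
    """Forward pass: linear -> ReLU -> linear, returning class logits."""
    h = np.maximum(x @ W1 + b1, 0.0)  # hidden layer with ReLU activation
    return h @ W2 + b2                # output logits, one per class

batch = rng.normal(size=(32, INPUT_DIM))  # a dummy mini-batch of features
logits = forward(batch)
print(logits.shape)  # one logit vector per example
```

In the paper's actual experiments these parameters would be updated with Adam (learning rate 0.001); the optimizer loop is omitted here since the review only records the hyperparameters.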