ADEPT: A DEbiasing PrompT Framework
Authors: Ke Yang, Charles Yu, Yi R. Fung, Manling Li, Heng Ji
AAAI 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We evaluate ADEPT on several widely acknowledged debiasing benchmarks and downstream tasks, and find that it achieves competitive results while maintaining (and in some cases even improving) the PLM's representation ability. |
| Researcher Affiliation | Academia | Ke Yang1, Charles Yu2, Yi R. Fung2, Manling Li2, Heng Ji2 1Tsinghua University 2University of Illinois Urbana-Champaign |
| Pseudocode | Yes | Algorithm 1: ADEPT: a debiasing algorithm for contextualized word embeddings. |
| Open Source Code | Yes | The code and data are publicly available at https://github.com/EmpathYang/ADEPT. |
| Open Datasets | Yes | For the sentences associated with the word tuples, we draw sentences from News-Commentary v15 (Tiedemann 2012) for the gender setting and sentences from BookCorpus (Zhu et al. 2015) and News-Commentary v15 (Tiedemann 2012) for the religions setting. Since the original BookCorpus is no longer available, we use (lewtun et al. 2022), which is an open-source replica. |
| Dataset Splits | No | The paper explicitly mentions test examples for evaluation but does not provide specific details on a validation dataset split used during the training or hyperparameter tuning of ADEPT itself. |
| Hardware Specification | Yes | All the experiments are conducted on two GeForce RTX 3090 GPUs and in a Linux operating system. |
| Software Dependencies | No | The paper mentions using BERT-LARGE-UNCASED from Hugging Face and the Adam optimizer but does not provide specific version numbers for these or other software dependencies. |
| Experiment Setup | Yes | We set λ in Equation 1 to be 7 and ρ in Equation 4 to be 15. We use Adam (Kingma and Ba 2014) to optimize the objective function. During the debiasing process, our learning rate is 5e-5 and our batch size is 32. |
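The Experiment Setup row gives enough detail to reconstruct the reported training configuration. The sketch below collects those values in one place; the dict keys, the helper function, and the `bert-large-uncased` model name are illustrative conventions, not part of the paper's released code.

```python
# Hyperparameters reported in the paper's Experiment Setup.
# Structure and naming here are assumptions for illustration only.
ADEPT_CONFIG = {
    "lambda": 7,          # lambda in Equation 1 (per the paper)
    "rho": 15,            # rho in Equation 4 (per the paper)
    "optimizer": "Adam",  # Adam (Kingma and Ba 2014)
    "learning_rate": 5e-5,
    "batch_size": 32,
    "base_model": "bert-large-uncased",  # assumed Hugging Face model id
}


def optimizer_kwargs(config):
    """Map the reported settings onto keyword arguments for a typical
    Adam constructor (e.g. torch.optim.Adam). Illustrative only."""
    return {"lr": config["learning_rate"]}
```

A reproduction attempt would pass `optimizer_kwargs(ADEPT_CONFIG)` to the chosen Adam implementation; note the paper does not state Adam's beta or epsilon values, so library defaults would have to be assumed.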