Synthetic Disinformation Attacks on Automated Fact Verification Systems

Authors: Yibing Du, Antoine Bosselut, Christopher D. Manning
Pages: 10581-10589

AAAI 2022

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Our study across multiple models on three benchmarks demonstrates that these systems suffer significant performance drops against these attacks.
Researcher Affiliation | Academia | 1. Stanford University, 2. EPFL
Pseudocode | No | No pseudocode or algorithm blocks were found in the paper.
Open Source Code | Yes | Our code can be found at: https://github.com/Yibing-Du/adversarial-factcheck
Open Datasets | Yes | FEVER: The FEVER testbed (Thorne et al. 2018) is a dataset of 185,445 claims (145,449 train, 19,998 dev, 19,998 test) with corresponding evidence to validate them drawn from articles in Wikipedia.
Dataset Splits | Yes | FEVER: The FEVER testbed (Thorne et al. 2018) is a dataset of 185,445 claims (145,449 train, 19,998 dev, 19,998 test) with corresponding evidence to validate them drawn from articles in Wikipedia.
Hardware Specification | No | No specific hardware details such as GPU models, CPU types, or memory specifications used for running experiments were mentioned in the paper.
Software Dependencies | No | The paper mentions software components like BERT-based models, RoBERTa-based models, GROVER, and PEGASUS, but does not provide specific version numbers for these or other ancillary software libraries.
Experiment Setup | Yes | Our method, ADVERSARIAL ADDITION (ADVADD), uses GROVER to produce synthetic documents for a proposed claim, and makes these fake documents available to the fact verification system when retrieving evidence. As GROVER requires a proposed article title and publication venue (i.e., website link) as input to generate a fake article, we use each claim as a title and set the article venue to wikipedia.com. We generate 10 articles for each claim and split them into paragraphs (n.b., FEVER DB contains first paragraphs of Wikipedia articles and SCIFACT contains abstracts of scientific articles).
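The ADVADD setup described above can be sketched as a short pipeline: generate synthetic articles per claim (title = claim, venue = wikipedia.com), split them into paragraph-level units, and append those units to the evidence corpus the retriever searches. The sketch below is a minimal, hedged illustration; generate_grover_articles is a hypothetical stand-in for the real GROVER model, and the function names are not from the paper's code.

```python
# Minimal sketch of the ADVADD attack pipeline, assuming a stubbed generator.
# generate_grover_articles is a hypothetical placeholder for GROVER, which
# conditions generation on an article title and a publication venue.

def generate_grover_articles(title, venue, n=10):
    """Placeholder: the real system would invoke GROVER here with the
    claim as the title and venue set to wikipedia.com."""
    return [f"Synthetic article {i} generated for claim: {title}" for i in range(n)]

def adversarial_addition(claim, evidence_db, n_articles=10):
    """Generate fake articles for a claim and add their paragraphs to the
    retrieval corpus, mirroring the ADVADD procedure described above."""
    articles = generate_grover_articles(claim, venue="wikipedia.com", n=n_articles)
    for article in articles:
        # FEVER's DB stores first paragraphs and SCIFACT stores abstracts,
        # so synthetic articles are split into paragraph-level units.
        for paragraph in article.split("\n\n"):
            if paragraph.strip():
                evidence_db.append(paragraph)
    return evidence_db

db = []
adversarial_addition("The Eiffel Tower is located in Berlin.", db)
print(len(db))  # 10 synthetic paragraphs now pollute the corpus
```

In the actual attack the poisoned corpus is then handed to the unmodified fact verification system's retriever, so the fake paragraphs compete with genuine Wikipedia evidence at retrieval time.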