Adversarial Text Generation via Feature-Mover's Distance

Authors: Liqun Chen, Shuyang Dai, Chenyang Tao, Haichao Zhang, Zhe Gan, Dinghan Shen, Yizhe Zhang, Guoyin Wang, Ruiyi Zhang, Lawrence Carin

NeurIPS 2018

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Extensive experiments are conducted on a variety of tasks to evaluate the proposed model empirically, including unconditional text generation, style transfer from non-parallel text, and unsupervised cipher cracking.
Researcher Affiliation | Collaboration | 1 Duke University, 2 Microsoft Dynamics 365 AI Research, 3 Microsoft Research, 4 Baidu Research
Pseudocode | Yes | Algorithm 1 (the IPOT algorithm [59]) and Algorithm 2 (adversarial text generation via FMD).
Open Source Code | No | "Our code will be released to encourage future research."
Open Datasets | Yes | CUB captions [57], MS COCO captions [38], and EMNLP2017 WMT News [24]. For the unsupervised decipher task, the authors adapt the idea of feature-mover's distance to the original CipherGAN framework and test the modified model on the Brown English text dataset [16], i.e., the Brown English-language corpus [30].
Dataset Splits | No | Table 1 lists 'Train' and 'Test' dataset sizes but does not provide details on validation splits or percentages.
Hardware Specification | No | The paper does not specify hardware details such as CPU or GPU models, or the memory used to run the experiments.
Software Dependencies | No | The paper does not provide specific software dependencies or version numbers for any libraries, frameworks, or programming languages used.
Experiment Setup | Yes | Algorithm 2 lists 'batch size n, learning rate η, maximum number of iterations N' as inputs. Additionally, for conditional tasks, 'λ is a hyperparameter that balances these two terms'.
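The Pseudocode row refers to Algorithm 1, the IPOT algorithm [59], which the paper uses to compute the feature-mover's distance between sets of sentence features. Since the authors' code is not released, the following is only a minimal sketch of the IPOT proximal-point iterations; the cosine cost, the choice of β, and the iteration counts are assumptions for illustration, not values from the paper.

```python
import numpy as np

def ipot_distance(x_feats, y_feats, beta=1.0, n_iter=50, inner=1):
    """Sketch of an IPOT-style approximation to a feature-mover's
    distance between two sets of feature vectors (rows).

    Not the authors' implementation; cost and hyperparameters are
    illustrative assumptions.
    """
    n, m = len(x_feats), len(y_feats)
    # Cost matrix: cosine distance between feature vectors (assumed choice).
    xn = x_feats / np.linalg.norm(x_feats, axis=1, keepdims=True)
    yn = y_feats / np.linalg.norm(y_feats, axis=1, keepdims=True)
    C = 1.0 - xn @ yn.T
    # Uniform marginals over the two feature sets.
    mu = np.full(n, 1.0 / n)
    nu = np.full(m, 1.0 / m)
    G = np.exp(-C / beta)            # proximal kernel
    T = np.ones((n, m)) / (n * m)    # initial transport plan
    b = np.ones(m)
    for _ in range(n_iter):
        Q = G * T                    # elementwise product
        for _ in range(inner):       # inner Sinkhorn-style scaling updates
            a = mu / (Q @ b)
            b = nu / (Q.T @ a)
        T = a[:, None] * Q * b[None, :]
    # Transport cost under the (approximately) optimal plan.
    return float((C * T).sum())
```

Comparing a feature set with itself should yield a near-zero distance, while two unrelated random feature sets should yield a larger one, which gives a quick sanity check of the iterations.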