StrokeGAN: Reducing Mode Collapse in Chinese Font Generation via Stroke Encoding

Authors: Jinshan Zeng, Qi Chen, Yunxin Liu, Mingwen Wang, Yuan Yao

AAAI 2021, pp. 3270-3277 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | The effectiveness of StrokeGAN is demonstrated by a series of generation tasks over nine datasets with different fonts. The numerical results demonstrate that StrokeGAN generally outperforms the state-of-the-art methods in terms of content and recognition accuracies, as well as certain stroke error, and also generates more realistic characters.
Researcher Affiliation | Academia | Jinshan Zeng (1), Qi Chen (1), Yunxin Liu (1), Mingwen Wang (1), Yuan Yao (2); (1) School of Computer and Information Engineering, Jiangxi Normal University, Nanchang, China; (2) Department of Mathematics, Hong Kong University of Science and Technology, Hong Kong
Pseudocode | No | The paper describes the model architecture and training process in text and mathematical formulas, but it does not contain any structured pseudocode or algorithm blocks. (A hedged sketch of the training objective, reconstructed from the hyperparameters reported below, follows the table.)
Open Source Code | Yes | Our codes are available in https://github.com/JinshanZeng/StrokeGAN.
Open Datasets | Yes | The first kind of dataset, related to handwritten Chinese characters, is built up from CASIA-HWDB1.1 (http://www.nlpr.ia.ac.cn/databases/handwriting/Home.html), which was collected by 300 people.
Dataset Splits | No | In our experiments, we used 90% and 10% of the samples respectively as the training and test sets. No explicit mention of a validation set split was found. (A minimal split sketch follows the table.)
Hardware Specification | Yes | All experiments were carried out in a PyTorch environment running Linux, with an AMD Ryzen 7 2700X eight-core (16-thread) CPU and a GeForce RTX 2080 GPU.
Software Dependencies | No | The paper states that experiments were carried out in a "Pytorch environment" running Linux, but it does not specify version numbers for PyTorch, Linux, or any other software dependencies.
Experiment Setup | Yes | In our experiments, we used the popular Adam algorithm (Kingma and Ba 2014) as the optimizer with the associated parameters (0.5, 0.999) in both the generator and discriminator optimization subproblems. The penalty parameters of the cycle consistency loss and stroke reconstruction loss were fine-tuned to 10 and 0.18, respectively.
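
Although the paper provides no pseudocode, the rows above pin down the loss composition (a cycle consistency loss weighted by 10 and a stroke reconstruction loss weighted by 0.18) and the optimizer (Adam with betas (0.5, 0.999)). The following PyTorch sketch reconstructs that objective under stated assumptions: the network names G, F_inv, and D, the non-saturating BCE adversarial term, the L1 cycle term, and the BCE form of the stroke reconstruction term are illustrative guesses rather than the paper's exact formulation, and the learning rate is not reported.

import torch
import torch.nn.functional as F

LAMBDA_CYC = 10.0      # cycle-consistency weight reported in the paper
LAMBDA_STROKE = 0.18   # stroke-reconstruction weight reported in the paper

def generator_loss(G, F_inv, D, x, stroke_code):
    """Generator-side objective in a CycleGAN-style setup with a stroke term.
    Assumed roles: G maps a source-font image x to the target font, F_inv
    maps back, and D returns (realism logit, predicted stroke-encoding logits)."""
    y_fake = G(x)
    validity, stroke_pred = D(y_fake)
    # Adversarial term (BCE form is an assumption).
    adv = F.binary_cross_entropy_with_logits(validity, torch.ones_like(validity))
    # Cycle-consistency term, weighted by 10 as reported.
    cyc = F.l1_loss(F_inv(y_fake), x)
    # Stroke-encoding reconstruction term, weighted by 0.18 as reported;
    # BCE over a multi-hot stroke code is an assumption.
    stroke = F.binary_cross_entropy_with_logits(stroke_pred, stroke_code)
    return adv + LAMBDA_CYC * cyc + LAMBDA_STROKE * stroke

# Optimizers as described: Adam with betas (0.5, 0.999) for both subproblems.
# The learning rate is not reported; 2e-4 is a placeholder, not a paper value.
# opt_G = torch.optim.Adam(G.parameters(), lr=2e-4, betas=(0.5, 0.999))
# opt_D = torch.optim.Adam(D.parameters(), lr=2e-4, betas=(0.5, 0.999))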
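
Since only a 90%/10% train/test split is reported, with no validation set, a minimal PyTorch sketch of such a split is shown below. The placeholder dataset, tensor shape, and fixed seed are assumptions for illustration, not values from the paper.

import torch
from torch.utils.data import TensorDataset, random_split

# Placeholder standing in for the character-image dataset.
dataset = TensorDataset(torch.randn(1000, 1, 64, 64))

n_train = int(0.9 * len(dataset))   # 90% for training, as reported
n_test = len(dataset) - n_train     # remaining 10% for testing; no validation set
train_set, test_set = random_split(
    dataset, [n_train, n_test],
    generator=torch.Generator().manual_seed(0),  # fixed seed is an assumption
)
print(len(train_set), len(test_set))  # 900 100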