RMGN: A Regional Mask Guided Network for Parser-free Virtual Try-on

Authors: Chao Lin, Zhao Li, Sheng Zhou, Shichang Hu, Jialun Zhang, Linhao Luo, Jiarun Zhang, Longtao Huang, Yuan He

IJCAI 2022 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Extensive experiments demonstrate that our proposed RMGN outperforms both state-of-the-art PB and PF methods. Ablation studies further verify the effectiveness of modules in RMGN.
Researcher Affiliation | Collaboration | (1) Alibaba Group, (2) Zhejiang University, (3) Link2Do Technology Ltd, (4) Monash University, (5) University of California San Diego
Pseudocode | No | The paper describes its network architecture and processes textually and with diagrams, but it does not include any formal pseudocode or algorithm blocks.
Open Source Code | Yes | Code is available at https://github.com/jokerlc/RMGN-VITON.
Open Datasets | Yes | We conduct experiments on two public datasets: VITON [Han et al., 2018] and MPV [Dong et al., 2019], which are widely applied by recent researches in this field.
Dataset Splits | No | The paper mentions "training/testing pairs" for the datasets but does not explicitly provide details for a separate validation split.
Hardware Specification | No | The paper does not specify any particular hardware (e.g., GPU models, CPU types, or memory) used for running the experiments.
Software Dependencies | No | The paper mentions using VGG19 pre-trained on ImageNet, but it does not list specific software dependencies with version numbers (e.g., Python, PyTorch, TensorFlow, CUDA versions); see the illustrative sketch after this table.
Experiment Setup | No | The paper describes the loss functions and general architecture but does not provide specific hyperparameter values (e.g., learning rate, batch size, number of epochs) or detailed training configurations in the main text.
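
On the Software Dependencies point, the only named component is a VGG19 backbone pre-trained on ImageNet, with no framework or version numbers given. For reference, below is a minimal sketch of how such a backbone is commonly instantiated as a frozen feature extractor (e.g., for a perceptual/VGG-style loss). The choice of PyTorch/torchvision, the weights API, and the layer indices are assumptions made for illustration, not details taken from the paper.

    import torchvision.models as models

    # Assumption: torchvision >= 0.13 weights enum; older releases use pretrained=True instead.
    vgg19 = models.vgg19(weights=models.VGG19_Weights.IMAGENET1K_V1).features.eval()
    for p in vgg19.parameters():
        p.requires_grad_(False)  # frozen feature extractor, used only to compare activations

    def vgg_features(x, layer_ids=(3, 8, 17, 26, 35)):
        """Collect activations at a few ReLU layers (illustrative indices, not from the paper)."""
        feats = []
        for i, layer in enumerate(vgg19):
            x = layer(x)
            if i in layer_ids:
                feats.append(x)
        return feats

Pinning the exact framework and CUDA versions alongside a snippet like this is what the reproducibility criterion asks for; the released repository at https://github.com/jokerlc/RMGN-VITON is the authoritative place to check the actual dependencies used.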