SEIT: Structural Enhancement for Unsupervised Image Translation in Frequency Domain
Authors: Zhifeng Zhu, Yaochen Li, Yifan Li, Jinhuo Yang, Peijun Chen, Yuehu Liu
AAAI 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | The extensive experimental results demonstrate the effectiveness of the proposed method. |
| Researcher Affiliation | Academia | 1School of Software Engineering, Xi'an Jiaotong University; 2Institute of Artificial Intelligence and Robotics, Xi'an Jiaotong University. z1965761380@stu.xjtu.edu.cn, yaochenli@mail.xjtu.edu.cn, {3121358033, jinhuo, 3123358029}@stu.xjtu.edu.cn, liuyh@mail.xjtu.edu.cn |
| Pseudocode | No | The paper does not contain structured pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not provide an explicit statement or link for open-source code availability for the described methodology. |
| Open Datasets | Yes | The datasets we use include SYNTHIA (Ros et al. 2016), GTA5 (Richter et al. 2016), Cityscapes (Cordts et al. 2015), and BDD (Yu et al. 2020). |
| Dataset Splits | No | The paper does not provide specific train/validation/test dataset splits, percentages, or explicit methodology for data partitioning. |
| Hardware Specification | Yes | All experiments are conducted on a single RTX 3090 GPU. |
| Software Dependencies | No | The paper mentions software components like 'Adam optimizer' and 'VGG network' but does not provide specific version numbers for any programming languages, libraries, or frameworks used for implementation. |
| Experiment Setup | Yes | The batch size is set to 1. We use the Adam optimizer with β1 = 0.5 and β2 = 0.999. The initial learning rate is set to 0.0002 with a step decay strategy: the learning rate is halved every 5 epochs. The model is trained for 100 epochs. Following previous work, the loss weights in equation 14 are set to 1.0, 2.0, and 1.0, respectively. |
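The step-decay schedule quoted in the Experiment Setup row can be sketched as a small helper. This is a minimal illustration, not the authors' code: the paper does not release an implementation, so the function name and structure here are assumptions; only the numeric values (initial rate 0.0002, halving every 5 epochs) come from the reported setup.

```python
def decayed_lr(epoch, base_lr=0.0002, step_size=5, gamma=0.5):
    """Step-decay learning rate as described in the paper's setup:
    the rate is multiplied by `gamma` (0.5, i.e. halved) once every
    `step_size` (5) epochs, starting from `base_lr` (0.0002)."""
    return base_lr * gamma ** (epoch // step_size)


# Rates over the reported 100-epoch training run:
schedule = [decayed_lr(e) for e in range(100)]
print(schedule[0])   # epoch 0:  0.0002
print(schedule[5])   # epoch 5:  0.0001
print(schedule[10])  # epoch 10: 0.00005
```

In a PyTorch implementation (the framework is not stated in the paper, so this is an assumption), the same behavior would correspond to `torch.optim.lr_scheduler.StepLR(optimizer, step_size=5, gamma=0.5)` wrapped around `torch.optim.Adam(params, lr=0.0002, betas=(0.5, 0.999))`.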