Patch-Wise Graph Contrastive Learning for Image Translation

Authors: Chanyong Jung, Gihyun Kwon, Jong Chul Ye

AAAI 2024

Reproducibility Variable Result LLM Response
Research Type Experimental Experimental results demonstrate state-of-the-art results for image translation thanks to the semantic encoding by the constructed graphs. Experimental results on five different datasets demonstrate state-of-the-art performance by producing semantically meaningful graphs.
Researcher Affiliation Academia Chanyong Jung 1, Gihyun Kwon 1, Jong Chul Ye 1,2. 1 Department of Bio and Brain Engineering, KAIST, Daejeon, Republic of Korea; 2 Kim Jaechul Graduate School of AI, KAIST, Daejeon, Republic of Korea
Pseudocode No None found. The paper describes its methods but does not include any structured pseudocode or algorithm blocks.
Open Source Code No None found. The paper does not provide any statement or link regarding the public release of source code for the methodology.
Open Datasets No We verify our method using five datasets as follows: horse → zebra, label → Cityscape, map → satellite, summer → winter, and apple → orange. All images are resized to 256 × 256 for training and testing.
Dataset Splits No All images are resized to 256 × 256 for training and testing.
Hardware Specification No None found. The paper does not provide specific hardware details (e.g., GPU/CPU models, memory) used for running experiments.
Software Dependencies No None found. The paper does not provide specific software dependencies with version numbers (e.g., Python, PyTorch, TensorFlow versions, or specific library versions).
Experiment Setup Yes All images are resized to 256 × 256 for training and testing. For the graph construction, we randomly sampled 256 different patches from the pre-trained VGG16 (Simonyan and Zisserman 2014) network for both the input and output images. We extract dense features from three different layers (relu3-1, relu4-1, and relu4-3) of the network. For the graph operation, we set the number of GNN hops to 2 and the number of pooling steps to 1. For graph pooling, we downsample the nodes by 1/4; in other words, we have 256 nodes in the initial graph and 64 nodes in the pooled graph.
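The setup above (256 sampled patch features, 2 GNN hops, one pooling step that keeps 1/4 of the nodes) can be sketched numerically. This is a minimal illustration, not the paper's implementation: random vectors stand in for the VGG16 patch features, the adjacency is a simple softmax over cosine similarities, the two "hops" are plain neighbourhood averaging without learned weights, and the pooling keeps the nodes with the largest feature norm (the paper's actual GNN layers, similarity function, and pooling criterion may differ).

```python
import numpy as np

rng = np.random.default_rng(0)

# 256 patch embeddings standing in for sampled VGG16 features
# (the feature dimension 64 is arbitrary for this sketch)
n_patches, dim = 256, 64
feats = rng.standard_normal((n_patches, dim))

# Dense adjacency from patch cosine similarity, row-normalised via exp
unit = feats / np.linalg.norm(feats, axis=1, keepdims=True)
sim = unit @ unit.T
adj = np.exp(sim)
adj /= adj.sum(axis=1, keepdims=True)

# Two rounds of neighbourhood averaging, mimicking "GNN hops = 2"
h = adj @ (adj @ feats)

# One pooling step: keep the 1/4 of nodes with the largest feature norm
k = n_patches // 4                                  # 256 -> 64 nodes
keep = np.argsort(-np.linalg.norm(h, axis=1))[:k]
pooled = h[keep]

print(pooled.shape)  # (64, 64)
```

The node counts match the reported configuration: 256 nodes in the initial graph and 64 nodes after the single pooling step.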