Kernel of CycleGAN as a principal homogeneous space

Authors: Nikita Moriakov, Jonas Adler, Jonas Teuwen

ICLR 2020

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "We show theoretically that the exact solution space is invariant with respect to automorphisms of the underlying probability spaces, and, furthermore, that the group of automorphisms acts freely and transitively on the space of exact solutions. [...] We proceed in section 3 by showing that unexpected symmetries can be learned by a Cycle GAN. In particular, when translating the same domain to itself Cycle GAN can learn a nontrivial automorphism of the domain. [...] 3 NUMERICAL RESULTS [...] We present some additional results on the BRATS2015 dataset."
Researcher Affiliation | Collaboration | Nikita Moriakov, Radiology, Nuclear Medicine and Anatomy, Radboud University Medical Center (nikita.moriakov@radboudumc.nl); Jonas Adler, Department of Mathematics, KTH Royal Institute of Technology, and Research and Physics, Elekta (jonasadl@kth.se); Jonas Teuwen, Radiology, Nuclear Medicine and Anatomy, Radboud University Medical Center, and Department of Radiation Oncology, Netherlands Cancer Institute (jonas.teuwen@radboudumc.nl)
Pseudocode | No | No pseudocode or algorithm blocks were found in the paper.
Open Source Code | No | The paper does not explicitly state that source code for the described methodology is publicly available, nor does it provide a link to a repository.
Open Datasets | Yes | "The toy experiment which we perform is translation of MNIST dataset to itself. We provide additional experiments on BRATS2015 dataset in appendix B"
Dataset Splits | No | The paper does not explicitly provide specific train/validation/test dataset splits (e.g., percentages or sample counts), nor does it reference predefined splits with citations.
Hardware Specification | No | No specific hardware details (e.g., GPU/CPU models, memory) used for running experiments were mentioned in the paper.
Software Dependencies | No | The paper mentions optimizers like SGD and Adam, but does not provide specific software library names with version numbers (e.g., 'PyTorch 1.9' or 'TensorFlow 2.0') required for replication.
Experiment Setup | Yes | "For this experiment Unet-based generators with residual connections were used. The number of downsampling layers was 4 for both generators, and skip connections were preserved. We trained all models for 20 epochs with Adam optimizer and learning rate 0.0002. We trained 4 models with αid ∈ {0.0, 10.0, 20.0, 40.0}. No data augmentation was used so as to avoid creating any additional symmetries. All images were normalized by dividing by the 95th percentile, as is common in medical imaging when working with MR data."
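The quoted setup normalizes each image by dividing by its 95th-percentile intensity, a standard preprocessing step for MR data. A minimal sketch of that step, assuming NumPy and a hypothetical helper name `normalize_mr` (not from the paper):

```python
import numpy as np

def normalize_mr(volume: np.ndarray, percentile: float = 95.0) -> np.ndarray:
    """Scale an MR image or volume by its given intensity percentile.

    Dividing by the 95th percentile rather than the maximum keeps the
    scaling robust to a handful of extreme-intensity voxels, which is
    why it is a common choice when working with MR data.
    """
    scale = np.percentile(volume, percentile)
    return volume / scale

# Synthetic example: intensities 0..100, whose 95th percentile is exactly 95.0,
# so the brightest voxel maps to 100/95 after normalization.
vol = np.arange(101, dtype=np.float64)
normed = normalize_mr(vol)
print(normed.max())
```

After this scaling, most voxel intensities fall in [0, 1], with only the top 5% exceeding 1, which keeps the generator inputs in a consistent range across subjects.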