Analyzing Inverse Problems with Invertible Neural Networks

Authors: Lynton Ardizzone, Jakob Kruse, Carsten Rother, Ullrich Köthe

ICLR 2019

| Reproducibility Variable | Result | LLM Response |
| --- | --- | --- |
| Research Type | Experimental | "We prove theoretically and verify experimentally, on artificial data and real-world problems from medicine and astrophysics, that INNs are a powerful analysis tool to find multi-modalities in parameter space, uncover parameter correlations, and identify unrecoverable parameters." |
| Researcher Affiliation | Academia | Visual Learning Lab Heidelberg; German Cancer Research Center (DKFZ); Zentrum für Astronomie der Universität Heidelberg (ZAH) |
| Pseudocode | No | The paper describes its algorithms and architectures in text and equations but does not include any explicitly labeled or formatted pseudocode or algorithm blocks. |
| Open Source Code | No | The paper contains no statement releasing source code for the methodology and no link to a code repository. |
| Open Datasets | No | For the medical and astrophysics applications, training data was created by simulation ("we create training data by simulating observed spectra y from a tissue model x"; "We train an INN on simulated data"), and the artificial data is likewise generated (a Gaussian mixture model and an inverse kinematics problem; a toy generator in this spirit is sketched below the table). The paper provides no concrete access information (link, DOI, or formal citation) for a publicly available dataset. |
| Dataset Splits | No | The paper mentions "training data" and "test set observations" but specifies neither a separate validation set nor explicit train/validation/test percentages, which would be needed for reproducibility. |
| Hardware Specification | No | The paper mentions simulations "taking one week on a GPU" but does not specify the GPU model or any other hardware details. |
| Software Dependencies | No | The paper names software components such as the Adam optimizer and ReLU activation functions but gives no version numbers for any libraries or frameworks (e.g., Python, TensorFlow, or PyTorch). |
| Experiment Setup | Yes | INN with 3 invertible blocks; 3 fully connected layers per affine coefficient function, with ReLU activations in the intermediate layers; zero padding to a nominal dimension of 16; Adam optimizer; learning rate decaying from 10⁻³ to 10⁻⁵; batch size 200. A minimal code sketch of this configuration follows the table. |
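The artificial benchmarks cited in the Open Datasets row are generated rather than downloaded. Below is a minimal sketch of a Gaussian-mixture pair generator in that spirit; the mode count, ring radius, and noise scale are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def sample_gmm_pairs(n, n_modes=8, radius=2.0, sigma=0.1, seed=0):
    """Toy forward process: draw 2-D points x from a ring of Gaussian modes;
    the observation y is the (lossy) mode label. All constants are assumptions."""
    rng = np.random.default_rng(seed)
    labels = rng.integers(0, n_modes, size=n)                 # y: which mode generated x
    angles = 2.0 * np.pi * labels / n_modes
    centers = np.stack([radius * np.cos(angles),
                        radius * np.sin(angles)], axis=1)
    x = centers + sigma * rng.standard_normal((n, 2))         # x: hidden parameters
    return x.astype(np.float32), labels

x_train, y_train = sample_gmm_pairs(10_000)
```

Because y discards everything about x except the mode label, the inverse problem x given y is multi-modal within each mixture component, which is exactly the ambiguity the INN's posterior is meant to capture.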
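The Experiment Setup row is concrete enough to render in code. The following PyTorch sketch reflects that description but is not the authors' implementation: the subnet width, the single-sided affine coupling with fixed permutations between blocks, and the exponential learning-rate schedule over an assumed 100 epochs are all assumptions filled in for illustration (the paper's coupling blocks transform both halves of the split).

```python
import torch
import torch.nn as nn

DIM = 16  # nominal dimension after zero padding, as reported

def subnet(c_in, c_out, width=128):
    # 3 fully connected layers, ReLU in the intermediate layers (width is an assumption)
    return nn.Sequential(nn.Linear(c_in, width), nn.ReLU(),
                         nn.Linear(width, width), nn.ReLU(),
                         nn.Linear(width, c_out))

class AffineCoupling(nn.Module):
    """One invertible block: the second half of the vector is scaled and shifted
    by functions of the first half, so the inverse exists in closed form."""
    def __init__(self, dim):
        super().__init__()
        self.k = dim // 2
        self.s = subnet(self.k, dim - self.k)  # log-scale (affine coefficient) network
        self.t = subnet(self.k, dim - self.k)  # translation (affine coefficient) network

    def forward(self, x):
        x1, x2 = x[:, :self.k], x[:, self.k:]
        y2 = x2 * torch.exp(self.s(x1)) + self.t(x1)
        return torch.cat([x1, y2], dim=1)

    def inverse(self, y):
        y1, y2 = y[:, :self.k], y[:, self.k:]
        x2 = (y2 - self.t(y1)) * torch.exp(-self.s(y1))
        return torch.cat([y1, x2], dim=1)

class INN(nn.Module):
    def __init__(self, dim=DIM, n_blocks=3):
        super().__init__()
        self.blocks = nn.ModuleList(AffineCoupling(dim) for _ in range(n_blocks))
        # fixed random permutations between blocks so every lane gets transformed
        self.perms = [torch.randperm(dim) for _ in range(n_blocks)]

    def forward(self, x):
        for block, p in zip(self.blocks, self.perms):
            x = block(x)[:, p]
        return x

    def inverse(self, y):
        for block, p in zip(reversed(self.blocks), reversed(self.perms)):
            y = block.inverse(y[:, torch.argsort(p)])
        return y

model = INN()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
# decay 1e-3 -> 1e-5 over an assumed 100 epochs: gamma = (1e-5 / 1e-3) ** (1 / 100)
sched = torch.optim.lr_scheduler.ExponentialLR(opt, gamma=(1e-2) ** (1 / 100))

x = torch.randn(200, 2)                                   # batch size 200, toy 2-D inputs
x_pad = torch.cat([x, torch.zeros(200, DIM - 2)], dim=1)  # zero pad to nominal DIM=16
z = model(x_pad)
assert torch.allclose(model.inverse(z), x_pad, atol=1e-4)  # invertibility check
```

The closing assertion is the point of the architecture: because each block inverts analytically, the same weights define both the forward simulation surrogate and the backward posterior sampler.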