Mixture Manifold Networks: A Computationally Efficient Baseline for Inverse Modeling

Authors: Gregory P. Spell, Simiao Ren, Leslie M. Collins, Jordan M. Malof

AAAI 2023

Each entry below gives a reproducibility variable, its assessed result, and the LLM's supporting response.
Research Type: Experimental. "We demonstrate the advantages of our method by comparing to several baselines on four benchmark inverse problems, and we furthermore provide analysis to motivate its design. We consider four benchmark tasks to demonstrate our MMN inverse modeling method; these are summarized in Table 1. Our main experimental results are presented in Table 2, which shows the re-simulation error for MMN and all baseline methods for the case of T = 1 solution proposals by each model."
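To make the quoted evaluation metric concrete, here is a minimal sketch of best-over-T re-simulation error, assuming NumPy and a callable surrogate simulator; the function and argument names are hypothetical, not taken from the released code. With T = 1 it reduces to the plain MSE between the re-simulated proposal and the target.

```python
import numpy as np

def resimulation_error(forward_model, proposals, y_target):
    """Best-over-T re-simulation error for one target response y.

    forward_model: callable mapping a design x to its simulated response y.
    proposals:     array of shape (T, dim_x), the T candidate solutions.
    y_target:      array of shape (dim_y,), the desired response.
    """
    # Re-simulate each proposed design and score it against the target.
    errors = [np.mean((forward_model(x) - y_target) ** 2) for x in proposals]
    # With T = 1 this is just the MSE of the single proposal.
    return min(errors)
```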
Researcher Affiliation: Academia. "Duke University, Department of Electrical & Computer Engineering, Durham, NC, USA; University of Montana, Department of Computer Science, Missoula, MT, USA"
Pseudocode: No. The paper describes the methods in narrative text and uses figures to illustrate the architecture, but it does not contain a clearly labeled pseudocode or algorithm block.
Open Source Code: Yes. "See: https://github.com/gspell/Mixture-Manifold-Networks"
Open Datasets: Yes. "We consider four benchmark tasks to demonstrate our MMN inverse modeling method; these are summarized in Table 1. For each inverse task, we use the same experimental designs as from previous benchmark studies, including the simulator (e.g., forward model) parameters, simulator sampling procedures, and training/testing splits. Details not included in Table 1 can be found in the studies cited above and in our supplemental material."
Dataset Splits: Yes. "Table 1: Summary of our inverse problem datasets. Note we cover both cases that Dim_x > Dim_y and Dim_y > Dim_x." The quoted split sizes, one value per benchmark:
Num. Train: 8000, 8000, 8000, 40,000
Num. Val.: 2000, 2000, 2000, 10,000
Num. Test: 1000, 1000, 1000, 500
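For reference, those split sizes can be captured in a small configuration mapping. Since this extraction does not preserve Table 1's column headers, the benchmark keys below are generic placeholders rather than the paper's column order.

```python
# Split sizes quoted from Table 1, one entry per benchmark column.
# NOTE: the column-to-benchmark mapping is not recoverable here, so the
# keys are hypothetical placeholders.
SPLITS = {
    "benchmark_1": {"train": 8_000, "val": 2_000, "test": 1_000},
    "benchmark_2": {"train": 8_000, "val": 2_000, "test": 1_000},
    "benchmark_3": {"train": 8_000, "val": 2_000, "test": 1_000},
    "benchmark_4": {"train": 40_000, "val": 10_000, "test": 500},
}
```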
Hardware Specification: No. The paper does not mention any specific hardware (e.g., GPU models, CPU types, or cloud instances) used for running the experiments.
Software Dependencies: No. The paper notes that implementation details are in the supplementary material, but it does not list specific software dependencies with version numbers in the main text.
Experiment Setup: Yes. "Model architectures and hyperparameters were selected to align with previous studies, and full implementation details can be found in the supplementary material. For MMN experimentation, we use the same forward and backward model architectures as used for the Tandem model for each benchmark problem. We use K = 6 manifolds (backward models) for each MMN. Each of the K backward models was trained using only augmented data generated using the forward model sampling procedure described in the Methods section. For the Sine Wave, Robotic Arm, and Meta-Material benchmarks, we sampled 40,000 points with which to train, and for the Shell benchmark, we sampled 250,000."
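A minimal sketch of the quoted training recipe, assuming a PyTorch-style setup: each of the K = 6 backward models is fit only on augmented pairs produced by pushing sampled designs through the pretrained forward model. The sampler, factory, and Tandem-style re-simulation objective below are illustrative assumptions, not the authors' released implementation.

```python
import torch
from torch import nn

K = 6  # number of manifolds (backward models) per MMN

def train_mmn_backward_models(forward_model, sample_designs, make_backward,
                              num_samples=40_000, epochs=100, lr=1e-3):
    """Train K backward models on forward-model-generated (augmented) data.

    forward_model:  pretrained surrogate simulator f: x -> y (kept fixed).
    sample_designs: callable returning a (num_samples, dim_x) tensor of
                    designs drawn with the paper's sampling procedure
                    (assumed here).
    make_backward:  factory returning a fresh backward network g: y -> x.
    """
    forward_model.eval()
    backward_models = []
    for _ in range(K):
        # Generate augmented training data: sample designs, re-simulate them.
        x = sample_designs(num_samples)
        with torch.no_grad():
            y = forward_model(x)
        g = make_backward()
        opt = torch.optim.Adam(g.parameters(), lr=lr)
        loss_fn = nn.MSELoss()
        for _ in range(epochs):
            opt.zero_grad()
            # Tandem-style objective: the re-simulated proposal should
            # reproduce the target response y.
            loss = loss_fn(forward_model(g(y)), y)
            loss.backward()
            opt.step()
        backward_models.append(g)
    return backward_models
```

Because the optimizer holds only the backward model's parameters, gradients flow through the fixed forward model without updating it, which is what lets the backward model be trained purely on forward-model-generated data.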