Multiplicative Filter Networks

Authors: Rizal Fathony, Anit Kumar Sahu, Devin Willmott, J. Zico Kolter

ICLR 2021

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "We test MFNs on a broad range of representation tasks, showing that the relative simplicity of MFNs improves upon the performance of existing neural representation methods."
Researcher Affiliation | Collaboration | Rizal Fathony (Bosch Center for Artificial Intelligence, Pittsburgh, PA; rizal.fathony@us.bosch.com); Anit Kumar Sahu (Amazon Alexa AI, Seattle, WA; anit.sahu@gmail.com); Devin Willmott (Bosch Center for Artificial Intelligence, Pittsburgh, PA; devin.willmott@us.bosch.com); J. Zico Kolter (Bosch Center for Artificial Intelligence and Carnegie Mellon University, Pittsburgh, PA; zkolter@cs.cmu.edu)
Pseudocode | No | The paper defines its network architectures with mathematical equations but does not present them in a structured pseudocode or algorithm block.
Open Source Code | Yes | "A PyTorch implementation of MFN is available at https://github.com/boschresearch/multiplicative-filter-networks"
Open Datasets | Yes | "Our set of experiments draws from those presented in Sitzmann et al. (2020) alongside SIREN (image representation, shape representation, and differential equation experiments) and in Tancik et al. (2020) alongside Fourier feature networks with Gaussian random features, which we call FF Gaussian (image generalization and 3D inverse rendering experiments)."
Dataset Splits | No | "We train the networks using only 25% of the image pixels (every other pixel in the width and height dimensions) and evaluate using the complete images." This describes a train/test split, but the paper gives no explicit validation split and no split details for the other datasets used.
Hardware Specification | No | The paper does not report hardware details such as the GPU or CPU models used for the experiments.
Software Dependencies | No | "A PyTorch implementation of MFN is available at https://github.com/boschresearch/multiplicative-filter-networks, and full details on hyperparameters and training specifications are available in the appendix." No specific library versions are listed.
Experiment Setup | Yes | "full details on hyperparameters and training specifications are available in the appendix."
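Since the paper presents the architecture only as equations, a rough sketch may help convey the structure the review refers to: an MFN applies a filter to the input at every layer and multiplies it elementwise with a linear transform of the previous hidden state. The NumPy sketch below uses sinusoidal (Fourier) filters; all dimensions, initializations, and names here are illustrative assumptions, not the authors' reference implementation (which is the linked PyTorch repository).

```python
import numpy as np

rng = np.random.default_rng(0)

def fourier_filter(x, omega, phi):
    # Sinusoidal filter g(x; theta) = sin(x @ omega + phi), applied to the raw input.
    return np.sin(x @ omega + phi)

def mfn_forward(x, filters, weights, biases, out_w, out_b):
    """Multiplicative filter network forward pass:
    z_1 = g_1(x); z_{i+1} = (W_i z_i + b_i) * g_{i+1}(x); output = z_k @ W_out + b_out.
    Note every layer filters the input x directly, not the hidden state."""
    z = fourier_filter(x, *filters[0])
    for (omega, phi), W, b in zip(filters[1:], weights, biases):
        z = (z @ W + b) * fourier_filter(x, omega, phi)
    return z @ out_w + out_b

# Toy instantiation (hypothetical sizes): map 2-D coordinates to one scalar,
# as in image representation, where the network maps pixel coords to intensity.
d_in, d_hidden, d_out, n_layers = 2, 16, 1, 3
filters = [(rng.normal(size=(d_in, d_hidden)),
            rng.uniform(-np.pi, np.pi, d_hidden)) for _ in range(n_layers)]
weights = [rng.normal(size=(d_hidden, d_hidden)) / np.sqrt(d_hidden)
           for _ in range(n_layers - 1)]
biases = [np.zeros(d_hidden) for _ in range(n_layers - 1)]
out_w = rng.normal(size=(d_hidden, d_out)) / np.sqrt(d_hidden)
out_b = np.zeros(d_out)

coords = rng.uniform(-1, 1, size=(5, d_in))  # 5 query points
out = mfn_forward(coords, filters, weights, biases, out_w, out_b)
print(out.shape)  # (5, 1): one predicted value per query coordinate
```

Because each layer multiplies in a fresh sinusoid of the input, the output expands into a sum of sinusoids of the input coordinates, which is the "multiplicative filter" structure the equations in the paper describe.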