Adaptive wavelet distillation from neural networks through interpretations

Authors: Wooseok Ha, Chandan Singh, François Lanusse, Srigokul Upadhyayula, Bin Yu

NeurIPS 2021

Reproducibility Assessment (Variable: Result. LLM Response):
Research Type: Experimental. LLM Response: "In close collaboration with domain experts, we showcase how AWD addresses challenges in two real-world settings: cosmological parameter inference and molecular-partner prediction. In both cases, AWD yields a scientifically interpretable and concise model which gives predictive performance better than state-of-the-art neural networks. Moreover, AWD identifies predictive features that are scientifically meaningful in the context of respective domains. All code and models are released in a full-fledged package available on GitHub."
Researcher Affiliation: Academia. LLM Response: 1 Statistics Department, UC Berkeley; 2 EECS Department, UC Berkeley; 3 AIM, CEA, CNRS, Université Paris-Saclay, Université Paris Diderot, Sorbonne Paris Cité; 4 Advanced Bioimaging Center, Department of Molecular & Cell Biology, UC Berkeley.
Pseudocode: No. LLM Response: The paper does not contain any structured pseudocode or algorithm blocks.
Open Source Code: Yes. LLM Response: "All code and models are released in a full-fledged package available on GitHub." (github.com/Yu-Group/adaptive-wavelets)
Open Datasets: Yes. LLM Response: "We use a recently published dataset [50] which tags two molecules: clathrin light chain A, which is used as the predictor variable, and auxilin 1, the target variable. ... We train a DNN to predict m from 100,000 mass maps simulated with 10 different sets of cosmological parameter values at the universe origin from the MassiveNuS simulations [62] (full simulation details given in Appendix D)."
Dataset Splits: Yes. LLM Response: "The hyperparameters for AWD are selected by evaluating the predictive model's performance on a held-out validation set."
Hardware Specification: No. LLM Response: The paper does not provide specific details about the hardware used for the experiments beyond a general acknowledgment of "AWS computing credits".
Software Dependencies: No. LLM Response: The paper mentions the "PyWavelets package [44]" and the "PyTorch Wavelets [45, Chapter 3] package" but does not specify their version numbers, which are necessary for reproducibility.
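Since the missing version numbers are the reproducibility gap flagged here, a minimal sketch of how one might record the installed versions of the cited wavelet libraries follows. The package names (`pywt` for PyWavelets, `torch` for PyTorch) are assumptions about the import names the paper's code would use; this is an illustrative logging helper, not part of the paper's released package.

```python
import importlib

def package_versions(pkgs):
    """Return {package: version string, 'unknown', or 'not installed'}
    so the environment can be logged alongside experiment results."""
    out = {}
    for name in pkgs:
        try:
            mod = importlib.import_module(name)
            # Most scientific packages expose __version__; fall back gracefully.
            out[name] = getattr(mod, "__version__", "unknown")
        except ImportError:
            out[name] = "not installed"
    return out

# Assumed import names for the libraries the paper cites:
print(package_versions(["pywt", "torch"]))
```

Pinning these versions in a requirements file (or printing them as above at the start of each run) would close the gap noted in this row.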
Experiment Setup: Yes. LLM Response: "Fig 2 shows the best learned wavelet (for one particular run) extracted by AWD corresponding to the setting of hyperparameters λ = 0.005 and γ = 0.043. ... The hyperparameters for AWD are selected by evaluating the predictive model's performance on a held-out validation set."
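The selection procedure quoted above (choosing (λ, γ) by held-out validation performance) can be sketched as a simple grid search. The grids and the toy loss function below are illustrative assumptions, not values from the paper; in AWD the score would come from evaluating the distilled wavelet model on the validation set.

```python
def select_hyperparams(score_on_validation, lambdas, gammas):
    """Pick the (lambda, gamma) pair minimizing a validation loss.
    score_on_validation(lam, gam) -> float (lower is better)."""
    best = None
    for lam in lambdas:
        for gam in gammas:
            loss = score_on_validation(lam, gam)
            if best is None or loss < best[0]:
                best = (loss, lam, gam)
    return best[1], best[2]

# Toy stand-in for the validation score, minimized at the values
# the paper reports for one run (lambda = 0.005, gamma = 0.043):
toy = lambda lam, gam: (lam - 0.005) ** 2 + (gam - 0.043) ** 2
print(select_hyperparams(toy, [0.001, 0.005, 0.01], [0.01, 0.043, 0.1]))
```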