Phase Collapse in Neural Networks

Authors: Florentin Guth, John Zarka, Stéphane Mallat

ICLR 2022

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Table 1: Error of linear classifiers applied to a scattering (Scat), learned scattering (LScat) and learned scattering with skip connections (+ skip), on CIFAR-10 and ImageNet. The last column gives the single-crop error of ResNet-20 for CIFAR-10 and ResNet-18 for ImageNet, taken from https://pytorch.org/vision/stable/models.html.
Researcher Affiliation | Collaboration | Florentin Guth, John Zarka: DI, ENS, CNRS, PSL University, Paris, France ({florentin.guth,john.zarka}@ens.fr); Stéphane Mallat: Collège de France, Paris, France, and Flatiron Institute, New York, USA
Pseudocode | No | The paper does not contain structured pseudocode or algorithm blocks.
Open Source Code | Yes | The code to reproduce the experiments of the paper is available at https://github.com/FlorentinGuth/PhaseCollapse.
Open Datasets | Yes | This section introduces a learned scattering transform, which is a highly structured CNN architecture relying on phase collapses and reaching ResNet accuracy on the ImageNet (Russakovsky et al., 2015) and CIFAR-10 (Krizhevsky, 2009) datasets.
Dataset Splits | Yes | Classification error on the ImageNet validation set is computed on a single center crop of size 224.
Hardware Specification | Yes | All experiments ran during the preparation of this paper, including preliminary ones, required around 10k 32GB NVIDIA V100 GPU-hours.
Software Dependencies | No | The paper mentions using the 'Kymatio package' and 'SGD' optimizer but does not provide specific version numbers for these or other key software components.
Experiment Setup | Yes | We use the optimizer SGD with an initial learning rate of 0.01, a momentum of 0.9, a weight decay of 0.0001, and a batch size of 128.
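The quoted experiment setup can be made concrete with a minimal sketch of one SGD update using the reported hyperparameters (learning rate 0.01, momentum 0.9, weight decay 0.0001). This is an illustration only, not the authors' code; it follows the common PyTorch-style convention in which weight decay is added to the gradient before the momentum accumulation.

```python
import numpy as np

# Hyperparameters quoted from the paper's experiment setup.
LR, MOMENTUM, WEIGHT_DECAY = 0.01, 0.9, 0.0001

def sgd_step(w, grad, buf):
    """One SGD step with momentum and L2 weight decay.

    Returns the updated weights and momentum buffer.
    """
    g = grad + WEIGHT_DECAY * w   # add L2 weight decay to the raw gradient
    buf = MOMENTUM * buf + g      # accumulate momentum
    return w - LR * buf, buf

# Toy usage on a two-parameter "model".
w = np.array([1.0, -2.0])
buf = np.zeros_like(w)            # momentum buffer starts at zero
w, buf = sgd_step(w, np.array([0.5, 0.5]), buf)
print(w)  # → [ 0.994999 -2.004998]
```

The batch size of 128 would enter only through how `grad` is computed (averaged over a mini-batch), so it does not appear in the update rule itself.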