Perturbation Analysis of Neural Collapse
Authors: Tom Tirer, Haoxiang Huang, Jonathan Niles-Weed
ICML 2023 | Conference PDF | Archive PDF | Plain Text | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We support our theory with experiments in practical deep learning settings. |
| Researcher Affiliation | Academia | (1) Faculty of Engineering, Bar-Ilan University, Ramat Gan, Israel; (2) Courant Institute of Mathematical Sciences, New York University, NY, US. |
| Pseudocode | No | The paper does not contain any sections or figures explicitly labeled 'Pseudocode' or 'Algorithm'. |
| Open Source Code | No | The paper does not provide any statements about releasing code for the methodology or links to a code repository. |
| Open Datasets | Yes | We consider the CIFAR-10 dataset and train an MLP... We consider the CIFAR-10 dataset and examine how modifying the regularization hyperparameters affects the NC behavior of the widely used ResNet18 (He et al., 2016a)... In Figure 4 we consider the MNIST dataset with 3K training samples per class. |
| Dataset Splits | No | The paper mentions using 'training samples per class' for CIFAR-10 and MNIST, but it does not explicitly state the dataset splits (e.g., specific train/validation/test percentages) or refer to standard predefined splits. |
| Hardware Specification | No | The paper does not provide specific hardware details such as GPU models, CPU types, or memory specifications used for running the experiments. |
| Software Dependencies | No | The paper mentions 'default PyTorch initialization' but does not specify version numbers for PyTorch or any other software dependencies. |
| Experiment Setup | Yes | Specifically, as a baseline hyperparameter setting, we consider one that is used in previous works (Papyan et al., 2020; Zhu et al., 2021): default PyTorch initialization of the weights, SGD optimizer with LR 0.05 that is divided by 10 every 40 epochs, momentum of 0.9, and WD of 5e-4 for all the network's parameters. |
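
The baseline hyperparameter setting quoted above maps directly onto a standard PyTorch training configuration. The following is a minimal sketch of that setup, assuming torchvision's ResNet18 adapted to CIFAR-10; the batch size, number of epochs, and normalization statistics are assumptions not stated in the excerpt, and the paper's exact model and training loop are not released.

```python
import torch
import torch.nn as nn
from torch.optim import SGD
from torch.optim.lr_scheduler import StepLR
from torchvision import datasets, transforms, models

# Standard CIFAR-10 normalization (assumption; not specified in the paper excerpt).
transform = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize((0.4914, 0.4822, 0.4465), (0.2470, 0.2435, 0.2616)),
])
train_set = datasets.CIFAR10(root="./data", train=True, download=True, transform=transform)
train_loader = torch.utils.data.DataLoader(train_set, batch_size=128, shuffle=True)

# Default PyTorch initialization, as stated in the paper.
model = models.resnet18(num_classes=10)

# Baseline hyperparameters quoted in the setup: SGD, LR 0.05 divided by 10
# every 40 epochs, momentum 0.9, weight decay 5e-4 on all parameters.
optimizer = SGD(model.parameters(), lr=0.05, momentum=0.9, weight_decay=5e-4)
scheduler = StepLR(optimizer, step_size=40, gamma=0.1)
criterion = nn.CrossEntropyLoss()

for epoch in range(120):  # total epoch count is an assumption
    for images, labels in train_loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
    scheduler.step()
```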