Fractal Structure and Generalization Properties of Stochastic Optimization Algorithms

Authors: Alexander Camuto, George Deligiannidis, Murat A. Erdogdu, Mert Gurbuzbalaban, Umut Simsekli, Lingjiong Zhu

NeurIPS 2021

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | For modern neural networks, we develop an efficient algorithm to compute the developed bound and support our theory with various experiments on neural networks.
Researcher Affiliation | Academia | 1: University of Oxford & Alan Turing Institute; 2: University of Toronto & Vector Institute; 3: Rutgers Business School; 4: INRIA & École Normale Supérieure PSL Research University; 5: Florida State University
Pseudocode | No | The paper does not contain any structured pseudocode or algorithm blocks.
Open Source Code | Yes | Our implementation is available at https://github.com/umutsimsekli/fractal_generalization.
Open Datasets | Yes | trained on CIFAR10, SVHN and Boston House Prices (BHP). (A hedged loading sketch follows the table.)
Dataset Splits | Yes | Did you specify all the training details (e.g., data splits, hyperparameters, how they were chosen)? [Yes] See Section S7 in the Supplementary Document
Hardware Specification | Yes | Did you include the total amount of compute and the type of resources used (e.g., type of GPUs, internal cluster, or cloud provider)? [Yes] See Section S7 in the Supplementary Document
Software Dependencies | No | The paper names PyHessian as an external codebase it uses but does not provide version numbers for its key software dependencies. (A hedged usage sketch follows the table.)
Experiment Setup | Yes | Did you specify all the training details (e.g., data splits, hyperparameters, how they were chosen)? [Yes] See Section S7 in the Supplementary Document
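
The Open Datasets row names CIFAR10, SVHN, and Boston House Prices, but the table does not show how they are obtained. Below is a minimal sketch using standard loaders, not the authors' pipeline (their data code is in the linked repository): torchvision for the two image sets, and a manual parse for Boston, whose scikit-learn loader was removed in version 1.2. The `./data` root and the CMU mirror URL are assumptions.

```python
# Hedged sketch (not the authors' pipeline): standard ways to load the
# three datasets named in the Open Datasets row.
import numpy as np
import pandas as pd
from torchvision import datasets, transforms

to_tensor = transforms.ToTensor()

# CIFAR10 and SVHN ship with torchvision and download on first use.
cifar_train = datasets.CIFAR10(root="./data", train=True, download=True,
                               transform=to_tensor)
svhn_train = datasets.SVHN(root="./data", split="train", download=True,
                           transform=to_tensor)

# Boston House Prices: sklearn.datasets.load_boston was removed in
# scikit-learn 1.2; this is the workaround its deprecation notice
# suggested, parsing the original CMU copy (assumes the mirror is up).
raw = pd.read_csv("http://lib.stat.cmu.edu/datasets/boston",
                  sep=r"\s+", skiprows=22, header=None)
X = np.hstack([raw.values[::2, :], raw.values[1::2, :2]])  # 13 features
y = raw.values[1::2, 2]                                    # median price
```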
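
The Software Dependencies row flags PyHessian as the one named external codebase. The sketch below shows PyHessian's public API as documented in its README (installable via `pip install pyhessian`); the toy model, loss, and random batch are placeholders, and the paper's actual invocation lives in its repository.

```python
# Hedged sketch of the PyHessian API (amirgholami/PyHessian); the model,
# criterion, and batch below are placeholders, not the paper's setup.
import torch
import torch.nn as nn
from pyhessian import hessian

model = nn.Sequential(nn.Linear(13, 64), nn.ReLU(), nn.Linear(64, 1))
criterion = nn.MSELoss()
inputs, targets = torch.randn(32, 13), torch.randn(32, 1)

# hessian() wraps the loss on this batch in a Hessian-vector-product
# oracle; eigenvalues() runs power iteration and trace() runs
# Hutchinson's estimator on top of it.
hess = hessian(model, criterion, data=(inputs, targets), cuda=False)
top_eigenvalues, _ = hess.eigenvalues(top_n=1)
trace_samples = hess.trace()
print(top_eigenvalues, sum(trace_samples) / len(trace_samples))
```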