Notice: The reproducibility variables underlying each score are classified using an automated LLM-based pipeline, validated against a manually labeled dataset. LLM-based classification introduces uncertainty and potential bias; scores should be interpreted as estimates. Full accuracy metrics and methodology are described in [1].

ECLipsE: Efficient Compositional Lipschitz Constant Estimation for Deep Neural Networks

Authors: Yuezhu Xu, S. Sivaranjani

NeurIPS 2024

Reproducibility Variable Result LLM Response
Research Type Experimental We implement our algorithms on randomly generated neural networks and ones trained on the MNIST dataset.
Researcher Affiliation Academia Yuezhu Xu, Edwardson School of Industrial Engineering, Purdue University, West Lafayette, IN, USA, EMAIL; S. Sivaranjani, Edwardson School of Industrial Engineering, Purdue University, West Lafayette, IN, USA, EMAIL
Pseudocode Yes Algorithm 1: ECLipsE and ECLipsE-Fast
Open Source Code Yes https://github.com/YuezhuXu/ECLipsE
Open Datasets Yes We implement our algorithms on randomly generated neural networks and ones trained on the MNIST dataset.
Dataset Splits No The paper does not specify validation dataset splits for either randomly generated networks or the MNIST dataset.
Hardware Specification Yes All experiments are implemented on a Windows laptop with a 12-core CPU with 16GB of RAM.
Software Dependencies No The paper does not specify versions for software dependencies such as programming languages or libraries used for implementation (e.g., Python, PyTorch).
Experiment Setup Yes For training on the MNIST dataset, ... We train neural networks using the SGD optimizer with a learning rate of 0.01 and momentum of 0.9... For randomly generated networks, ... we scale the weights on each layer such that its norm is randomly chosen in [0.4, 1.8], following a uniform distribution.
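The random-network setup quoted above can be sketched in a few lines. This is a hedged illustration, not the authors' code: the paper does not say which norm is used, so this sketch assumes the spectral (operator 2-) norm, and the layer dimensions and seed below are arbitrary choices for the example.

```python
import numpy as np

def random_scaled_layers(dims, norm_range=(0.4, 1.8), seed=0):
    """Generate random weight matrices for consecutive layer sizes in
    `dims`, rescaling each so its spectral norm equals a value drawn
    uniformly from `norm_range` (one reading of the paper's setup)."""
    rng = np.random.default_rng(seed)
    layers = []
    for n_in, n_out in zip(dims[:-1], dims[1:]):
        W = rng.standard_normal((n_out, n_in))
        target = rng.uniform(*norm_range)
        # ord=2 on a matrix gives the spectral norm (largest singular value)
        W *= target / np.linalg.norm(W, ord=2)
        layers.append(W)
    return layers

layers = random_scaled_layers([10, 32, 32, 1])
```

After this rescaling, every layer's spectral norm lies in [0.4, 1.8], which is the property the quoted setup describes.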