ECLipsE: Efficient Compositional Lipschitz Constant Estimation for Deep Neural Networks

Authors: Yuezhu Xu, S. Sivaranjani

NeurIPS 2024

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | We implement our algorithms on randomly generated neural networks and ones trained on the MNIST dataset.
Researcher Affiliation | Academia | Yuezhu Xu, Edwardson School of Industrial Engineering, Purdue University, West Lafayette, IN, USA, xu1732@purdue.edu; S. Sivaranjani, Edwardson School of Industrial Engineering, Purdue University, West Lafayette, IN, USA, sseetha@purdue.edu
Pseudocode | Yes | Algorithm 1: ECLipsE and ECLipsE-Fast
Open Source Code | Yes | https://github.com/YuezhuXu/ECLipsE
Open Datasets | Yes | We implement our algorithms on randomly generated neural networks and ones trained on the MNIST dataset.
Dataset Splits | No | The paper does not specify validation dataset splits for either the randomly generated networks or the MNIST dataset.
Hardware Specification | Yes | All experiments are implemented on a Windows laptop with a 12-core CPU and 16GB of RAM.
Software Dependencies | No | The paper does not specify versions for software dependencies such as the programming languages or libraries used for implementation (e.g., Python, PyTorch).
Experiment Setup | Yes | For training on the MNIST dataset, ... We train neural networks using the SGD optimizer with a learning rate of 0.01 and momentum of 0.9... For randomly generated networks, ... we scale the weights on each layer such that its norm is randomly chosen in [0.4, 1.8], following a uniform distribution.
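Since the paper does not specify its software stack (see the Software Dependencies row above), the following is only a minimal sketch, assuming PyTorch, of the reported experiment setup: SGD with learning rate 0.01 and momentum 0.9 for the MNIST-trained networks, and randomly generated networks whose per-layer weight norm is drawn uniformly from [0.4, 1.8]. The layer sizes are hypothetical, and interpreting "norm" as the spectral norm is an assumption.

    # Sketch only: framework (PyTorch), layer sizes, and norm type are assumptions,
    # not details confirmed by the paper.
    import torch
    import torch.nn as nn

    def make_random_network(layer_sizes, norm_range=(0.4, 1.8)):
        # Build an MLP and rescale each layer's weight matrix so that its
        # norm (assumed spectral norm here) is drawn uniformly from norm_range.
        layers = []
        for i in range(len(layer_sizes) - 1):
            lin = nn.Linear(layer_sizes[i], layer_sizes[i + 1])
            target = torch.empty(1).uniform_(*norm_range).item()  # target norm ~ U[0.4, 1.8]
            with torch.no_grad():
                current = torch.linalg.matrix_norm(lin.weight, ord=2)  # spectral norm
                lin.weight.mul_(target / current)
            layers.append(lin)
            if i < len(layer_sizes) - 2:
                layers.append(nn.ReLU())
        return nn.Sequential(*layers)

    # Hypothetical architecture; the optimizer settings match the reported setup.
    model = make_random_network([784, 100, 100, 10])
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)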