Semialgebraic Optimization for Lipschitz Constants of ReLU Networks

Authors: Tong Chen, Jean B. Lasserre, Victor Magron, Edouard Pauwels

NeurIPS 2020

Reproducibility assessment (variable: result, followed by the LLM's supporting response):
Research Type: Experimental. In this section, we provide results for the global and local Lipschitz constants of random networks of fixed size (80, 80) with various sparsities. We also compute bounds for a real trained 1-hidden-layer network. The complete results for global/local Lipschitz constants of both 1-hidden-layer and 2-hidden-layer networks can be found in Appendices F and G. For all experiments we focus on the L∞-norm, the most interesting case for robustness certification.
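The experiments above bound L∞-norm Lipschitz constants of small ReLU networks. As a hedged illustration (hypothetical random weights, not the paper's semialgebraic relaxation), the following NumPy sketch computes the naive entrywise upper bound on the global L∞ Lipschitz constant of a 1-hidden-layer ReLU network, together with a sampled gradient lower bound:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical random 1-hidden-layer ReLU network of size (80, 80),
# mirroring the random-network experiments described above.
n_in = n_hidden = 80
W1 = rng.standard_normal((n_hidden, n_in)) / np.sqrt(n_in)
b1 = rng.standard_normal(n_hidden)
w2 = rng.standard_normal(n_hidden)  # weights feeding a single output label

# f(x) = w2 . relu(W1 x + b1). Its gradient is W1.T @ (s * w2) with
# s in [0, 1]^n_hidden (the ReLU activation pattern), so the L-infinity
# Lipschitz constant sup_x ||grad f(x)||_1 is at most this entrywise bound:
naive_upper = float((np.abs(W1).T @ np.abs(w2)).sum())

def grad_l1(x):
    # ||grad f(x)||_1 at a differentiable point x
    s = (W1 @ x + b1 > 0).astype(float)
    return float(np.abs(W1.T @ (s * w2)).sum())

# Any sampled gradient norm is a LOWER bound on the true constant; the
# gap to naive_upper is what tighter relaxations, such as the paper's,
# aim to shrink.
sampled_lower = max(grad_l1(rng.standard_normal(n_in)) for _ in range(200))
```

Here `sampled_lower <= L <= naive_upper` for the true constant L; the paper's semialgebraic relaxations produce certified upper bounds much closer to L than this naive bound.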
Researcher Affiliation: Academia. Tong Chen (LAAS-CNRS, Université de Toulouse, 31400 Toulouse, France, tchen@laas.fr); Jean-Bernard Lasserre (LAAS-CNRS & IMT, Université de Toulouse, 31400 Toulouse, France, lasserre@laas.fr); Victor Magron (LAAS-CNRS, Université de Toulouse, 31400 Toulouse, France, vmagron@laas.fr); Edouard Pauwels (IRIT & IMT, Université de Toulouse, 31400 Toulouse, France, edouard.pauwels@irit.fr).
Pseudocode: No. No structured pseudocode or algorithm blocks were found in the paper.
Open Source Code: No. The paper mentions implementing its algorithms with third-party tools (Python code by [21], YALMIP, Julia with MOSEK) but provides neither a link to, nor an explicit statement about releasing, its own source code.
Open Datasets: Yes. We use the MNIST classifier (SDP-NN) described in [30]. The network is of size (784, 500).
Dataset Splits: No. The paper mentions '10000 MNIST test data' and refers to a 'trained network', but does not specify training/validation/test splits (e.g., percentages or counts) for its experiments.
Hardware Specification: Yes. All experiments are run on a personal laptop with a 4-core i5-6300HQ 2.3GHz CPU and 8GB of RAM.
Software Dependencies: No. The paper mentions software such as Python, YALMIP (MATLAB), Julia, Gurobi, and MOSEK but does not give version numbers. Citations like 'Julia [4]' and 'YALMIP [23]' refer to the papers introducing these tools, not to the specific versions used in the experiments.
Experiment Setup: Yes. In consideration of numerical issues, we set Ω to be the ball of radius 10 around the origin. For the local Lipschitz constant, we set the radius of the input ball to ε = 0.1 by default. In both cases, we compute the Lipschitz constant with respect to the first label.
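To make the local setup concrete, here is a hedged sketch (hypothetical weights and center point, not the paper's method) that estimates a lower bound on the local L∞-norm Lipschitz constant of one output label by sampling gradient norms inside the ε = 0.1 input ball:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical 1-hidden-layer network and input point; the setup above
# uses an input ball of radius eps = 0.1 and reads off the first label.
n_in = n_hidden = 80
W1 = rng.standard_normal((n_hidden, n_in)) / np.sqrt(n_in)
b1 = rng.standard_normal(n_hidden)
w2 = rng.standard_normal(n_hidden)

eps = 0.1                        # radius of the local input ball
x0 = rng.standard_normal(n_in)   # hypothetical center of the ball

def grad_l1(x):
    # Gradient of x -> w2 . relu(W1 x + b1); its 1-norm bounds the local
    # variation of the label under L-infinity input perturbations.
    s = (W1 @ x + b1 > 0).astype(float)
    return float(np.abs(W1.T @ (s * w2)).sum())

# Gradients sampled inside the L-infinity ball give a lower bound on the
# local Lipschitz constant; certified upper bounds require a relaxation
# such as the semialgebraic one evaluated in the paper.
samples = x0 + rng.uniform(-eps, eps, size=(500, n_in))
local_lower = max(grad_l1(x) for x in samples)
```

Because only a few activation patterns change inside a small ball, `local_lower` is typically far below the global constant, which is exactly why local bounds are the tighter tool for robustness certification around a given input.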