Direct Parameterization of Lipschitz-Bounded Deep Networks

Authors: Ruigang Wang, Ian Manchester

ICML 2023

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | A comprehensive set of experiments on image classification shows that sandwich layers outperform previous approaches on both empirical and certified robust accuracy. Our experiments have two goals: first, to illustrate that our model parameterization can provide tight Lipschitz bounds via a simple curve-fitting task; second, to examine the performance and scalability of the proposed method on robust image classification tasks.
Researcher Affiliation | Academia | Australian Centre for Robotics, School of Aerospace, Mechanical and Mechatronic Engineering, The University of Sydney, Sydney, NSW 2006, Australia.
Pseudocode | Yes | Algorithm 1: 1-Lipschitz convolutional layer.
Open Source Code | Yes | Code is available at https://github.com/acfr/LBDN.
Open Datasets | Yes | We conducted a set of empirical robustness experiments on the CIFAR-10/100 and Tiny-ImageNet datasets.
Dataset Splits | No | For the curve-fitting experiment, we take 300 and 200 samples (xi, yi) with xi ~ U([−2, 2]) for training and testing, respectively (a data-generation sketch follows the table below). However, for the main image classification tasks, no explicit train/validation/test splits (e.g., percentages or counts) are provided in the text.
Hardware Specification | Yes | All experiments were performed on an NVIDIA A5000.
Software Dependencies | No | PyTorch code is available at https://github.com/acfr/LBDN. While PyTorch is mentioned, no specific version number for PyTorch or any other software dependency is provided.
Experiment Setup | Yes | For all experiments, we used a piecewise triangular learning rate (Coleman et al., 2017) with a maximum rate of 0.01. We use Adam (Kingma & Ba, 2014) and ReLU as our default optimizer and activation, respectively. ... We use a batch size of 50 and Lipschitz bounds of 1, 5 and 10. ... All models are trained with normalized input data for 100 epochs (a training-loop sketch also follows the table).
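
The following is a minimal sketch, not the authors' code, of the curve-fitting data generation quoted in the Dataset Splits row (300 training and 200 test samples with xi ~ U([−2, 2])), plus a simple finite-difference check of an empirical Lipschitz lower bound. The target function f, the network net, and the helper empirical_lipschitz are illustrative assumptions, not names from the paper or repository.

import torch

torch.manual_seed(0)

def f(x):
    # hypothetical 1-D target curve; the paper excerpt above does not specify it
    return torch.sign(x) * torch.sqrt(torch.abs(x))

# x_i ~ U([-2, 2]): 300 training and 200 test samples, as quoted above
x_train = 4.0 * torch.rand(300, 1) - 2.0
y_train = f(x_train)
x_test = 4.0 * torch.rand(200, 1) - 2.0
y_test = f(x_test)

def empirical_lipschitz(net, x, eps=1e-3):
    """Lower-bound the Lipschitz constant via finite differences on perturbed pairs."""
    x2 = x + eps * torch.randn_like(x)
    with torch.no_grad():
        dy = (net(x2) - net(x)).norm(dim=1)
        dx = (x2 - x).norm(dim=1)
    return (dy / dx.clamp_min(1e-12)).max().item()

Such an empirical estimate can be compared against the certified bound of a Lipschitz-constrained model to judge how tight the parameterization is on the fitted curve.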
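Below is a hedged sketch, under assumptions, of the training configuration quoted in the Experiment Setup row: Adam, a piecewise triangular learning rate peaking at 0.01, batch size 50, 100 epochs, and normalized inputs. The model here is a plain placeholder standing in for the authors' Sandwich-layer network (see https://github.com/acfr/LBDN for the actual implementation), and the normalization statistics are assumed CIFAR-10 values, not taken from the paper.

import torch
from torch import nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

transform = transforms.Compose([
    transforms.ToTensor(),
    # "normalized input data"; these per-channel statistics are an assumption
    transforms.Normalize((0.4914, 0.4822, 0.4465), (0.2470, 0.2435, 0.2616)),
])
train_set = datasets.CIFAR10("data", train=True, download=True, transform=transform)
loader = DataLoader(train_set, batch_size=50, shuffle=True)

# placeholder model only; the paper uses Lipschitz-bounded Sandwich layers instead
model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 256), nn.ReLU(), nn.Linear(256, 10))
opt = torch.optim.Adam(model.parameters(), lr=0.01)

epochs = 100
steps = epochs * len(loader)
# piecewise triangular schedule: linear warm-up to the peak rate, then linear decay
sched = torch.optim.lr_scheduler.LambdaLR(
    opt, lambda t: t / (0.5 * steps) if t < 0.5 * steps else max(0.0, (steps - t) / (0.5 * steps))
)

loss_fn = nn.CrossEntropyLoss()
for epoch in range(epochs):
    for x, y in loader:
        opt.zero_grad()
        loss_fn(model(x), y).backward()
        opt.step()
        sched.step()

The LambdaLR-based schedule is one way to realize the piecewise triangular learning rate of Coleman et al. (2017); the authors' exact schedule shape is not specified in the excerpt above.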