A PAC-Bayes Analysis of Adversarial Robustness

Authors: Paul Viallard, Eric Guillaume Vidot, Amaury Habrard, Emilie Morvant

NeurIPS 2021 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | In this section, we illustrate the soundness of our framework in the context of learning differentiable decision trees. We empirically show that our PAC-Bayesian framework for adversarial robustness provides generalization guarantees with non-vacuous bounds on the adversarial risk.
Researcher Affiliation | Collaboration | Paul Viallard1, Guillaume Vidot2,3, Amaury Habrard1, Emilie Morvant1. 1 Univ Lyon, UJM-Saint-Etienne, CNRS, Institut d'Optique Graduate School, Laboratoire Hubert Curien UMR 5516, F-42023, SAINT-ETIENNE, France; 2 Airbus Opérations S.A.S.; 3 University of Toulouse, Institut de Recherche en Informatique de Toulouse, France
Pseudocode | Yes | Algorithm 1: Average Adversarial Training with Guarantee
Open Source Code | Yes | The source code is available at https://github.com/paulviallard/NeurIPS21-PB-Robustness.
Open Datasets | Yes | We perform our experiments on six binary classification tasks from MNIST [LeCun et al., 1998] (1vs7, 4vs9, 5vs6) and Fashion-MNIST [Xiao et al., 2017] (Coat vs Shirt, Sandal vs Ankle Boot, Top vs Pullover). A dataset-construction sketch is given after the table.
Dataset Splits | No | The paper mentions learning sets S0 and S, and a test set T, but does not explicitly specify a separate validation split.
Hardware Specification | No | The paper does not provide specific details about the hardware used for the experiments, such as GPU or CPU models.
Software Dependencies | No | The paper mentions the use of the Adam optimizer and the Xavier initializer, but does not provide specific software dependencies with version numbers (e.g., Python, PyTorch, or TensorFlow versions).
Experiment Setup | Yes | For the two steps, we use the Adam optimizer [Kingma and Ba, 2015] for T = T0 = 20 epochs with a learning rate of 10^-2 and a batch size of 64. We fix the number of iterations at k = 20 and the step size at b/k for PGD and IFGSM (where b = 1 for the ℓ2-norm and b = 0.1 for the ℓ∞-norm). A training-loop sketch using these settings is given after the table.
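
The six binary tasks above are class-pair restrictions of MNIST and Fashion-MNIST. As an illustration only (this is not the authors' released code), a minimal torchvision-based sketch for building one such pair, e.g. MNIST 1vs7, is given below; the relabeling to {-1, +1} and the flattening to 784-dimensional vectors are assumptions of this example.

    import torch
    from torchvision import datasets

    def binary_task(dataset_cls, pos_label, neg_label, train=True, root="./data"):
        # Load the full dataset, then keep only the two chosen classes.
        data = dataset_cls(root=root, train=train, download=True)
        mask = (data.targets == pos_label) | (data.targets == neg_label)
        x = data.data[mask].float().div(255.0).flatten(1)     # (n, 784), pixels in [0, 1]
        y = (data.targets[mask] == pos_label).long() * 2 - 1  # labels in {-1, +1}
        return torch.utils.data.TensorDataset(x, y)

    # MNIST 1vs7; the other pairs (4vs9, 5vs6, and the Fashion-MNIST pairs)
    # are built the same way with different class indices.
    train_set = binary_task(datasets.MNIST, pos_label=1, neg_label=7, train=True)
    test_set = binary_task(datasets.MNIST, pos_label=1, neg_label=7, train=False)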
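
Algorithm 1 itself is stated in the paper; the sketch below only illustrates a generic PGD-based adversarial training loop wired to the hyperparameters reported in the setup row (Adam, learning rate 10^-2, batch size 64, 20 epochs, k = 20 attack iterations, step size b/k with b = 1 for the ℓ2 attack). The binary-logit model, the loss, and the ℓ2 projection are assumptions of this example, and the bound-minimization and guarantee parts of the paper's algorithm are not reproduced.

    import torch
    import torch.nn.functional as F

    def pgd_l2_attack(model, x, y01, b=1.0, k=20):
        # k gradient steps of size b/k, each followed by a projection onto
        # the l2 ball of radius b around the clean (flattened) input x.
        delta = torch.zeros_like(x, requires_grad=True)
        step = b / k
        for _ in range(k):
            loss = F.binary_cross_entropy_with_logits(model(x + delta).squeeze(1), y01)
            grad, = torch.autograd.grad(loss, delta)
            grad = grad / grad.flatten(1).norm(dim=1).clamp_min(1e-12).view(-1, 1)
            delta = (delta + step * grad).detach()
            norms = delta.flatten(1).norm(dim=1).clamp_min(1e-12).view(-1, 1)
            delta = (delta * torch.clamp(b / norms, max=1.0)).requires_grad_(True)
        return (x + delta).detach()

    def adversarial_train(model, loader, epochs=20, lr=1e-2):
        # loader is assumed to yield (x, y) batches (batch_size=64, labels in {-1, +1}).
        opt = torch.optim.Adam(model.parameters(), lr=lr)
        for _ in range(epochs):
            for x, y in loader:
                y01 = (y > 0).float()              # map {-1, +1} labels to {0, 1}
                x_adv = pgd_l2_attack(model, x, y01)
                loss = F.binary_cross_entropy_with_logits(model(x_adv).squeeze(1), y01)
                opt.zero_grad()
                loss.backward()
                opt.step()
        return model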