On the Convergence of Certified Robust Training with Interval Bound Propagation
Authors: Yihan Wang, Zhouxing Shi, Quanquan Gu, Cho-Jui Hsieh
ICLR 2022
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | In this paper, we present a theoretical analysis on the convergence of IBP training... We further conduct experiments to compare the convergence of networks with different widths m for natural training and IBP training respectively. |
| Researcher Affiliation | Academia | Yihan Wang*, Zhouxing Shi*, Quanquan Gu, Cho-Jui Hsieh University of California, Los Angeles {yihanwang,zshi,qgu,chohsieh}@cs.ucla.edu |
| Pseudocode | No | The paper does not contain any structured pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not provide any statement or link regarding the availability of its source code. |
| Open Datasets | Yes | We use the MNIST (LeCun et al., 2010) dataset and take digit images with label 2 and 5 for binary classification. |
| Dataset Splits | No | The paper mentions training the model but does not specify training, validation, or test dataset splits. |
| Hardware Specification | Yes | even if we enlarge m up to 80,000 limited by the memory of a single GeForce RTX 2080 GPU |
| Software Dependencies | No | The paper does not specify any software dependencies with version numbers. |
| Experiment Setup | Yes | We train the model for 70 epochs with SGD, and we keep ϵ fixed throughout the whole training process. |
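
The "Open Datasets", "Hardware Specification", and "Experiment Setup" rows above describe the paper's empirical protocol: a binary MNIST task (digits 2 and 5), networks of varying width m, and 70 epochs of SGD with the perturbation radius ϵ kept fixed. Since the paper releases no code, the snippet below is only a minimal sketch of IBP certified training consistent with that description; the width m, ϵ, learning rate, and batch size are illustrative assumptions, not values reported by the authors.

```python
# A minimal sketch of IBP certified training on the binary MNIST task described
# above (digits 2 vs. 5, a two-layer ReLU network of width m, 70 epochs of SGD,
# eps fixed throughout training). This is NOT the authors' code; m, eps, the
# learning rate, and the batch size below are illustrative assumptions.
import torch
import torchvision
from torch import nn
from torch.nn import functional as F
from torchvision import transforms


def ibp_forward(layers, x, eps):
    """Propagate the interval [x - eps, x + eps] through Linear/ReLU layers."""
    lb, ub = x - eps, x + eps
    for layer in layers:
        if isinstance(layer, nn.Linear):
            center, radius = (ub + lb) / 2, (ub - lb) / 2
            new_center = F.linear(center, layer.weight, layer.bias)
            new_radius = F.linear(radius, layer.weight.abs())
            lb, ub = new_center - new_radius, new_center + new_radius
        elif isinstance(layer, nn.ReLU):
            lb, ub = F.relu(lb), F.relu(ub)
    return lb, ub


def ibp_loss(layers, x, y, eps):
    """Cross-entropy on the worst-case logits implied by the interval bounds."""
    lb, ub = ibp_forward(layers, x, eps)
    # Lower bound for the true class, upper bound for every other class.
    onehot = F.one_hot(y, num_classes=ub.shape[-1]).bool()
    worst_logits = torch.where(onehot, lb, ub)
    return F.cross_entropy(worst_logits, y)


# Binary MNIST: keep digits 2 and 5 and relabel them as classes 0 and 1.
mnist = torchvision.datasets.MNIST("data", train=True, download=True,
                                   transform=transforms.ToTensor())
mask = (mnist.targets == 2) | (mnist.targets == 5)
mnist.data, mnist.targets = mnist.data[mask], (mnist.targets[mask] == 5).long()
loader = torch.utils.data.DataLoader(mnist, batch_size=128, shuffle=True)

m, eps = 5000, 0.1  # illustrative; the paper varies m (up to 80,000 on one GPU)
model = nn.Sequential(nn.Linear(784, m), nn.ReLU(), nn.Linear(m, 2))
opt = torch.optim.SGD(model.parameters(), lr=0.01)

for epoch in range(70):  # 70 epochs of SGD with eps kept fixed, as quoted above
    for x, y in loader:
        opt.zero_grad()
        loss = ibp_loss(model, x.flatten(1), y, eps)
        loss.backward()
        opt.step()
```

The key design point of IBP is visible in `ibp_forward`: each linear layer maps the interval's center through the weights and its radius through the element-wise absolute weights, so bounds stay cheap to compute and fully differentiable, which is what makes training against the worst-case logits in `ibp_loss` feasible.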