Model Compression with Adversarial Robustness: A Unified Optimization Framework
Authors: Shupeng Gui, Haotao Wang, Haichuan Yang, Chen Yu, Zhangyang Wang, Ji Liu
NeurIPS 2019
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | To demonstrate that ATMC achieves remarkably favorable trade-offs between robustness and model compactness, we carefully design experiments on a variety of popular datasets and models as summarized in Section 3.1. |
| Researcher Affiliation | Collaboration | Department of Computer Science, University of Rochester; Department of Computer Science and Engineering, Texas A&M University; Ytech Seattle AI Lab, FeDA Lab, AI Platform, Kwai Inc. |
| Pseudocode | Yes | Algorithm 1 ZeroKmeansB(U) and Algorithm 2 ATMC (a hedged sketch of the former follows the table). |
| Open Source Code | Yes | The codes are publicly available at: https://github.com/shupenggui/ATMC. |
| Open Datasets | Yes | LeNet on the MNIST dataset [59]; ResNet34 [60] on CIFAR-10 [61] and CIFAR-100 [61]; and WideResNet [62] on SVHN [63]. |
| Dataset Splits | No | The paper mentions training and testing sets, but does not explicitly provide details about a distinct validation dataset split or its size/percentage. |
| Hardware Specification | No | The paper does not specify the hardware used for running the experiments, such as GPU models, CPU types, or memory details. |
| Software Dependencies | No | The paper implicitly suggests a PyTorch implementation but does not specify versions for software dependencies such as PyTorch, Python, or CUDA. |
| Experiment Setup | Yes | Unless otherwise specified, we set the perturbation magnitude ε to be 76 for MNIST and 4 for the other three datasets. (The color scale of each channel is between 0 and 255.) Following the settings in [32], we set PGD attack iteration numbers n to be 16 for MNIST and 7 for the other three datasets. We follow [30] to set PGD attack step size α = min(ε + 4, 1.25ε)/n. We train ATMC for 50, 150, 150, 80 epochs on MNIST, CIFAR-10, CIFAR-100 and SVHN respectively. (A hedged sketch of this PGD configuration follows the table.) |
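The pseudocode row names Algorithm 1 ZeroKmeansB(U), but the paper's exact procedure is not reproduced in this report. The sketch below is one plausible reading of the name, assuming it denotes k-means quantization of a weight vector `U` into `2**b` levels with one centroid pinned at zero, so that pruned weights stay exactly zero. The function `zero_kmeans_b` and its parameters are illustrative, not the authors' code.

```python
# Hedged sketch: k-means codebook quantization with a fixed zero centroid.
# This is an assumed reading of "ZeroKmeansB(U)", not the paper's procedure.
import numpy as np

def zero_kmeans_b(u, b=4, n_iter=20):
    k = 2 ** b
    # Centroid 0 is pinned at zero; the rest are spread over the value range.
    centers = np.linspace(u.min(), u.max(), k)
    centers[0] = 0.0
    for _ in range(n_iter):
        # Assignment step: map each weight to its nearest centroid.
        assign = np.argmin(np.abs(u[:, None] - centers[None, :]), axis=1)
        # Update step: recompute non-zero centroids; keep centroid 0 at zero.
        for j in range(1, k):
            members = u[assign == j]
            if members.size > 0:
                centers[j] = members.mean()
    return centers[assign], centers  # quantized weights and the codebook
```

In ATMC, quantization is applied jointly with pruning and adversarial training inside one constrained optimization; this sketch only illustrates the quantization step in isolation.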
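The experiment-setup row quotes the PGD hyperparameters: perturbation magnitude ε (76 for MNIST, 4 elsewhere, on a 0-255 color scale), iteration count n (16 for MNIST, 7 elsewhere), and step size α = min(ε + 4, 1.25ε)/n. The sketch below is a minimal ℓ∞ PGD loop under those settings, not the authors' released code; `model`, `images`, and `labels` are hypothetical placeholders, and pixel values are assumed normalized to [0, 1].

```python
# Minimal l_inf PGD sketch under the quoted settings (assumed, not the
# authors' implementation). eps_255 is given on the paper's 0-255 scale.
import torch
import torch.nn.functional as F

def pgd_attack(model, images, labels, eps_255=4.0, n_iter=7):
    eps = eps_255 / 255.0
    # Step size rule quoted in the setup: α = min(ε + 4, 1.25ε) / n.
    alpha = min(eps_255 + 4.0, 1.25 * eps_255) / n_iter / 255.0
    x_adv = images.clone().detach()
    for _ in range(n_iter):
        x_adv.requires_grad_(True)
        loss = F.cross_entropy(model(x_adv), labels)
        grad, = torch.autograd.grad(loss, x_adv)
        x_adv = x_adv.detach() + alpha * grad.sign()        # ascend the loss
        x_adv = images + (x_adv - images).clamp(-eps, eps)  # project to the ε-ball
        x_adv = x_adv.clamp(0.0, 1.0)                       # keep valid pixels
    return x_adv.detach()
```

For example, with ε = 4 and n = 7 the rule gives α = min(8, 5)/7 = 5/7 per step on the 0-255 scale, i.e. roughly 0.0028 after normalizing to [0, 1].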