On Explaining Neural Network Robustness with Activation Path
Authors: Ziping Jiang
ICLR 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | 5 EXPERIMENTS |
| Researcher Affiliation | Academia | Ziping Jiang School of Computing and Communications, Lancaster University {z.jiang7}@lancaster.ac.uk |
| Pseudocode | Yes | Algorithm 1 Smoothed Classifier with Repressed Float Path |
| Open Source Code | Yes | Code is provided at: https://github.com/OrangeBai/APCT-master |
| Open Datasets | Yes | For the CIFAR10 dataset, we compare our method with the benchmark classifier proposed by Cohen et al. (2019) with VGG16 network to show the effectiveness of our method. For the ImageNet dataset, we choose ResNet50 as model architecture... |
| Dataset Splits | No | For the CIFAR10 dataset, we compare our method with the benchmark classifier proposed by Cohen et al. (2019) with VGG16 network to show the effectiveness of our method. For the ImageNet dataset, we choose ResNet50 as model architecture and add the adversarial smoothed classifier Salman et al. (2019) as the base model. We also compare our model with recent works (Jeong & Shin (2020); Jeong et al. (2021)) to obtain a general evaluation. In line with previous works, we use certifiable accuracy at different radius computed by Cohen et al. (2019) as the metric. |
| Hardware Specification | No | The paper does not mention any specific hardware (e.g., GPU model, CPU type, memory) used for running the experiments. |
| Software Dependencies | No | The paper does not provide specific software dependencies with version numbers, such as 'PyTorch 1.9' or 'Python 3.8'. |
| Experiment Setup | Yes | Each of the models is trained for 200 epochs with SGD optimizer and an initial learning rate of 0.1, which decays after 60, 120, and 160 epochs with a rate of 0.2. |
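The reported training configuration (SGD, initial learning rate 0.1, multiplied by 0.2 after epochs 60, 120, and 160 over a 200-epoch run) can be sketched as a plain step-decay function. This is an illustrative reconstruction of the schedule as described, not the authors' code; the function name and defaults are assumptions.

```python
# Sketch of the reported learning-rate schedule (illustrative, not from the
# paper's repository): SGD starts at lr = 0.1 and the rate is multiplied by
# 0.2 once each milestone epoch (60, 120, 160) has been reached.

def step_decay_lr(epoch, base_lr=0.1, milestones=(60, 120, 160), gamma=0.2):
    """Return the learning rate in effect at a given (0-indexed) epoch."""
    lr = base_lr
    for m in milestones:
        if epoch >= m:
            lr *= gamma
    return lr

# Learning rate over the 200-epoch run described in the paper.
schedule = [step_decay_lr(e) for e in range(200)]
```

With a framework such as PyTorch, the same schedule would typically be expressed with `torch.optim.lr_scheduler.MultiStepLR(optimizer, milestones=[60, 120, 160], gamma=0.2)`.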