Dynamic Byzantine-Robust Learning: Adapting to Switching Byzantine Workers
Authors: Ron Dorfman, Naseem Amin Yehya, Kfir Yehuda Levy
ICML 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Finally, in Section 6, we explore the practical aspects and benefits of our approach through experiments on image classification tasks with two dynamic identity-switching strategies. |
| Researcher Affiliation | Academia | 1Department of Electrical and Computer Engineering, Technion, Haifa, Israel. |
| Pseudocode | Yes | Algorithm 1: Byzantine-Robust Optimization with MLMC (a hedged sketch of the MLMC estimator follows the table) |
| Open Source Code | No | The paper does not contain any statement or link providing concrete access to source code for the methodology described. |
| Open Datasets | Yes | We study image classification on the MNIST (LeCun et al., 1998) and CIFAR-10 (Krizhevsky et al., 2009) datasets |
| Dataset Splits | No | The paper mentions using MNIST and CIFAR-10 datasets and reports test accuracy, but it does not explicitly provide the training, validation, and test dataset splits or their sizes. |
| Hardware Specification | Yes | We run all experiments on a machine with a single NVIDIA GeForce RTX 4090 GPU. |
| Software Dependencies | No | The paper mentions using CNN architectures and training details but does not provide specific version numbers for any software components or libraries like Python, PyTorch, or CUDA. |
| Experiment Setup | Yes | Additional training details are deferred to Appendix J for brevity. ... Table 2. Training details and hyperparameters: Learning rate: 10× drop after 4000 iterations (MNIST) / after 6000 iterations (CIFAR-10); Weight decay: 10⁻⁴; Base batch size: 32 (MNIST) / 64 (CIFAR-10) |
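The Pseudocode row cites Algorithm 1, "Byzantine-Robust Optimization with MLMC." The paper's exact procedure is not reproduced here; below is a minimal sketch of the generic multi-level Monte Carlo (MLMC) gradient-estimation idea that the algorithm name refers to. The function name `grad_at_level` and the parameter `max_level` are illustrative assumptions, not the paper's interface: `grad_at_level(j)` is assumed to return a gradient estimate built from `2**j` samples (e.g., a robustly aggregated batch).

```python
import numpy as np

def mlmc_gradient(grad_at_level, max_level=5, rng=None):
    """Sketch of a multi-level Monte Carlo (MLMC) gradient estimator.

    Assumption: grad_at_level(j) returns an estimate averaged over 2**j
    samples; names and signature are hypothetical, not the paper's API.
    """
    rng = np.random.default_rng() if rng is None else rng
    j = rng.geometric(0.5)          # P(J = j) = 2**-j for j = 1, 2, ...
    g = grad_at_level(0)            # cheap baseline estimate
    if j <= max_level:
        # Importance-weighted correction: the 2**j factor cancels the
        # 2**-j sampling probability, so E[g] equals the mean of the
        # most accurate (level max_level) estimator.
        g = g + (2 ** j) * (grad_at_level(j) - grad_at_level(j - 1))
    return g
```

Under these assumptions the estimator is unbiased for the level-`max_level` mean while its expected per-call sample cost is only O(`max_level`), since each level `j` is drawn with probability `2**-j` but costs `O(2**j)` samples.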
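The Experiment Setup row condenses the paper's Table 2. As a minimal sketch of how such a step-wise schedule is commonly expressed, the snippet below uses PyTorch's `MultiStepLR`; the base learning rate of 0.1 and the toy model are assumptions (the extracted text only preserves the 10× drop points, weight decay, and batch sizes).

```python
import torch

# Hypothetical model; lr=0.1 is an assumption, weight decay 1e-4 and the
# MNIST milestone of 4000 iterations follow Table 2 as quoted above.
model = torch.nn.Linear(28 * 28, 10)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1, weight_decay=1e-4)
scheduler = torch.optim.lr_scheduler.MultiStepLR(
    optimizer, milestones=[4000], gamma=0.1)   # use [6000] for CIFAR-10

for step in range(8000):
    # ... forward/backward pass with base batch size 32 (MNIST) ...
    optimizer.step()
    scheduler.step()   # the 10x drop is per iteration, not per epoch
```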