Defending Against Saddle Point Attack in Byzantine-Robust Distributed Learning
Authors: Dong Yin, Yudong Chen, Kannan Ramchandran, Peter Bartlett
ICML 2019
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | We develop Byzantine PGD, a robust first-order algorithm that can provably escape saddle points and fake local minima, and converge to an approximate true local minimizer with low iteration complexity. As a by-product, we give a simpler algorithm and analysis for escaping saddle points in the usual non-Byzantine setting. We further discuss three robust gradient estimators that can be used in Byzantine PGD, including median, trimmed mean, and iterative filtering. We characterize their performance in concrete statistical settings, and argue for their near-optimality in low- and high-dimensional regimes. (A hedged sketch of the median and trimmed-mean estimators appears below the table.) |
| Researcher Affiliation | Academia | (1) Department of Electrical Engineering and Computer Sciences, UC Berkeley, Berkeley, CA, USA; (2) School of Operations Research and Information Engineering, Cornell University, Ithaca, NY, USA; (3) Department of Statistics, UC Berkeley, Berkeley, CA, USA. |
| Pseudocode | Yes | Algorithm 1. Byzantine Perturbed Gradient Descent (Byzantine PGD). (An illustrative sketch of the main loop appears below the table.) |
| Open Source Code | No | The paper does not mention releasing any source code or provide links to a code repository. |
| Open Datasets | No | The paper is theoretical and does not use or describe specific public datasets for training. |
| Dataset Splits | No | The paper is theoretical and does not describe dataset splits for training, validation, or testing. |
| Hardware Specification | No | The paper is theoretical and does not mention any specific hardware used for experiments. |
| Software Dependencies | No | The paper is theoretical and does not specify any software dependencies with version numbers. |
| Experiment Setup | No | The paper is theoretical and focuses on algorithm design and theoretical guarantees, not on experimental setup details like hyperparameters. |
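
For concreteness, here is a minimal sketch of the two simpler robust gradient estimators discussed in the paper: the coordinate-wise median and the coordinate-wise trimmed mean. The function names and the NumPy interface are our own illustration; the paper provides no reference implementation, and the iterative filtering estimator is omitted here.

```python
import numpy as np

def coordinate_median(gradients):
    # gradients: (m, d) array, one row per worker machine.
    # The coordinate-wise median tolerates a minority of
    # arbitrarily corrupted (Byzantine) rows.
    return np.median(gradients, axis=0)

def coordinate_trimmed_mean(gradients, beta):
    # Coordinate-wise beta-trimmed mean: in each coordinate, drop
    # the floor(beta * m) largest and smallest values, then average
    # the remainder. Requires beta < 1/2.
    m = gradients.shape[0]
    k = int(np.floor(beta * m))
    sorted_g = np.sort(gradients, axis=0)
    return sorted_g[k:m - k].mean(axis=0)
```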
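Below is a minimal sketch of a Byzantine-robust perturbed gradient descent loop in the spirit of Algorithm 1. It is a simplification under stated assumptions: the paper's ByzantinePGD performs repeated perturbation rounds with a function-decrease check, whereas this sketch perturbs once whenever the aggregated gradient is small, and all parameter names (`eta`, `g_thresh`, `radius`) are hypothetical.

```python
import numpy as np

def byzantine_pgd(workers, x0, aggregate, eta=0.1, g_thresh=1e-3,
                  radius=1e-2, n_iters=500, rng=None):
    # workers: list of callables, each returning a (possibly Byzantine)
    # gradient estimate at x. aggregate: a robust estimator such as
    # coordinate_median from the previous sketch.
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        grads = np.stack([w(x) for w in workers])
        g = aggregate(grads)
        if np.linalg.norm(g) <= g_thresh:
            # Near a first-order stationary point: jump to a uniformly
            # random point on a small sphere to escape saddle points.
            u = rng.normal(size=x.shape)
            x = x + radius * u / np.linalg.norm(u)
        else:
            x = x - eta * g
    return x
```

A caller would pass, e.g., `aggregate=coordinate_median`, or `lambda g: coordinate_trimmed_mean(g, 0.1)` for the trimmed-mean variant.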