Communication-Aware Collaborative Learning

Authors: Avrim Blum, Shelby Heinecke, Lev Reyzin

AAAI 2021, pp. 6786-6793

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | In this paper, we study collaborative PAC learning with the goal of reducing communication cost at essentially no penalty to the sample complexity. We develop communication-efficient collaborative PAC learning algorithms using distributed boosting. We then consider the communication cost of collaborative learning in the presence of classification noise. As an intermediate step, we show how collaborative PAC learning algorithms can be adapted to handle classification noise. With this insight, we develop communication-efficient algorithms for collaborative PAC learning robust to classification noise. (An illustrative sketch of one collaborative round appears after the table.)
Researcher Affiliation | Collaboration | Avrim Blum (Toyota Technological Institute at Chicago, avrim@ttic.edu); Shelby Heinecke (Salesforce Research, shelby.heinecke@salesforce.com); Lev Reyzin (University of Illinois at Chicago, lreyzin@uic.edu)
Pseudocode | Yes | The paper provides Algorithm 1: Personalized Learning (Blum et al. 2017) and Algorithm 2: Personalized Learning with Classification Noise. (A sketch of the noise-adjusted acceptance test appears after the table.)
Open Source Code | No | The paper does not contain any statements about releasing code or links to source code repositories for the described methodology.
Open Datasets | No | The paper is theoretical and does not conduct experiments with specific datasets; therefore, it does not provide information about public dataset availability for its own work.
Dataset Splits | No | The paper is theoretical and does not describe empirical experiments or dataset splits for training, validation, or testing.
Hardware Specification | No | The paper does not mention any specific hardware used for running experiments.
Software Dependencies | No | The paper does not specify any software dependencies with version numbers.
Experiment Setup | No | The paper is theoretical and does not include details on experimental setup, hyperparameters, or system-level training settings.
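
For intuition about the round structure the paper builds on, here is a minimal sketch of a personalized collaborative learning round in the style of Blum et al. (2017): pool samples from the players who do not yet hold a good classifier, learn one hypothesis, and hand it only to the players on whose distributions it tests well. The 1-D threshold task, the player distributions, and every constant below are illustrative assumptions, not the paper's construction; the actual algorithm is stated for arbitrary PAC learners with carefully chosen sample sizes and test thresholds.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: K players share a 1-D threshold-learning task, but
# each player draws points from its own distribution (here, a Gaussian
# around a player-specific center). All parameters are illustrative.
K, EPSILON, TRUE_THRESHOLD = 8, 0.1, 0.5
centers = rng.uniform(0.2, 0.8, size=K)

def draw(player, n):
    """Draw n labeled (noiseless) examples from one player's distribution."""
    x = np.clip(rng.normal(centers[player], 0.15, size=n), 0.0, 1.0)
    return x, (x >= TRUE_THRESHOLD).astype(int)

def learn_threshold(x, y):
    """ERM over 1-D thresholds: pick the cut with minimum empirical error."""
    cuts = np.concatenate(([0.0], np.sort(x), [1.0]))
    errs = [np.mean((x >= c).astype(int) != y) for c in cuts]
    return cuts[int(np.argmin(errs))]

def err(player, cut, n=2000):
    """Estimate the classifier's error on one player's own distribution."""
    x, y = draw(player, n)
    return np.mean((x >= cut).astype(int) != y)

# Round structure: pool samples from the still-unsatisfied players, learn
# one classifier, and give it to every player for whom it already works.
unsatisfied, answer = set(range(K)), {}
for _ in range(int(np.ceil(np.log2(K))) + 1):
    if not unsatisfied:
        break
    xs, ys = zip(*(draw(i, 200) for i in unsatisfied))
    cut = learn_threshold(np.concatenate(xs), np.concatenate(ys))
    for i in list(unsatisfied):
        if err(i, cut) <= EPSILON:  # test on player i's own distribution
            answer[i] = cut
            unsatisfied.discard(i)

print(f"satisfied {len(answer)}/{K} players")
```

Pooling samples only from unsatisfied players and retiring the satisfied ones each round is what drives the O(log k) round bound in the original analysis; the paper's contribution is reducing what must be communicated to run such rounds.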
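The noise-robust variant hinges on how a hypothesis is tested when labels carry random classification noise. Under noise rate η < 1/2, a hypothesis with true error err(h) disagrees with noisy labels at rate (1 - 2η) * err(h) + η, so the acceptance test "err(h) ≤ ε" becomes the shifted test "noisy error ≤ η + (1 - 2η) * ε". The numeric check below illustrates this standard identity; it is an assumption that the paper's Algorithm 2 uses exactly this form of shifted test, and all data and constants are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
ETA, EPS = 0.2, 0.1  # noise rate and target error (illustrative values)

# Under random classification noise, each label flips independently with
# probability ETA, so the disagreement of h with the *noisy* labels obeys
#   err_eta(h) = (1 - 2*ETA) * err(h) + ETA.
x = rng.uniform(size=200_000)
clean = (x >= 0.5).astype(int)
noisy = np.where(rng.uniform(size=x.size) < ETA, 1 - clean, clean)

h = (x >= 0.55).astype(int)  # a hypothesis with small true error (~0.05)
true_err = np.mean(h != clean)
noisy_err = np.mean(h != noisy)
predicted = (1 - 2 * ETA) * true_err + ETA

print(f"true err={true_err:.4f}  noisy err={noisy_err:.4f}  "
      f"predicted noisy err={predicted:.4f}")
# Accepting h when err(h) <= EPS corresponds to the shifted test below.
print("accept under shifted test:", noisy_err <= ETA + (1 - 2 * ETA) * EPS)
```

The shifted test is what lets the noiseless round structure above carry over: only the per-player acceptance check changes, not the communication pattern.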