Replicable Learning of Large-Margin Halfspaces

Authors: Alkis Kalavasis, Amin Karbasi, Kasper Green Larsen, Grigoris Velegkas, Felix Zhou

ICML 2024

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | We provide efficient replicable algorithms for the problem of learning large-margin halfspaces. Our results improve upon the algorithms provided by Impagliazzo, Lei, Pitassi, and Sorrell (STOC, 2022). We design the first dimension-independent replicable algorithms for this task which run in polynomial time, are proper, and have strictly improved sample complexity compared to the one achieved by Impagliazzo et al. (2022) with respect to all the relevant parameters. Moreover, our first algorithm has sample complexity that is optimal with respect to the accuracy parameter Ξ΅. We also design an SGD-based replicable algorithm that, in some parameter regimes, achieves better sample and time complexity than our first algorithm. Departing from the requirement of polynomial-time algorithms, using the DP-to-Replicability reduction of Bun, Gaboardi, Hopkins, Impagliazzo, Lei, Pitassi, Sivakumar, and Sorrell (STOC, 2023), we show how to obtain a replicable algorithm for large-margin halfspaces with improved sample complexity with respect to the margin parameter Ο„, but running time doubly exponential in 1/τ² and worse sample complexity dependence on Ξ΅ than one of our previous algorithms. We then design an improved algorithm with better sample complexity than all three of our previous algorithms and running time exponential in 1/τ². [...] Table 1.1. Replicable Algorithms for Large-Margin Halfspaces. We denote by d the dimension, Ξ΅ the accuracy, ρ the replicability, and Ο„ the margin of the halfspace. We omit logarithmic factors in the sample complexity. See Appendix A for a more detailed comparison.
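The abstract combines two ingredients: margin-based SGD for learning a halfspace, and a rounding step driven by shared randomness so that independent executions output the same hypothesis. The sketch below is *not* the paper's algorithm; it is a minimal illustration of those two ideas under assumed names (`sgd_halfspace`, `round_shared` are hypothetical helpers), using hinge-loss SGD on synthetic Ο„-margin data and a randomly shifted grid for rounding.

```python
import numpy as np

def sgd_halfspace(X, y, epochs=20, lr=0.1, seed=0):
    """Hinge-loss SGD for a homogeneous halfspace sign(<w, x>)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        for i in rng.permutation(n):
            if y[i] * (X[i] @ w) < 1.0:  # margin violation under the hinge loss
                w += lr * y[i] * X[i]
    return w / max(np.linalg.norm(w), 1e-12)  # normalize to the unit sphere

def round_shared(w, grid=0.25, seed=0):
    """Snap w to a randomly shifted grid. Reusing the same seed across
    executions is the shared-randomness idea behind replicability: two
    nearby weight vectors usually land on the same grid point."""
    offset = np.random.default_rng(seed).uniform(0.0, grid, size=w.shape)
    return np.floor((w + offset) / grid) * grid - offset

# Synthetic tau-margin data: labels given by the first coordinate,
# points on the unit sphere with |x_1| >= tau = 0.2.
rng = np.random.default_rng(1)
X = rng.standard_normal((600, 5))
X /= np.linalg.norm(X, axis=1, keepdims=True)
X = X[np.abs(X[:, 0]) >= 0.2]
y = np.sign(X[:, 0])

w = sgd_halfspace(X, y)
accuracy = np.mean(np.sign(X @ w) == y)
w_round = round_shared(w)  # identical across runs that share the seed
```

Rounding to a grid of side 0.25 moves each coordinate by less than the grid width, so the rounded hypothesis stays close to the learned one; the grid width would have to be calibrated against the margin Ο„ for the rounded halfspace to retain accuracy.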
Researcher Affiliation | Collaboration | Alkis Kalavasis * 1, Amin Karbasi * 1 2, Kasper Green Larsen * 3, Grigoris Velegkas * 1, Felix Zhou * 1 [...] 1 Yale University, 2 Google Research, 3 Aarhus University. Correspondence to: Alkis Kalavasis <alvertos.kalavasis@yale.edu>, Grigoris Velegkas <grigoris.velegkas@yale.edu>, Felix Zhou <felix.zhou@yale.edu>.
Pseudocode | Yes | Algorithm 1 Replicable Large-Margin Halfspaces [...] Algorithm 2 Replicable Large-Margin Halfspaces [...] Algorithm 3 Replicable Large-Margin Halfspaces
Open Source Code | No | The paper contains no statements about making source code publicly available, nor any links to code repositories.
Open Datasets | No | The paper refers to drawing samples from a theoretical distribution π’Ÿ but does not specify, or provide access information for, any publicly available dataset.
Dataset Splits | No | The paper analyzes theoretical properties of its algorithms, such as sample complexity, but does not provide dataset splits (e.g., training/validation/test) for empirical reproduction.
Hardware Specification | No | The paper is theoretical and focuses on algorithm design and analysis; it does not describe any hardware used to run experiments.
Software Dependencies | No | The paper is theoretical and describes algorithms; it does not list any software dependencies with version numbers.
Experiment Setup | No | The paper presents theoretical algorithm design and analysis; it does not describe a concrete experimental setup with hyperparameters or system-level training settings.