On the Cryptographic Hardness of Learning Single Periodic Neurons

Authors: Min Jae Song, Ilias Zadik, Joan Bruna

NeurIPS 2021 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | Our main result is a proof, via a reduction from a fundamental problem in lattice-based cryptography called the Shortest Vector Problem (SVP), that such a learning task is hard for any polynomial-time algorithm, based on the widely believed cryptographic assumption that (approximate) SVP is computationally intractable even for quantum algorithms (see, e.g., [49, 43, 19, 3] and references therein). Our result therefore extends the hardness of learning such functions from restricted families of algorithms, such as gradient-based methods or statistical query (SQ) algorithms, to all polynomial-time algorithms by leveraging cryptographic assumptions.
Researcher Affiliation | Academia | Min Jae Song, Courant Institute, New York University (minjae.song@nyu.edu); Ilias Zadik, Department of Mathematics, Massachusetts Institute of Technology (izadik@mit.edu); Joan Bruna, Courant Institute and Center for Data Science, New York University (bruna@cims.nyu.edu)
Pseudocode | Yes | Algorithm 1: LLL-based algorithm for learning the single cosine neuron (a hedged illustration of the lattice-reduction step such an algorithm relies on is sketched after this table).
Open Source Code | No | The paper does not provide any explicit statement about open-sourcing code or a link to a code repository.
Open Datasets | No | The paper describes generating synthetic data for theoretical analysis (e.g., covariates drawn i.i.d. from N(0, I_d)), but does not use a publicly available or open dataset with access information (a sketch of this data-generation process appears after this table).
Dataset Splits | No | The paper focuses on theoretical analysis and sample complexity, and does not specify training/validation/test splits for empirical reproduction.
Hardware Specification | No | The paper discusses theoretical running times (e.g., a running time of O(d^6 log^3 M)) but does not specify any hardware used for experiments.
Software Dependencies | No | The paper mentions the LLL algorithm but does not specify any software names or versions (e.g., libraries, frameworks, or specific implementations with version numbers) used for its theoretical analysis or potential implementation.
Experiment Setup | No | The paper discusses theoretical conditions and sample sizes (e.g., a noise-level condition of the form β ≤ exp(-(d log d)^3), and d + 1 samples) for its algorithms and hardness results, but these are not hyperparameter settings or experimental setup details for empirical reproduction.
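
The data-generating process referenced in the Open Datasets row is simple enough to sketch. The snippet below is a minimal, hypothetical reconstruction in NumPy of a single cosine neuron over isotropic Gaussian inputs, y = cos(2π γ ⟨w, x⟩) + β ξ with x ~ N(0, I_d); the frequency gamma, noise level beta, and the hidden unit-norm direction w_star are illustrative assumptions, not values taken from the paper.

    import numpy as np

    def sample_cosine_neuron(n, d, gamma=2.0, beta=1e-3, rng=None):
        # x ~ N(0, I_d) i.i.d.; y = cos(2*pi*gamma*<w_star, x>) + beta * noise.
        # gamma, beta, and the hidden unit-norm w_star are illustrative choices.
        rng = np.random.default_rng(rng)
        w_star = rng.standard_normal(d)
        w_star /= np.linalg.norm(w_star)       # hidden unit-norm direction
        x = rng.standard_normal((n, d))        # covariates: isotropic Gaussian
        noise = rng.standard_normal(n)         # additive Gaussian noise
        y = np.cos(2 * np.pi * gamma * (x @ w_star)) + beta * noise
        return x, y, w_star

    if __name__ == "__main__":
        X, y, w = sample_cosine_neuron(n=8, d=5, rng=0)
        print(X.shape, y.shape)                # (8, 5) (8,)

With d + 1 such samples (the quantity mentioned in the Experiment Setup row), the paper's Algorithm 1 is stated to recover the hidden direction when the noise is sufficiently small.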
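The Pseudocode row points to Algorithm 1, an LLL-based procedure. The sketch below is not that algorithm; it only illustrates, under stated assumptions, the kind of lattice-reduction subroutine such procedures delegate to LLL: recovering a small planted integer relation among real numbers. The choice of the fpylll library, the scaling factor C, and the planted relation are all assumptions made for this illustration.

    import numpy as np
    from fpylll import IntegerMatrix, LLL   # assumed library choice for LLL reduction

    rng = np.random.default_rng(0)

    # Plant a small integer relation <m_true, a> = 0, with m_true unknown to the solver.
    m_true = np.array([3, -2, 1, -4])
    a = rng.standard_normal(4)
    a[-1] = -(m_true[:-1] @ a[:-1]) / m_true[-1]   # force the relation to hold exactly

    C = 10**10                        # scaling factor: embeds the reals into an integer lattice
    k = len(a)
    B = IntegerMatrix(k, k + 1)
    for i in range(k):
        B[i, i] = 1                   # identity block keeps the relation coefficients readable
        B[i, k] = int(round(C * a[i]))  # last column encodes C * <m, a>

    LLL.reduction(B)                  # in-place LLL reduction of the basis

    # The first reduced basis vector should read off +/- m_true followed by a small residue.
    print([B[0, j] for j in range(k)])  # expect [3, -2, 1, -4] up to a global sign

Running the script prints the planted coefficients up to sign; this kind of exact integer recovery from lightly perturbed real data is the flavor of step that makes LLL-based procedures effective only at very small noise levels, consistent with the noise condition quoted in the Experiment Setup row.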