Optimal bounds for $\ell_p$ sensitivity sampling via $\ell_2$ augmentation

Authors: Alexander Munteanu, Simon Omlor

ICML 2024 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | The motivation behind our work is to find a theoretical explanation for the success of sensitivity sampling and to determine whether such sampling schemes also achieve the optimal complexity or whether there are lower bounds preventing them from achieving optimality.
Researcher Affiliation | Academia | ¹Dortmund Data Science Center, Faculties of Statistics and Computer Science, TU Dortmund University, Dortmund, Germany; ²Faculty of Statistics, TU Dortmund University, Dortmund, Germany; ³Lamarr-Institute for Machine Learning and Artificial Intelligence, Dortmund, Germany. Correspondence to: Alexander Munteanu <alexander.munteanu@tu-dortmund.de>, Simon Omlor <simon.omlor@tu-dortmund.de>.
Pseudocode | No | The paper focuses on theoretical proofs and mathematical derivations; it does not include any pseudocode or algorithm blocks.
Open Source Code | No | The paper does not provide any statement or link indicating that source code for the described methodology is publicly available.
Open Datasets | No | This is a theoretical research paper that does not use datasets for empirical evaluation.
Dataset Splits | No | This is a theoretical research paper that does not conduct empirical experiments, and thus no dataset splits for training, validation, or testing are provided.
Hardware Specification | No | This is a theoretical research paper that does not conduct empirical experiments, and therefore no hardware specifications are mentioned.
Software Dependencies | No | This is a theoretical research paper that does not conduct empirical experiments, and therefore no specific software dependencies with version numbers are listed.
Experiment Setup | No | This is a theoretical research paper that does not conduct empirical experiments, and therefore no details about experimental setup or hyperparameters are provided.
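
Since the paper provides no pseudocode, the sketch below is only a minimal, generic illustration of sensitivity-style row sampling for $\ell_p$ regression, the technique the paper analyzes. It is not the authors' algorithm: the use of $\ell_2$ leverage scores as sampling scores, the function names (`l2_leverage_scores`, `row_sampling_sketch`), and the reweighting scheme are illustrative assumptions.

```python
import numpy as np

def l2_leverage_scores(A):
    """l_2 leverage scores tau_i = ||Q_i||_2^2 from a thin QR factorization of A."""
    Q, _ = np.linalg.qr(A)          # Q has orthonormal columns spanning range(A)
    return np.sum(Q ** 2, axis=1)

def row_sampling_sketch(A, b, m, p=1, seed=None):
    """Sample m rows of (A, b) i.i.d. with probability proportional to their
    l_2 leverage scores (used here as a simple stand-in for l_p sensitivity
    scores) and reweight so that the subsampled l_p loss is an unbiased
    estimate of ||Ax - b||_p^p for every fixed parameter vector x."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    scores = l2_leverage_scores(A)
    q = scores / scores.sum()              # sampling distribution over rows
    idx = rng.choice(n, size=m, p=q)       # i.i.d. draws, with replacement
    w = (1.0 / (m * q[idx])) ** (1.0 / p)  # weights: E[sum_j |w_j (a_j x - b_j)|^p] = ||Ax - b||_p^p
    return A[idx] * w[:, None], b[idx] * w
```

The sketch is meant only to convey that sampling rows proportionally to importance scores and reweighting them preserves the $\ell_p$ objective in expectation; the paper's contribution is the analysis of how many such samples suffice when $\ell_p$ sensitivities are augmented with $\ell_2$ terms, together with matching lower bounds.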