Impossibility Results for Grammar-Compressed Linear Algebra

Authors: Amir Abboud, Arturs Backurs, Karl Bringmann, Marvin Künnemann

NeurIPS 2020

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | We apply the tools of theoretical computer science, and the recently blossoming field of fine-grained complexity, to shed light on the mathematical foundations of Compressed Linear Algebra. We prove new hardness reductions showing cases where the time to compute the inner product must be large (under popular complexity assumptions).
Researcher Affiliation | Collaboration | Amir Abboud, IBM Almaden Research Center, amir.abboud@gmail.com; Arturs Backurs, Toyota Technological Institute at Chicago, backurs@ttic.edu; Karl Bringmann, Saarland University and Max Planck Institute for Informatics, Saarland Informatics Campus (SIC), bringmann@cs.uni-saarland.de; Marvin Künnemann, Max Planck Institute for Informatics, Saarland Informatics Campus (SIC), marvin@mpi-inf.mpg.de
Pseudocode | No | The paper does not include any pseudocode or clearly labeled algorithm blocks.
Open Source Code | No | This theoretical paper does not describe a new method for which source code would be released. It does not contain any statement about making code available.
Open Datasets | Yes | ISOLET [30]: 30.94 MB, 29.83 MB (0.96), 7.94 MB (0.26); US Census 1990 [30]: 342.26 MB, 341.97 MB (0.99), 51.91 MB (0.15)
Dataset Splits | No | The paper discusses theoretical impossibility results and refers to datasets only for illustrative compression rates of other methods. It does not provide specific training/validation/test dataset splits for any experimental setup performed by the authors.
Hardware Specification | No | The paper focuses on theoretical computer science and does not describe any experiments that would require specific hardware, thus no hardware specifications are mentioned.
Software Dependencies | No | The paper describes theoretical results and does not mention any specific software dependencies with version numbers for its own work.
Experiment Setup | No | The paper is theoretical and does not describe any experimental setup, hyperparameters, or training configurations.
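The central object of the paper, the inner product of two grammar-compressed vectors, can be illustrated with a minimal sketch. This is not code from the paper (which contains none); it shows the naive baseline the hardness results are measured against: decompress both straight-line programs (SLPs) and multiply entry-wise in time linear in the uncompressed length N. All names below are illustrative.

```python
def expand(slp, symbol, out):
    """Recursively expand a grammar symbol into its sequence of entries."""
    rule = slp[symbol]
    if isinstance(rule, (int, float)):   # terminal: a single vector entry
        out.append(rule)
    else:                                # nonterminal: concatenation of two symbols
        left, right = rule
        expand(slp, left, out)
        expand(slp, right, out)

def inner_product(slp_a, start_a, slp_b, start_b):
    """Naive baseline: fully decompress both vectors, then take the dot product."""
    a, b = [], []
    expand(slp_a, start_a, a)
    expand(slp_b, start_b, b)
    assert len(a) == len(b), "vectors must have equal uncompressed length"
    return sum(x * y for x, y in zip(a, b))

# Grammar for the length-4 vector [1, 0, 1, 0]:
#   T1 -> 1, T0 -> 0, P -> T1 T0, S -> P P
slp = {"T1": 1, "T0": 0, "P": ("T1", "T0"), "S": ("P", "P")}
print(inner_product(slp, "S", slp, "S"))  # 1*1 + 0*0 + 1*1 + 0*0 = 2
```

The paper's impossibility results show that, under popular fine-grained complexity assumptions, this decompress-then-multiply baseline cannot be substantially beaten in certain parameter regimes, i.e. the compression cannot always be exploited for faster linear algebra.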