The Fast Johnson-Lindenstrauss Transform Is Even Faster
Authors: Ora Nova Fandina, Mikael Møller Høgsgaard, Kasper Green Larsen
ICML 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | In this work, we give a surprising new analysis of the Fast JL transform, showing that the k ln² n term in the embedding time can be improved to (k ln² n)/α for an α = Ω(min{ε⁻¹ ln(1/ε), ln n}). The improvement follows by using an even sparser matrix. We complement our improved analysis with a lower bound showing that our new analysis is in fact tight. |
| Researcher Affiliation | Academia | Department of Computer Science, Aarhus University, Aarhus, Denmark. |
| Pseudocode | No | No pseudocode or algorithm blocks were found. The paper primarily focuses on theoretical analysis, proofs, and mathematical derivations. |
| Open Source Code | No | The paper does not contain any statements about making source code open-source or providing links to a code repository. |
| Open Datasets | No | The paper is theoretical and does not describe experiments involving datasets. Therefore, no information about publicly available training datasets is provided. |
| Dataset Splits | No | The paper is theoretical and does not describe empirical experiments or dataset usage. Thus, no information regarding training, validation, or test splits is provided. |
| Hardware Specification | No | The paper is theoretical and does not describe empirical experiments. Therefore, no hardware specifications for running experiments are mentioned. |
| Software Dependencies | No | The paper is theoretical and focuses on mathematical analysis. It does not mention any specific software dependencies or versions required to replicate its findings. |
| Experiment Setup | No | The paper is theoretical and does not describe any empirical experiments or their setup. Therefore, no experimental setup details like hyperparameters or training settings are provided. |
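Although the paper itself ships no code, the transform it analyzes has a standard shape: embed x as P·H·D·x, where D flips signs, H is a Walsh-Hadamard transform computed in O(d log d), and P is a sparse random projection. The sketch below is a minimal generic illustration of this pipeline, not the paper's exact (sparser) construction; the function names, the `sparsity` parameter, and the final normalization are our own assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def fwht(x):
    """Fast Walsh-Hadamard transform of a 1-D array whose length is a power of 2.

    Normalized by 1/sqrt(n) so the transform is orthonormal (norm-preserving).
    """
    x = x.astype(float).copy()
    n = x.shape[0]
    h = 1
    while h < n:
        for i in range(0, n, 2 * h):
            a = x[i:i + h].copy()
            b = x[i + h:i + 2 * h].copy()
            x[i:i + h] = a + b
            x[i + h:i + 2 * h] = a - b
        h *= 2
    return x / np.sqrt(n)

def fast_jl(x, k, sparsity):
    """Generic Fast JL sketch: y = P H D x (d must be a power of 2).

    `sparsity` is the number of sampled coordinates per output row;
    the scaling below makes E[||y||^2] = ||x||^2 (our choice, not the paper's).
    """
    d = x.shape[0]
    signs = rng.choice([-1.0, 1.0], size=d)   # D: random sign flips
    z = fwht(signs * x)                       # H D x in O(d log d)
    y = np.zeros(k)
    for i in range(k):                        # P: sparse signed sampling
        idx = rng.integers(0, d, size=sparsity)
        coeffs = rng.choice([-1.0, 1.0], size=sparsity)
        y[i] = coeffs @ z[idx]
    return y / np.sqrt(k * sparsity / d)
```

The paper's speedup comes from showing that `sparsity` can be taken smaller than previously believed, by a factor α = Ω(min{ε⁻¹ ln(1/ε), ln n}), while still preserving norms up to (1 ± ε).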