Recursive Sketches for Modular Deep Learning
Authors: Badih Ghazi, Rina Panigrahy, Joshua Wang
ICML 2019
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | "Most of our analysis revolves around proving and utilizing properties of random matrices."; "Proving procedure correctness is done via probabilistic inequalities and analysis tools, including the Khintchine inequality and the Hanson-Wright inequality."; "Our theoretical results will primarily focus on a single path from an object θ to the output object."; "We prove that (under certain assumptions) augmenting the teacher network with our sketches can theoretically make it easier to do so." |
| Researcher Affiliation | Industry | "Google Research, Mountain View, CA, USA. Correspondence to: Rina Panigrahy <rinap@google.com>." |
| Pseudocode | Yes | "Algorithm 1 Enc(j, b_m, b_s)" and "Algorithm 2 D(b, q, d_sketch)" (a toy illustration of recursive sketching appears below the table) |
| Open Source Code | No | The paper does not provide any statement or link indicating that the source code for the described methodology is publicly available. |
| Open Datasets | No | The paper is theoretical and does not describe the use of any datasets for training or evaluation. |
| Dataset Splits | No | The paper is theoretical and does not describe experiments that would involve dataset splits. |
| Hardware Specification | No | The paper is theoretical and does not describe running experiments, so no hardware specifications are mentioned. |
| Software Dependencies | No | The paper is theoretical and does not mention specific software components or libraries with version numbers. |
| Experiment Setup | No | The paper is theoretical and does not describe an experimental setup with specific hyperparameters or training configurations. |
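
The Enc and D procedures in the Pseudocode row are only named in the extracted excerpt, not reproduced here. As rough intuition only, the block below is a minimal toy illustration of a generic recursive random-projection sketch: the dimensions, Gaussian matrices, two-child layout, and the cosine-similarity check are assumptions made for illustration, not the paper's Algorithm 1 or Algorithm 2.

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_sketch = 64, 256  # illustrative dimensions (assumptions, not from the paper)

def random_matrix(rows, cols):
    """Dense Gaussian matrix scaled so it is approximately norm-preserving."""
    return rng.normal(scale=1.0 / np.sqrt(rows), size=(rows, cols))

# Fixed random matrices: one for a module's own output, one per child slot.
R_out = random_matrix(d_sketch, d_in)
R_child = [random_matrix(d_sketch, d_sketch) for _ in range(2)]

def sketch(output, child_sketches=()):
    """Fold a module's output and its children's sketches into one fixed-size vector."""
    s = R_out @ output
    for slot, child in enumerate(child_sketches):
        s = s + R_child[slot] @ child
    return s / np.sqrt(1 + len(child_sketches))

# Two leaf modules feed one parent module.
x_a, x_b, x_parent = (rng.normal(size=d_in) for _ in range(3))
s_a, s_b = sketch(x_a), sketch(x_b)
s_parent = sketch(x_parent, [s_a, s_b])

# Random projections are near-isometries, so projecting the parent sketch back
# through child slot 0 yields a vector that correlates strongly with s_a.
recovered = R_child[0].T @ s_parent
cos = recovered @ s_a / (np.linalg.norm(recovered) * np.linalg.norm(s_a))
print(f"cosine similarity with child A's sketch: {cos:.2f}")  # clearly above 0
```

In this toy scheme the child's sketch remains approximately recoverable from the parent's sketch (here the cosine similarity lands roughly around 1/√3); this is the kind of recoverability guarantee the paper establishes formally via tools such as the Khintchine and Hanson-Wright inequalities.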