On Margin Maximization in Linear and ReLU Networks
Authors: Gal Vardi, Ohad Shamir, Nati Srebro
NeurIPS 2022
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | The paper is purely theoretical. Its checklist states: 'This paper is purely theoretical in nature, and we do not see any potential negative societal impacts that should be discussed,' and answers [Yes] to including complete proofs of all theoretical results. |
| Researcher Affiliation | Academia | Gal Vardi (TTI-Chicago and Hebrew University, galvardi@ttic.edu); Ohad Shamir (Weizmann Institute of Science, ohad.shamir@weizmann.ac.il); Nathan Srebro (TTI-Chicago, nati@ttic.edu) |
| Pseudocode | No | The paper does not contain any structured pseudocode or algorithm blocks. |
| Open Source Code | No | The paper is theoretical and explicitly answers 'N/A' to the checklist questions about providing code for experimental results. There is no mention of releasing source code for the methodology described. |
| Open Datasets | No | The paper is purely theoretical and does not describe any empirical studies involving training on datasets. |
| Dataset Splits | No | The paper is theoretical and does not involve empirical experiments, thus no dataset split information for validation is provided. |
| Hardware Specification | No | The paper is purely theoretical and does not describe any hardware used for experiments. |
| Software Dependencies | No | The paper is purely theoretical and does not mention specific ancillary software or their version numbers. |
| Experiment Setup | No | The paper is purely theoretical and does not describe any experimental setup details such as hyperparameters or training configurations. |