On the Power and Limits of Distance-Based Learning
Authors: Periklis Papakonstantinou, Jia Xu, Guang Yang
ICML 2016
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | This work is theoretical; it addresses general questions about multi-class and low-distortion learning. |
| Researcher Affiliation | Academia | Periklis A. Papakonstantinou (PERIKLIS.RESEARCH@GMAIL.COM), MSIS, Business School, Rutgers University, Piscataway, NJ 08853, USA; Jia Xu (JIA.XU@HUNTER.CUNY.EDU), Department of Computer Science, Hunter College, CUNY, 695 Park Ave, New York, NY 10065, USA, and The Graduate Center, CUNY, 365 5th Ave, New York, NY 10016, USA; Guang Yang (GUANG.RESEARCH@GMAIL.COM), Institute of Computing Technology, Chinese Academy of Sciences, Beijing, 100190, China, and Aarhus University, Aabogade 34, DK-8200 Aarhus, Denmark |
| Pseudocode | No | No pseudocode or algorithm blocks are present in the paper. |
| Open Source Code | No | The paper provides no statement or link indicating that open-source code for the described methodology has been released. |
| Open Datasets | No | This work is theoretical and does not use or describe a dataset in the context of training or evaluating models. |
| Dataset Splits | No | This is a theoretical paper and does not involve empirical experiments with dataset splits. |
| Hardware Specification | No | The paper is theoretical and does not describe any hardware specifications for running experiments. |
| Software Dependencies | No | The paper is theoretical and does not list any software dependencies with version numbers. |
| Experiment Setup | No | The paper is theoretical and does not describe an experimental setup with hyperparameters or training configurations. |