Margins, Kernels and Non-linear Smoothed Perceptrons
Authors: Aaditya Ramdas, Javier Peña
ICML 2014 | Conference PDF | Archive PDF | Plain Text | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | We derive an accelerated smoothed algorithm with a convergence rate of √(log n)/ρ given n separable points, which is strikingly similar to the classical kernelized Perceptron algorithm whose rate is 1/ρ². When no such classifier exists, we prove a version of Gordan's separation theorem for RKHSs, and give a reinterpretation of negative margins. (The two iteration bounds are restated after this table.) |
| Researcher Affiliation | Academia | Aaditya Ramdas (ARAMDAS@CS.CMU.EDU), Carnegie Mellon University, 5000 Forbes Ave, Pittsburgh, PA 15213 USA; Javier Peña (JFP@ANDREW.CMU.EDU), Carnegie Mellon University, 5000 Forbes Ave, Pittsburgh, PA 15213 USA |
| Pseudocode | Yes | Algorithm 1 Perceptron ... Algorithm 2 Normalized Perceptron ... Algorithm 3 Normalized Kernel Perceptron (NKP) ... Algorithm 4 Smoothed Normalized Kernel Perceptron ... Algorithm 5 Normalized Von-Neumann (NVN) ... Algorithm 6 Smoothed Normalized Kernel Perceptron-Von-Neumann (SNKPVN(q, δ)) ... Algorithm 7 Iterated Smoothed Normalized Kernel Perceptron-Von-Neumann (ISNKPVN(γ, ϵ)) (A minimal sketch of the classical kernel perceptron appears after this table.) |
| Open Source Code | No | The paper does not contain any explicit statements about releasing source code or links to a code repository for the described methodology. |
| Open Datasets | No | The paper focuses on theoretical contributions and does not mention the use of any datasets for training or evaluation. Therefore, no information about public dataset availability is provided. |
| Dataset Splits | No | The paper is theoretical and does not describe any dataset splits (training, validation, test) for reproducibility. |
| Hardware Specification | No | The paper is theoretical and does not describe any experimental setup or the specific hardware used for computations. |
| Software Dependencies | No | The paper is theoretical and does not mention any specific software dependencies with version numbers required for reproducibility. |
| Experiment Setup | No | The paper is theoretical and does not describe an experimental setup, hyperparameters, or training configurations. |
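For context on the Research Type row, the two iteration bounds quoted there can be restated compactly as follows. Here ρ denotes the margin of the n separable points and constants are omitted; this is only a restatement of the rates quoted above, not a derivation from the paper.

```latex
% Classical kernelized Perceptron vs. the accelerated smoothed algorithm
T_{\text{Perceptron}} \;=\; O\!\left(\frac{1}{\rho^{2}}\right),
\qquad
T_{\text{smoothed}} \;=\; O\!\left(\frac{\sqrt{\log n}}{\rho}\right).
```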
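The Pseudocode row lists the paper's algorithm boxes but cannot reproduce them here. Below is a minimal, illustrative Python sketch of the classical kernel perceptron that Algorithms 1–3 build on; it is not the paper's smoothed method, and the names `kernel_perceptron`, `K`, `y`, and `max_iter` are ours.

```python
import numpy as np

def kernel_perceptron(K, y, max_iter=1000):
    """Classical kernel perceptron on n points.

    K : (n, n) Gram matrix, K[i, j] = k(x_i, x_j)
    y : (n,) labels in {-1, +1}

    The implicit classifier is f(x) = sum_i alpha_i * y_i * k(x_i, x);
    the classical analysis bounds the number of mistakes by O(1/rho^2)
    when the data are separable with margin rho.
    """
    n = K.shape[0]
    alpha = np.zeros(n)
    for _ in range(max_iter):
        mistakes = 0
        for i in range(n):
            # Signed prediction margin of point i under the current dual weights.
            margin = y[i] * np.dot(alpha * y, K[:, i])
            if margin <= 0:          # misclassified (or on the boundary)
                alpha[i] += 1.0      # classical additive dual update
                mistakes += 1
        if mistakes == 0:            # a pass with no mistakes: data separated
            break
    return alpha
```

For example, with a linear kernel `K = X @ X.T` on linearly separable data, the returned coefficients correspond to the weight vector `w = (alpha * y) @ X`. As we read the paper, its Normalized and Smoothed variants additionally keep the dual weights normalized and apply smoothing to obtain the faster √(log n)/ρ rate; this sketch does not attempt either.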