Optimality Implies Kernel Sum Classifiers are Statistically Efficient
Authors: Raphael Meyer, Jean Honorio
ICML 2019 | Conference PDF | Archive PDF
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We show some experimental results that verify our core theorem, i.e., Theorem 3. Our experiment uses 8 fixed kernels from several kernel families. ... We generate n = 300 samples in $\mathbb{R}^{50}$. ... After solving each SVM problem, we keep track of the value $\alpha_{\Sigma,m}^\top K_{\Sigma,m} \alpha_{\Sigma,m}$. We then plot this value against the two bounds provided by Theorem 3. |
| Researcher Affiliation | Academia | Raphael A. Meyer, Jean Honorio; Department of Computer Science, Purdue University, Indiana, USA. |
| Pseudocode | No | The paper provides mathematical definitions, theorems, and proofs, but does not include any explicitly labeled pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not contain any explicit statements about making the source code for their methodology publicly available, nor does it provide links to a code repository. |
| Open Datasets | No | The paper states 'All our data is generated from a mixture of 4 Gaussians. We generate n = 300 samples in $\mathbb{R}^{50}$.', indicating synthetic data generation rather than the use of a publicly available dataset with concrete access information or formal citation. |
| Dataset Splits | No | The paper describes an experiment with 'n = 300 samples' but does not specify any training, validation, or test dataset splits or cross-validation setup. |
| Hardware Specification | No | The paper describes the data generation and experimental procedure but does not provide any specific details about the hardware (e.g., CPU, GPU models) used to run the experiments. |
| Software Dependencies | No | The paper mentions methods like Kernel SVM but does not specify any software names with version numbers (e.g., Python, PyTorch, TensorFlow versions, or solver versions) used for implementation. |
| Experiment Setup | Yes | Our experiment uses 8 fixed kernels from several kernel families. We have 5 radial basis kernels, 1 linear kernel, 1 polynomial kernel, and 1 cosine kernel. ... We generate n = 300 samples in $\mathbb{R}^{50}$. For each of the 8 base kernels, we solve the Dual Kernel SVM problem, and empirically verify that $\alpha_t^\top K_t \alpha_t \leq 320 = B^2$. (A reproduction sketch follows this table.) |
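
Since no source code is released, the following is a minimal sketch (Python with NumPy and scikit-learn) of the experimental pipeline summarized above: n = 300 samples from a mixture of 4 Gaussians in $\mathbb{R}^{50}$, 8 base kernels (5 RBF, 1 linear, 1 polynomial, 1 cosine), a dual kernel SVM solved per kernel, and the quadratic form $\alpha_t^\top K_t \alpha_t$ tracked. All unspecified quantities (component means and covariances, label assignment, RBF bandwidths, polynomial degree, and the SVM constant C) are assumptions, not values from the paper.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics.pairwise import (
    rbf_kernel, linear_kernel, polynomial_kernel, cosine_similarity
)

rng = np.random.default_rng(0)
n, d = 300, 50

# Mixture of 4 Gaussians in R^50 (component means, covariances, and the
# label assignment below are assumptions; the paper does not specify them).
means = rng.normal(scale=2.0, size=(4, d))
comp = rng.integers(0, 4, size=n)              # which Gaussian each sample comes from
X = means[comp] + rng.normal(size=(n, d))      # identity covariance (assumed)
y = np.where(comp < 2, 1, -1)                  # two components per class (assumed)

# 8 base kernels: 5 RBF, 1 linear, 1 polynomial, 1 cosine (as described above).
kernels = [lambda A, B, g=g: rbf_kernel(A, B, gamma=g)
           for g in (1e-3, 1e-2, 1e-1, 1.0, 10.0)]   # bandwidths assumed
kernels += [
    linear_kernel,
    lambda A, B: polynomial_kernel(A, B, degree=3),  # degree assumed
    cosine_similarity,
]

for t, kern in enumerate(kernels, start=1):
    K = kern(X, X)
    clf = SVC(kernel="precomputed", C=1.0)           # C assumed
    clf.fit(K, y)
    # scikit-learn stores y_i * alpha_i for the support vectors in dual_coef_,
    # so this is the dual quadratic form (y*alpha)^T K (y*alpha) restricted
    # to the support vectors (all other alpha_i are zero).
    a = clf.dual_coef_.ravel()
    sv = clf.support_
    quad = a @ K[np.ix_(sv, sv)] @ a
    print(f"kernel {t}: alpha^T K alpha = {quad:.3f}")
```

Note that scikit-learn's `dual_coef_` contains $y_i \alpha_i$ rather than $\alpha_i$, so the printed quantity is the standard dual quadratic form $(y \circ \alpha)^\top K (y \circ \alpha)$; depending on the paper's convention for $\alpha$, this may differ from their $\alpha_t^\top K_t \alpha_t$ only in whether the labels are folded into the coefficients.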