Generalization Guarantees for Sparse Kernel Approximation with Entropic Optimal Features
Authors: Liang Ding, Rui Tuo, Shahin Shahrampour
ICML 2020 | Conference PDF | Archive PDF | Plain Text | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Our numerical experiments on benchmark datasets verify the superiority of EOF over the state-of-the-art in kernel approximation. |
| Researcher Affiliation | Academia | The authors are with the Wm Michael Barnes '64 Department of Industrial and Systems Engineering, Texas A&M University, College Station, TX, USA. |
| Pseudocode | Yes | Algorithm 1 Entropic Optimal Features (EOF) |
| Open Source Code | No | The paper does not explicitly state that source code for the described methodology is released, nor does it provide a link to a code repository. |
| Open Datasets | Yes | Benchmark Algorithm: We now compare EOF with the following benchmark algorithms on several datasets from the UCI Machine Learning Repository: |
| Dataset Splits | Yes | In Table 1, we report the number of training samples Ntrain and test samples Ntest used for each dataset. |
| Hardware Specification | Yes | The run time is obtained on a MacBook Pro with a 4-core, 3.3 GHz Intel Core i5 CPU and 8 GB of RAM (2133 MHz). |
| Software Dependencies | No | The paper mentions 'Matlab' in the context of efficient matrix operations but does not provide specific version numbers for any software dependencies. |
| Experiment Setup | No | The paper describes data standardization and selection of feature sets, but does not provide specific hyperparameter values for model training, such as learning rates, batch sizes, or optimizer settings. |
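The "Pseudocode" row above points to the paper's Algorithm 1 (EOF), which is not reproduced in this report. As a purely illustrative stand-in, the sketch below uses exponentiated gradient, i.e. entropic mirror descent on the probability simplex, to weight candidate features when approximating a Gram matrix. This is a standard technique, not the authors' Algorithm 1, and every size, step size, and objective in it is an assumption made for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy setup: approximate a target Gram matrix K with a weighted sum of
# rank-one candidate features, K ≈ sum_j p_j * f_j f_j^T.
# N, M, and the candidate features are arbitrary choices for this sketch.
N, M = 60, 40                     # samples, candidate features
F = rng.normal(size=(N, M))       # column j is candidate feature vector f_j
X = rng.normal(size=(N, 2))
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-0.5 * sq)             # Gaussian-kernel Gram matrix to approximate

def loss_grad(p):
    """Gradient of ||K_hat(p) - K||_F^2 with respect to the weights p."""
    K_hat = (F * p) @ F.T         # equals sum_j p_j f_j f_j^T
    R = K_hat - K
    # d/dp_j ||K_hat - K||_F^2 = 2 * f_j^T R f_j
    return np.einsum('nj,nm,mj->j', F, 2.0 * R, F)

# Exponentiated-gradient (entropic mirror descent) on the simplex:
# the multiplicative update keeps p nonnegative and summing to one.
p = np.full(M, 1.0 / M)
eta = 1e-3                        # step size: a hypothetical choice
for _ in range(200):
    p = p * np.exp(-eta * loss_grad(p))
    p /= p.sum()

print("weights above 1e-3:", int((p > 1e-3).sum()), "of", M)
```

Because the update is multiplicative, poorly performing features decay exponentially and many weights tend toward negligible values, which loosely mirrors the sparse, entropy-driven feature weighting the paper's title refers to; the paper's own guarantees and update rules should be taken from Algorithm 1 itself.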
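Similarly, the "Dataset Splits" and "Experiment Setup" rows describe a fixed Ntrain/Ntest split and data standardization without reported hyperparameters. The sketch below shows a minimal end-to-end evaluation of that shape, with generic random Fourier features standing in for the paper's learned features; the synthetic dataset, split sizes, lengthscale, and ridge penalty are all hypothetical choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for a UCI regression dataset with a fixed split.
N_train, N_test, d = 500, 200, 8
X = rng.normal(size=(N_train + N_test, d))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=N_train + N_test)
X_tr, X_te = X[:N_train], X[N_train:]
y_tr, y_te = y[:N_train], y[N_train:]

# Standardize with training-set statistics only, so no test data leaks in.
mu, sd = X_tr.mean(axis=0), X_tr.std(axis=0)
X_tr, X_te = (X_tr - mu) / sd, (X_te - mu) / sd

# Random Fourier features approximating a Gaussian kernel with D features;
# any feature construction (the paper's EOF included) would slot in here.
D, ell = 100, 1.0                 # feature count and lengthscale: assumptions
W = rng.normal(scale=1.0 / ell, size=(d, D))
b = rng.uniform(0.0, 2.0 * np.pi, size=D)

def phi(Z):
    return np.sqrt(2.0 / D) * np.cos(Z @ W + b)

# Ridge regression on the features (the penalty lam is a hypothetical choice).
lam = 1e-3
P = phi(X_tr)
w = np.linalg.solve(P.T @ P + lam * np.eye(D), P.T @ y_tr)
rmse = np.sqrt(np.mean((phi(X_te) @ w - y_te) ** 2))
print(f"test RMSE with D={D} features: {rmse:.3f}")
```

Computing the standardization statistics on the training split alone is the detail that makes per-dataset test errors meaningful when, as in the paper, Ntrain and Ntest are fixed in advance.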