Inverse M-Kernels for Linear Universal Approximators of Non-Negative Functions
Authors: Hideaki Kim
NeurIPS 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We confirm the effectiveness of our results by experiments on the problems of non-negativity-constrained regression, density estimation, and intensity estimation. |
| Researcher Affiliation | Industry | Hideaki Kim, NTT Corporation, hideaki.kim@ntt.com |
| Pseudocode | No | The paper does not contain any structured pseudocode or algorithm blocks. |
| Open Source Code | Yes | Code and data to reproduce the results are available at https://github.com/HidKim/IM-Kernel. |
| Open Datasets | Yes | Code and data to reproduce the results are available at https://github.com/HidKim/IM-Kernel. |
| Dataset Splits | Yes | The hyper-parameters for each model were optimized through three-fold cross validation on a grid |
| Hardware Specification | Yes | A MacBook Pro with a 12-core CPU (Apple M2 Max) was used. |
| Software Dependencies | Yes | We implemented all compared models by using Python-3.10.8 (SciPy-1.11, fnnls-1.0 (MIT License)). |
| Experiment Setup | Yes | The hyper-parameters for each model were optimized through three-fold cross validation on a grid: for NCM, QNM, and IMK, the grid is (τ, r) ∈ C × C for C = {0.1, 0.2, 0.5, 1, 2, 5, 10}; for SNF, the number of components for the Gaussian mixture measure dµ(·) was selected from {1, 2, 3}. (A minimal sketch of this grid search follows the table.) |
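
The grid search described in the Experiment Setup row can be illustrated with a short sketch. The following is a hedged, hypothetical reconstruction: a plain Gaussian kernel ridge regressor stands in for the paper's actual models (NCM, QNM, and IMK are not reimplemented here), and `gauss_kernel`, `cv_error`, and the synthetic data are illustrative assumptions, not code from the authors' repository. Only the grid C = {0.1, 0.2, 0.5, 1, 2, 5, 10} and the three-fold cross-validation protocol are taken from the paper's setup.

```python
# Sketch of selecting (tau, r) over C x C with three-fold cross-validation.
# The kernel ridge regressor below is a stand-in model, NOT the paper's IMK.
import itertools
import numpy as np

C = [0.1, 0.2, 0.5, 1, 2, 5, 10]  # grid from the paper's experiment setup

def gauss_kernel(A, B, tau):
    """Gaussian kernel matrix with length-scale tau (illustrative choice)."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * tau ** 2))

def cv_error(X, y, tau, r, n_folds=3):
    """Mean squared validation error over n_folds cross-validation folds."""
    folds = np.array_split(np.arange(len(X)), n_folds)
    errs = []
    for k in range(n_folds):
        val = folds[k]
        trn = np.concatenate([folds[j] for j in range(n_folds) if j != k])
        # Fit kernel ridge regression with regularization weight r.
        K = gauss_kernel(X[trn], X[trn], tau)
        alpha = np.linalg.solve(K + r * np.eye(len(trn)), y[trn])
        pred = gauss_kernel(X[val], X[trn], tau) @ alpha
        errs.append(np.mean((pred - y[val]) ** 2))
    return float(np.mean(errs))

# Synthetic non-negative regression data (hypothetical, for illustration).
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(60, 1))
y = np.maximum(np.sin(3 * X[:, 0]), 0) + 0.05 * rng.standard_normal(60)

# Exhaustive search over (tau, r) in C x C, keeping the lowest CV error.
best = min(itertools.product(C, C), key=lambda p: cv_error(X, y, *p))
print("selected (tau, r):", best)
```

For SNF, the analogous loop would instead range over the number of Gaussian mixture components {1, 2, 3}, with the same three-fold cross-validation criterion.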