Positively Weighted Kernel Quadrature via Subsampling
Authors: Satoshi Hayakawa, Harald Oberhauser, Terry Lyons
NeurIPS 2022
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Section 3 provides numerical experiments on common benchmarks. In addition to our theoretical results and the benefits resulting from convex weights, our experiments indicate that this construction can compete with the optimal bounds in well-known examples. |
| Researcher Affiliation | Academia | Mathematical Institute, University of Oxford |
| Pseudocode | Yes | Algorithm 1 Kernel Quadrature with Convex Weights via Recombination KQuad |
| Open Source Code | Yes | Code: https://github.com/satoshi-hayakawa/kernel-quadrature |
| Open Datasets | Yes | We used two datasets from the UCI Machine Learning Repository (https://archive.ics.uci.edu/ml/datasets/). The first is the 3D Road Network Data Set [31]. The second is the Combined Cycle Power Plant Data Set [32, 63]. |
| Dataset Splits | No | The paper describes sampling strategies (e.g., N points from µ, Z samples) and mentions determining hyperparameters using random subsets, but it does not specify explicit train/validation/test dataset splits with percentages or counts for reproducibility. |
| Hardware Specification | Yes | All experiments were run on a MacBook Pro, CPU: 2.4 GHz Quad-Core Intel Core i5, RAM: 8 GB 2133 MHz LPDDR3. |
| Software Dependencies | Yes | We used the optimizer Gurobi, Version 9.1.2 |
| Experiment Setup | Yes | We carried out the experiment for (d, r) = (1, 1), (1, 3), (2, 1), (3, 3). For each (d, r), we compared the following algorithms for n-point quadrature rules with n ∈ {4, 8, 16, 32, 64, 128}. We determined λ with the median heuristic by using a random subset of X with size 10000 and used the same X and λ throughout the experiment. |
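The pseudocode entry refers to Algorithm 1, which obtains convex (nonnegative, sum-to-one) weights via recombination: starting from N uniformly weighted samples, it reduces to a small convex-weighted rule that exactly matches the empirical means of a set of test functions. The following is a minimal sketch of that reduction step using a generic LP solver; the function name `recombination_sketch` is illustrative, and the paper's actual algorithm uses a dedicated recombination routine rather than `scipy.optimize.linprog`.

```python
import numpy as np
from scipy.optimize import linprog


def recombination_sketch(X, test_fns):
    """Find convex weights w >= 0, sum(w) = 1, over the points X that
    reproduce the empirical means of each function in test_fns.

    A vertex (basic feasible) solution of this LP has at most rank(A)
    nonzero weights, which is what makes recombination a reduction:
    most of the N input points receive weight zero.
    """
    N = len(X)
    # One constraint row per test function, plus a row of ones so that
    # the weights sum to 1 (making the rule a probability measure).
    A = np.vstack([[f(x) for x in X] for f in test_fns] + [np.ones(N)])
    b = A @ (np.ones(N) / N)  # empirical moments under uniform weights
    # Zero objective: we only need a feasible vertex of {w >= 0, A w = b}.
    res = linprog(c=np.zeros(N), A_eq=A, b_eq=b,
                  bounds=(0, None), method="highs")
    return res.x
```

For example, with `test_fns = [lambda x: x, lambda x: x**2]` the returned weights reproduce the sample mean and second moment of `X` exactly while remaining nonnegative.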
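The experiment-setup entry says λ was determined "with the median heuristic by using a random subset of X with size 10000". The median heuristic sets the kernel bandwidth to the median pairwise distance of (a subsample of) the data. A minimal sketch, assuming a Euclidean kernel bandwidth and with the function name, subset size default, and seed chosen here for illustration:

```python
import numpy as np
from scipy.spatial.distance import pdist


def median_heuristic(X, subset_size=10000, seed=0):
    """Median heuristic for a kernel bandwidth: the median of the
    pairwise Euclidean distances over a random subset of X.

    Subsampling keeps the cost at O(m^2) pairs for subset size m
    instead of O(N^2) over the full dataset.
    """
    X = np.asarray(X)
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=min(subset_size, len(X)), replace=False)
    # pdist returns the condensed vector of all pairwise distances.
    return float(np.median(pdist(X[idx])))
```

The resulting value would then be held fixed across all runs, matching the paper's statement that the same X and λ were used throughout the experiment.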