GloptiNets: Scalable Non-Convex Optimization with Certificates
Authors: Gaspard Beugnot, Julien Mairal, Alessandro Rudi
NeurIPS 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | 4 Experiments |
| Researcher Affiliation | Academia | Gaspard Beugnot gaspard.beugnot@inria.fr Inria, École normale supérieure, CNRS, PSL Research University, 75005 Paris, France; Julien Mairal julien.mairal@inria.fr Univ. Grenoble Alpes, Inria, CNRS, Grenoble INP, LJK, 38000 Grenoble, France; Alessandro Rudi alessandro.rudi@inria.fr Inria, École normale supérieure, CNRS, PSL Research University, 75005 Paris, France |
| Pseudocode | Yes | Algorithm 1: GloptiNets |
| Open Source Code | Yes | The code to reproduce these experiments is available at github.com/gaspardbb/GloptiNets.jl |
| Open Datasets | No | The paper describes generating synthetic data like 'random trigonometric polynomials' and 'kernel mixtures' but does not provide access information (link, DOI, citation) to a specific publicly available dataset. |
| Dataset Splits | No | The paper does not provide specific details on training, validation, or test dataset splits or a methodology for them. |
| Hardware Specification | Yes | GloptiNets was run on NVIDIA V100 GPUs for the interpolation part, and on an Intel Xeon CPU E5-2698 v4 @ 2.20GHz for computing the certificate. TSSOS was run on an Apple M1 chip with the Mosek solver. |
| Software Dependencies | No | The paper mentions the Mosek solver for TSSOS and implies Julia via the GitHub repository name (.jl), but does not provide version numbers for the key software components or libraries used by GloptiNets. |
| Experiment Setup | Yes | The parameters tuned were the type of optimizer, the learning-rate decay, and the regularization on the Frobenius norm of G. All results for GloptiNets are obtained with confidence 1 − δ = 1 − e⁻⁴ ≈ 98%. The number of frequencies sampled to compute the certificate is 1.6 × 10⁷. |
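The setup row states that the certificate holds with confidence 1 − δ and is computed from a large number of sampled frequencies. As a generic illustration of how sampling yields a bound that holds with probability at least 1 − δ, the sketch below applies a standard Hoeffding-style confidence bound to bounded samples. It is a hypothetical, simplified stand-in, not the paper's actual estimator: the function `certificate_bound`, the `sample_fn` interface, and the assumption that each sample lies in a known range `(lo, hi)` are all illustrative choices.

```python
import math


def certificate_bound(sample_fn, n_samples, delta, value_range):
    """Hoeffding-style upper confidence bound on the mean of bounded samples.

    sample_fn    -- draws one bounded sample (illustrative stand-in for a
                    quantity evaluated at a random frequency)
    n_samples    -- number of samples drawn
    delta        -- failure probability; the bound holds w.p. >= 1 - delta
    value_range  -- (lo, hi) bounds on each sample (an assumption here)
    """
    lo, hi = value_range
    total = 0.0
    for _ in range(n_samples):
        s = sample_fn()
        assert lo <= s <= hi, "sample outside the assumed range"
        total += s
    mean = total / n_samples
    # Hoeffding's inequality: with probability >= 1 - delta,
    # the true mean is at most the empirical mean plus this slack term.
    slack = (hi - lo) * math.sqrt(math.log(1.0 / delta) / (2.0 * n_samples))
    return mean + slack
```

The slack term shrinks as 1/√n, which is why a certificate at a fixed confidence level benefits from a very large number of sampled frequencies, such as the 1.6 × 10⁷ reported in the table.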