PROTES: Probabilistic Optimization with Tensor Sampling

Authors: Anastasiia Batsheva, Andrei Chertkov, Gleb Ryzhakov, Ivan Oseledets

NeurIPS 2023 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | We tested it on complex multidimensional arrays and discretized multivariable functions taken, among others, from real-world applications, including unconstrained binary optimization and optimal control problems, for which the possible number of elements is up to 2^1000. In numerical experiments, both on analytic model functions and on complex problems, PROTES outperforms popular discrete optimization methods (Particle Swarm Optimization, Covariance Matrix Adaptation, Differential Evolution, and others).
Researcher Affiliation | Academia | Anastasiia Batsheva, Skolkovo Institute of Science and Technology, Moscow, Russia (a.batsheva@skoltech.ru); Andrei Chertkov, Skolkovo Institute of Science and Technology and AIRI, Moscow, Russia (a.chertkov@skoltech.ru); Gleb Ryzhakov, Skolkovo Institute of Science and Technology, Moscow, Russia (g.ryzhakov@skoltech.ru); Ivan Oseledets, Skolkovo Institute of Science and Technology and AIRI, Moscow, Russia (i.oseledets@skoltech.ru)
Pseudocode | Yes | Algorithm 1 "Method PROTES in the TT-format for multidimensional discrete black-box optimization" (a simplified, illustrative sketch of this loop is given after the table).
Open Source Code | Yes | The program code with numerical examples is available in the repository https://github.com/anabatsh/PROTES.
Open Datasets | Yes | We select 10 popular benchmarks: Ackley (P-01), Alpine (P-02), Exponential (P-03), Griewank (P-04), Michalewicz (P-05), Piston (P-06), Qing (P-07), Rastrigin (P-08), Schaffer (P-09) and Schwefel (P-10). These functions have a complex landscape and are often used to evaluate the effectiveness of optimization algorithms [8, 16], including tensor-based optimizers [4, 39]. We consider the 7-dimensional case (since this is the dimension of the Piston function) and discretization on a uniform grid with 16 nodes. We consider the following QUBO problems from the qubogen package: Max-Cut Problem (P-11; ...). (The grid discretization of such benchmarks is sketched in code after the table.)
Dataset Splits | No | The paper mentions fixing hyperparameters: 'For all the considered optimization problems, we used the default set of parameters for baselines, and for PROTES we fixed parameters as K = 100, k = 10, k_gd = 1, λ = 0.05, R = 5'. However, it does not describe a validation dataset split or a methodology for tuning hyperparameters on a separate validation set.
Hardware Specification | No | We also note that all computations were carried out on a regular laptop, while the operating time of the considered optimizers was commensurate; for example, for the benchmark P-17, the measured operating time (in seconds) is: PROTES (641), BS-1 (607), BS-2 (4245), BS-7 (780). This statement is too general and does not provide specific hardware details such as CPU/GPU models or memory.
Software Dependencies | No | The paper mentions the 'nevergrad framework [3]', the 'qubogen package', and the 'networkx package' but does not specify version numbers for these software components. (A snippet for recording the installed versions is given after the table.)
Experiment Setup | Yes | For all the considered optimization problems, we used the default set of parameters for baselines, and for PROTES we fixed parameters as K = 100, k = 10, k_gd = 1, λ = 0.05, R = 5 (the description of these parameters was presented in Algorithm 1). For all methods, the limit on the number of requests to the objective function was fixed at the value M = 10^4.
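
Below is a minimal, illustrative sketch of the kind of loop that Algorithm 1 describes, written in plain NumPy. It is not the authors' implementation: to stay short it replaces the rank-R TT parameterization of the sampling distribution with an independent (rank-1) categorical model per dimension, so the TT rank R = 5 plays no role here; the names K (samples per iteration), k (elite samples), k_gd (gradient steps) and λ (learning rate, lr below) follow the paper's notation, while everything else is a hypothetical helper. For scale, a d-dimensional binary problem has 2^d candidate points, so d = 1000 gives 2^1000 ≈ 10^301, which is why the tensor is never formed explicitly and sampling is used instead.

```python
import numpy as np

def softmax_rows(z):
    """Row-wise softmax with max-subtraction for numerical stability."""
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def protes_like_sketch(f, d, n, m_max, K=100, k=10, k_gd=1, lr=0.05, seed=0):
    """Simplified PROTES-style loop: sample K points, keep the k best,
    and raise their log-likelihood under a per-dimension categorical model.

    f     : black box mapping an int array of shape (batch, d) to (batch,) values
    d, n  : number of dimensions and number of grid nodes per dimension
    m_max : total budget of objective-function evaluations
    """
    rng = np.random.default_rng(seed)
    logits = np.zeros((d, n))  # rank-1 stand-in for the TT-parameterized density
    i_best, y_best = None, np.inf
    for _ in range(m_max // K):
        p = softmax_rows(logits)
        # Sample K candidate multi-indices from the current distribution.
        I = np.stack([rng.choice(n, size=K, p=p[j]) for j in range(d)], axis=1)
        y = np.asarray(f(I))
        # Track the best point observed so far.
        j_min = int(np.argmin(y))
        if y[j_min] < y_best:
            i_best, y_best = I[j_min].copy(), float(y[j_min])
        # Elite selection: the k samples with the smallest objective values.
        elite = I[np.argsort(y)[:k]]
        # Mean one-hot encoding of the elite indices per dimension.
        one_hot = np.zeros_like(logits)
        for idx in elite:
            one_hot[np.arange(d), idx] += 1.0 / k
        # k_gd gradient-ascent steps on the mean log-likelihood of the elites.
        for _ in range(k_gd):
            logits += lr * (one_hot - softmax_rows(logits))
    return i_best, y_best
```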
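
The benchmark setup quoted in the Open Datasets and Experiment Setup rows (7-dimensional analytic functions discretized on a uniform grid with 16 nodes per dimension, under a budget of M = 10^4 evaluations) can be mocked up along the following lines. The Ackley formula and its usual domain [-32.768, 32.768] are standard; the wrapper make_discrete_black_box is a hypothetical helper, not the authors' benchmarking code.

```python
import numpy as np

def ackley(X, a=20.0, b=0.2, c=2.0 * np.pi):
    """Ackley function evaluated row-wise on a (batch, d) array of real points."""
    d = X.shape[1]
    s1 = np.sqrt(np.sum(X**2, axis=1) / d)
    s2 = np.sum(np.cos(c * X), axis=1) / d
    return -a * np.exp(-b * s1) - np.exp(s2) + a + np.e

def make_discrete_black_box(func, d=7, n=16, lo=-32.768, hi=32.768):
    """Wrap a continuous benchmark as a black box over multi-indices of a uniform grid."""
    grid = np.linspace(lo, hi, n)      # n grid nodes per dimension
    def f(I):                          # I: int array of shape (batch, d)
        return func(grid[np.asarray(I)])
    return f

# Spend the paper's budget M = 10^4 on the discretized Ackley benchmark, here with
# random multi-indices (any optimizer, e.g. the sketch above, fits the same interface).
f = make_discrete_black_box(ackley, d=7, n=16)
rng = np.random.default_rng(0)
I = rng.integers(0, 16, size=(10_000, 7))
print("best of 10^4 random points:", float(np.min(f(I))))
```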
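
Finally, because the Software Dependencies row flags missing version numbers, a short snippet like the one below can record the versions of the named packages in whatever environment the experiments are rerun in. The distribution names nevergrad, qubogen and networkx are taken from the text above; nothing here comes from the paper itself.

```python
from importlib.metadata import version, PackageNotFoundError

# Print pinned-style version strings for the packages named in the paper.
for pkg in ("nevergrad", "qubogen", "networkx"):
    try:
        print(f"{pkg}=={version(pkg)}")
    except PackageNotFoundError:
        print(f"{pkg}: not installed in this environment")
```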