How to Use the Metropolis Algorithm for Multi-Objective Optimization?

Authors: Weijie Zheng, Mingfeng Li, Renzhong Deng, Benjamin Doerr

AAAI 2024

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | This paper takes several steps towards understanding, again via theoretical means, whether such advantages can also be obtained in multi-objective optimization. Overall, our work suggests that the general mechanism of the Metropolis algorithm can be interesting in multi-objective optimization, but that the implementation details can have a huge impact on the performance. We conduct a small experimental investigation and observe that this variant with standard bit-wise mutation achieves similar performance as the GSEMO, NSGA-II, and SMS-EMOA (for which we have shown a runtime guarantee of O(n^4)).
Researcher Affiliation | Academia | (1) School of Computer Science and Technology, International Research Institute for Artificial Intelligence, Harbin Institute of Technology, Shenzhen, China; (2) Laboratoire d'Informatique (LIX), École Polytechnique, CNRS, Institut Polytechnique de Paris, Palaiseau, France
Pseudocode | Yes | Algorithm 1: RLS to maximize f : {0,1}^n → R; Algorithm 2: the Metropolis algorithm with acceptance parameter α > 1 to maximize f : {0,1}^n → R; Algorithm 3: SEMO to maximize f : {0,1}^n → R^m; Algorithm 4: Metropolis algorithm with parent replacement and one-bit mutation to maximize f = (f1, f2); Algorithm 5: Metropolis algorithm with parent replacement and standard bit-wise mutation to maximize f = (f1, f2); Algorithm 6: NSGA-II to maximize f : {0,1}^n → R^m; Algorithm 7: SMS-EMOA to maximize f : {0,1}^n → R^m; Algorithm 8: Metropolis algorithm with keeping the parent to maximize f = (f1, f2). (An illustrative sketch in the spirit of Algorithm 4 appears below the table.)
Open Source Code | No | No explicit statement or link provides concrete access to the source code for the methodology described in the paper.
Open Datasets | No | The paper introduces a new theoretical benchmark called DLTB but does not present it as a publicly available dataset with concrete access information (link, DOI, or formal citation).
Dataset Splits | No | No dataset split information (percentages, sample counts, or citations to predefined splits) for training, validation, or testing is provided; the paper defines a theoretical benchmark problem and runs experiments on it without traditional dataset splits.
Hardware Specification | No | No specific hardware details (e.g., CPU or GPU models, memory, or cloud instance types) used for running the experiments are mentioned in the paper.
Software Dependencies | No | The paper mentions algorithms like GSEMO, NSGA-II, and SMS-EMOA but does not provide specific software dependencies with version numbers (e.g., Python, PyTorch, or other libraries).
Experiment Setup | Yes | We set α = 3 as in the paper (Wang, Zheng, and Doerr 2024) on the single-objective Metropolis algorithm, set N = 4n (at least 4(n-1) as suggested in Theorem 16), fair selection, and standard bit-wise mutation for NSGA-II, and set µ = n-1 (at least n-1 as suggested in Theorem 18). (See the parameter sketch below the table.)
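To make the pseudocode row concrete, here is a minimal Python sketch in the spirit of Algorithm 4 (Metropolis with parent replacement and one-bit mutation on a bi-objective f). The acceptance rule shown, which accepts a worse offspring with probability α raised to minus the total objective decrease, and the external archive are illustrative assumptions on my part; the paper's exact acceptance criterion may differ, and all names below are mine, not the authors'.

```python
import random

def one_bit_mutation(x):
    """Flip one uniformly chosen bit of the bit string x."""
    i = random.randrange(len(x))
    y = list(x)
    y[i] = 1 - y[i]
    return tuple(y)

def metropolis_biobjective(f, n, alpha=3.0, budget=10_000):
    """Metropolis-style loop with parent replacement: f maps a bit
    string to a pair (f1, f2), both to be maximized."""
    x = tuple(random.randint(0, 1) for _ in range(n))
    archive = {x: f(x)}  # external archive of every accepted solution
    for _ in range(budget):
        y = one_bit_mutation(x)
        fx, fy = f(x), f(y)
        # total worsening across both objectives (0 if y is no worse anywhere)
        decrease = sum(max(fx[i] - fy[i], 0) for i in range(2))
        # accept non-worsenings always, worsenings with prob. alpha ** -decrease
        if decrease == 0 or random.random() < alpha ** -decrease:
            x = y  # parent replacement: the child unconditionally becomes the parent
            archive[y] = fy
    # report the mutually non-dominated objective vectors seen so far
    vals = set(archive.values())
    return [v for v in vals
            if not any(w != v and w[0] >= v[0] and w[1] >= v[1] for w in vals)]

if __name__ == "__main__":
    # OneMinMax as a toy bi-objective test function (not the paper's DLTB benchmark)
    f = lambda x: (sum(x), len(x) - sum(x))
    print(sorted(metropolis_biobjective(f, n=10)))
```

For a sketch of Algorithm 5, one would replace one_bit_mutation by standard bit-wise mutation, i.e., flip each bit independently with probability 1/n.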
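The experiment-setup row likewise translates directly into parameter settings. The sketch below restates them; the problem size n and all variable names are hypothetical, since the authors' code is not released.

```python
# Hypothetical restatement of the quoted experiment parameters; the value
# of n and the variable names are illustrative, not taken from the paper.
n = 20          # problem size, chosen here only as an example
alpha = 3       # Metropolis acceptance parameter, as in Wang, Zheng, and Doerr (2024)
N = 4 * n       # NSGA-II population size; Theorem 16 suggests N >= 4 * (n - 1)
mu = n - 1      # SMS-EMOA population size; Theorem 18 suggests mu >= n - 1
p_mut = 1 / n   # standard bit-wise mutation: each bit flips independently with probability 1/n
# "Fair selection" for NSGA-II is, in this line of runtime analyses, usually
# understood as every parent producing exactly one offspring per generation.
```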