Gliding over the Pareto Front with Uniform Designs
Authors: Xiaoyuan Zhang, Genghui Li, Xi Lin, Yichi Zhang, Yifan Chen, Qingfu Zhang
NeurIPS 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Extensive experiments on synthetic and real-world benchmarks demonstrate that our proposed paradigm efficiently produces high-quality, representative solutions and outperforms baseline MOO methods. |
| Researcher Affiliation | Academia | Xiaoyuan Zhang (a), Genghui Li (b), Xi Lin (a), Yichi Zhang (c), Yifan Chen (d), Qingfu Zhang (a). (a) Department of Computer Science, City University of Hong Kong; (b) College of Computer Science and Software Engineering, Shenzhen University; (c) Department of Statistics, Indiana University Bloomington; (d) Departments of Mathematics and Computer Science, Hong Kong Baptist University |
| Pseudocode | Yes | Due to the space limit, the pseudo-codes of UMOD are provided in Algorithm 1 and Algorithm 2 in Appendix C.1. |
| Open Source Code | Yes | The source code is integrated into the LibMOON library, available at https://github.com/xzhang2523/libmoon. |
| Open Datasets | Yes | We conduct comparative evaluations against methods on complex multiobjective problems with numerous local optima and on fairness classification problems with thousands of decision variables. ... We compare our method against other gradient-based MOO methods on multiobjective fairness tasks involving the Adult [2] and Compass [1] datasets. |
| Dataset Splits | No | The paper describes the use of various datasets and benchmarks for evaluation but does not specify explicit train/validation/test data splits, proportions, or sample counts for these datasets. |
| Hardware Specification | Yes | The system used features an Intel Core i7-10700 CPU and an NVIDIA RTX 3080 GPU. |
| Software Dependencies | No | Our method is implemented in the MOEA/D framework using Pymoo [7]... The paper mentions Pymoo and other software components but does not provide specific version numbers for these dependencies. |
| Experiment Setup | Yes | Table 5: Hyper-parameters used in UMOD-MOEA. This table lists specific values for crossover, mutation, learning rate, training epoch, number of preferences, and other relevant settings. |
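The experiment setup above refers to a "number of preferences" hyper-parameter, i.e., the set of preference vectors supplied to the MOEA/D-style solver. As a point of reference for readers, the sketch below generates such a set with the classical Das-Dennis simplex-lattice design, a common baseline for preference-vector generation in MOEA/D frameworks such as Pymoo. This is an illustrative baseline only, not the paper's UMOD uniform-design construction; the function name `das_dennis` and its parameters are our own.

```python
from itertools import combinations

def das_dennis(n_partitions: int, n_obj: int) -> list[list[float]]:
    """Simplex-lattice (Das-Dennis) design: all weight vectors whose
    components are nonnegative multiples of 1/n_partitions summing to 1.

    Uses a stars-and-bars enumeration: choose n_obj - 1 "divider" slots
    among n_partitions + n_obj - 1 positions; the gap sizes between
    dividers are the (scaled) weight components.
    """
    vectors = []
    for dividers in combinations(range(n_partitions + n_obj - 1), n_obj - 1):
        weights = []
        prev = -1
        for d in dividers:
            weights.append((d - prev - 1) / n_partitions)  # gap before this divider
            prev = d
        # remaining gap after the last divider
        weights.append((n_partitions + n_obj - 1 - prev - 1) / n_partitions)
        vectors.append(weights)
    return vectors

# 3 objectives with 4 partitions yields C(6, 2) = 15 preference vectors.
prefs = das_dennis(4, 3)
```

Each returned vector lies on the probability simplex, so it can be passed directly to a scalarization (e.g., weighted Tchebycheff) inside a decomposition-based solver.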