Learning to solve Class-Constrained Bin Packing Problems via Encoder-Decoder Model
Authors: Hanni Cheng, Ya Cong, Weihao Jiang, Shiliang Pu
ICLR 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | The paper reports empirical results: 'Extensive experiments demonstrate that our proposed method consistently yields high-quality solutions for various kinds of CCBPP with a very small gap from the optimal. Moreover, our Encoder-Decoder Model also shows promising performance on one practical application of CCBPP, the Manufacturing Order Consolidation Problem (OCP).' |
| Researcher Affiliation | Industry | Hikvision Research Institute, Hangzhou 310051, China {chenghanni, congya, jiangweihao5, pushiliang.hri}@hikvision.com |
| Pseudocode | Yes | Appendix C contains Algorithm 1 Cluster Decode, and Appendix D.1 contains Algorithm 2 Dataset Construction. |
| Open Source Code | No | The paper mentions 'scikitopt' with a GitHub link in a footnote, but this refers to a baseline implementation, not the authors' own code. No explicit statement or link is provided for the open-sourcing of their Encoder-Decoder Model code. |
| Open Datasets | No | 'Since there are no known benchmarks for the CCBPP available in the literature, the datasets are generated following the convention of Borges et al. (2020) and da Silva & Schouery (2023)... Details of data generation are described in Appendix D.1.' For OCP, 'our datasets consists of the synthetic dataset with ground-truth and real supply chain dataset in September 2022.' The paper describes how the data was generated but provides no access to the generated datasets (no download link or dataset citation). |
| Dataset Splits | No | Section 5.1 states that '6400 training instances and 200 test instances are generated', but there is no explicit mention of a separate validation split or of how model selection was handled. |
| Hardware Specification | Yes | The experiments are conducted on a Linux server with a GeForce RTX 3090 GPU and an AMD EPYC 7542 32-Core Processor CPU @ 2.9 GHz. |
| Software Dependencies | No | The paper mentions 'scikitopt', 'OR-tools', and 'gurobi' as software used, but does not provide version numbers for these or for any other key software components needed to reproduce the results. |
| Experiment Setup | Yes | Appendix D.2, titled 'EXPERIMENT HYPER-PARAMETERS', provides specific details such as 'the number of GCN layers Lgcn is 3, and the number of MLP layers Lmlp is 3. The learning rate lr is 5e-5 with 1e-5 weight decay, the 0 in edge features are replaced by ϵ = 0.3, and the loss balance coefficient λ = 0.3. Our network is trained for 20 epochs.' It also gives parameters for Active Search, GA, ACO, and PointNet (see the configuration sketch after this table). |
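
For convenience, the sketch below collects the hyper-parameters quoted above into a single training configuration. It is a minimal illustration, not the authors' released code: the dictionary keys, the helper function, and the Adam optimizer are assumptions for illustration only; the numeric values come from Appendix D.2 and Section 5.1 as quoted in the table.

```python
# Hypothetical training configuration assembled from the hyper-parameters
# reported in Appendix D.2 and Section 5.1. Key names and the optimizer
# choice are assumptions; only the numeric values are taken from the paper.
CCBPP_TRAIN_CONFIG = {
    "num_gcn_layers": 3,          # L_gcn: GCN layers in the encoder
    "num_mlp_layers": 3,          # L_mlp: MLP layers
    "learning_rate": 5e-5,        # lr
    "weight_decay": 1e-5,
    "edge_feature_epsilon": 0.3,  # zeros in edge features replaced by eps = 0.3
    "loss_balance_lambda": 0.3,   # loss balance coefficient lambda
    "num_epochs": 20,
    "num_train_instances": 6400,  # Section 5.1
    "num_test_instances": 200,    # no validation split is reported
}


def build_optimizer(model_parameters):
    """Assumed optimizer setup; the paper, as quoted here, does not name the optimizer."""
    import torch

    return torch.optim.Adam(
        model_parameters,
        lr=CCBPP_TRAIN_CONFIG["learning_rate"],
        weight_decay=CCBPP_TRAIN_CONFIG["weight_decay"],
    )
```

Whether the authors actually used Adam, and whether the 6400 training instances were further split for validation, would need to be confirmed against the paper or the authors' code, neither of which specifies this in the material quoted above.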