Learning Efficient Online 3D Bin Packing on Packing Configuration Trees

Authors: Hang Zhao, Yang Yu, Kai Xu

ICLR 2022

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Through extensive evaluation, we demonstrate that our method outperforms all existing online BPP methods and is versatile in terms of incorporating various practical constraints.
Researcher Affiliation | Academia | School of Computer Science, National University of Defense Technology, China; National Key Lab for Novel Software Technology, Nanjing University, China
Pseudocode | No | The paper describes its algorithms and methods in text but does not provide any explicitly labeled pseudocode or algorithm blocks.
Open Source Code | Yes | We publish the source code of our method along with related baselines at GitHub: https://github.com/alexfrom0815/Online-3D-BPP-PCT
Open Datasets | Yes | To ensure that all algorithms are runnable within a reasonable period, we use the discrete dataset proposed by Zhao et al. (2021) without special declaration.
Dataset Splits | No | The paper mentions training and testing but does not provide validation split details (percentages or counts) or refer to a standard validation split.
Hardware Specification | Yes | All methods are implemented in Python and tested on 2000 instances with a desktop computer equipped with a Gold 5117 CPU and a GeForce TITAN V GPU.
Software Dependencies | No | The paper mentions software components such as Python, Graph Attention Networks (GATs), the ACKTR method, and Mask R-CNN, but does not provide specific version numbers for any of these dependencies.
Experiment Setup | Yes | Specifically, ACKTR runs multiple parallel processes (64 here)... To combine these data with irregular shapes into one batch, we fulfill Bt and Lt to fixed lengths, 80 and 25·|O| respectively, with dummy nodes. ... The node-wise MLPs φθB, φθL, and φθn used to embed raw space configuration nodes are two-layer linear networks with a LeakyReLU activation function. φFF is a two-layer linear structure activated by ReLU. The feature dimensions dh, dk, and dv are 64. The hyperparameter cclip, which controls the range of the clipped compatibility logits, is set to 10 in our GAT implementation.
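The Experiment Setup quote pins down enough hyperparameters to sketch the components it names. Below is a minimal PyTorch sketch, not the authors' released implementation: the raw node feature size (9), the zero-valued dummy padding, and the tanh-based logit clipping are assumptions inferred from the quoted text, while the dimensions (dh = dk = dv = 64), the fixed length 80, the two-layer LeakyReLU embedding MLPs, and cclip = 10 come directly from the paper.

```python
import torch
import torch.nn as nn

D_MODEL = 64    # d_h = d_k = d_v = 64, as quoted above
C_CLIP = 10.0   # range of the clipped compatibility logits (c_clip = 10)

class NodeMLP(nn.Module):
    """Two-layer linear network with LeakyReLU, one instance per node
    type (internal nodes B_t, leaf nodes L_t, current item n_t)."""
    def __init__(self, in_dim: int, out_dim: int = D_MODEL):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, out_dim),
            nn.LeakyReLU(),
            nn.Linear(out_dim, out_dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

def pad_with_dummies(nodes: torch.Tensor, max_len: int):
    """Pad a variable-length node set to a fixed length (e.g. 80 for B_t)
    with zero-valued dummy nodes; the mask marks which rows are real."""
    n, d = nodes.shape
    padded = torch.zeros(max_len, d)
    padded[:n] = nodes
    mask = torch.zeros(max_len, dtype=torch.bool)
    mask[:n] = True
    return padded, mask

def clipped_attention_weights(query: torch.Tensor,
                              keys: torch.Tensor,
                              mask: torch.Tensor) -> torch.Tensor:
    """Scaled dot-product compatibility logits squashed into
    [-C_CLIP, C_CLIP] with tanh (an assumed clipping scheme);
    dummy nodes are set to -inf before the softmax."""
    logits = keys @ query / D_MODEL ** 0.5        # shape: (max_len,)
    logits = C_CLIP * torch.tanh(logits)
    logits = logits.masked_fill(~mask, float('-inf'))
    return torch.softmax(logits, dim=-1)

# Usage: embed 37 raw nodes (9 features each, a hypothetical size),
# pad to the fixed length 80, then compute masked attention weights.
phi_B = NodeMLP(in_dim=9)
raw_nodes = torch.randn(37, 9)
padded, mask = pad_with_dummies(phi_B(raw_nodes), max_len=80)
weights = clipped_attention_weights(torch.randn(D_MODEL), padded, mask)
```

The mask is what makes the dummy-node padding safe: padded rows receive zero attention weight, so batching variable-sized node sets to fixed lengths does not change the result.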