PHFormer: Multi-Fragment Assembly Using Proxy-Level Hybrid Transformer
Authors: Wenting Cui, Runzhao Yao, Shaoyi Du
AAAI 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Extensive experimental results demonstrate that our method effectively reduces assembly errors while maintaining fast inference speed. |
| Researcher Affiliation | Academia | National Key Laboratory of Human-Machine Hybrid Augmented Intelligence, National Engineering Research Center for Visual Information and Applications, and Institute of Artificial Intelligence and Robotics, Xi'an Jiaotong University |
| Pseudocode | No | The paper describes its method using textual descriptions and equations but does not include any explicitly labeled pseudocode or algorithm blocks. |
| Open Source Code | Yes | The code is available at https://github.com/521piglet/PHFormer. |
| Open Datasets | Yes | We conduct experiments on a large-scale fractured object dataset, Breaking Bad (Sellán et al. 2022). |
| Dataset Splits | No | For each sub-dataset, 80% of the objects are used for training and the remaining 20% for testing; a distinct validation split is not explicitly mentioned. |
| Hardware Specification | Yes | We implement our approach using PyTorch (Paszke et al. 2019) on two NVIDIA RTX 3090 GPUs. |
| Software Dependencies | No | The paper mentions 'PyTorch (Paszke et al. 2019)' and 'Adam optimizer (Kingma and Ba 2014)' but does not specify version numbers for these software dependencies, which is required for reproducibility. |
| Experiment Setup | Yes | The Adam optimizer (Kingma and Ba 2014) is adopted for training, with an initial learning rate of 5e-4, a final learning rate of 5e-6, and a cosine scheduler (Loshchilov and Hutter 2016). The model is trained for 400 epochs, with 20 epochs of warm-up, at a batch size of 32. For the hyper-parameters in the loss function, λ_adj, λ_rot, λ_pose, λ_cham, and λ_L2 are set to 1, 0.2, 1, 10, and 1, respectively. |
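
For readers reproducing the experiment setup reported in the last row, the following is a minimal PyTorch sketch of the optimizer, warm-up plus cosine schedule, and loss weighting described above. The stand-in model, the warm-up start factor, and the loss-term names are illustrative assumptions rather than the authors' implementation; the actual code is at https://github.com/521piglet/PHFormer.

```python
import torch

# Minimal sketch of the reported training setup; the model and loss terms are
# placeholders (assumptions for illustration), not PHFormer itself.
model = torch.nn.Linear(3, 3)          # stand-in for the PHFormer network

EPOCHS, WARMUP_EPOCHS, BATCH_SIZE = 400, 20, 32
INIT_LR, FINAL_LR = 5e-4, 5e-6

optimizer = torch.optim.Adam(model.parameters(), lr=INIT_LR)

# Linear warm-up for the first 20 epochs, then cosine annealing from the
# initial to the final learning rate (Loshchilov and Hutter 2016).
warmup = torch.optim.lr_scheduler.LinearLR(
    optimizer, start_factor=0.01, total_iters=WARMUP_EPOCHS)
cosine = torch.optim.lr_scheduler.CosineAnnealingLR(
    optimizer, T_max=EPOCHS - WARMUP_EPOCHS, eta_min=FINAL_LR)
scheduler = torch.optim.lr_scheduler.SequentialLR(
    optimizer, schedulers=[warmup, cosine], milestones=[WARMUP_EPOCHS])

# Loss weights reported in the paper, keyed by hypothetical term names.
LAMBDA = {"adj": 1.0, "rot": 0.2, "pose": 1.0, "cham": 10.0, "l2": 1.0}

def total_loss(loss_terms):
    """Weighted sum of per-term losses, e.g. {'adj': ..., 'rot': ...}."""
    return sum(LAMBDA[name] * value for name, value in loss_terms.items())

for epoch in range(EPOCHS):
    # ... one pass over the training set with batch size 32 would go here ...
    scheduler.step()  # advance the warm-up/cosine schedule once per epoch
```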