CRA-PCN: Point Cloud Completion with Intra- and Inter-level Cross-Resolution Transformers
Authors: Yi Rong, Haoran Zhou, Lixin Yuan, Cheng Mei, Jiahao Wang, Tong Lu
AAAI 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Extensive experiments demonstrate that our method outperforms state-of-the-art methods by a large margin on several widely used benchmarks. Codes are available at https://github.com/EasyRy/CRA-PCN. |
| Researcher Affiliation | Academia | State Key Laboratory for Novel Software Technology, Nanjing University {rongyi, dg1833031, meicheng, wangjh}@smail.nju.edu.cn, hrzhou98@gmail.com, lutong@nju.edu.cn |
| Pseudocode | No | The paper describes methods and uses figures but does not include explicit pseudocode or algorithm blocks. |
| Open Source Code | Yes | Codes are available at https://github.com/EasyRy/CRA-PCN. |
| Open Datasets | Yes | We train our model for 300 epochs on PCN dataset (Yuan et al. 2018) and ShapeNet-55/34 (Yu et al. 2021) while 50 epochs on MVP dataset (Pan et al. 2021b). |
| Dataset Splits | No | The paper specifies training and testing sets, but does not explicitly report validation splits, split percentages, or per-split sample counts for any of the datasets. |
| Hardware Specification | Yes | All models are trained on two NVIDIA Tesla V100 graphic cards with a batch size of 72. ... All methods were evaluated on a single NVIDIA GeForce GTX 1080Ti graphic card with a batch size of 32. |
| Software Dependencies | No | We implement CRA-PCN with PyTorch (Paszke et al. 2019). The paper mentions PyTorch but does not provide a specific version number for it or for any other software dependency. |
| Experiment Setup | Yes | We set the feature dim D in the decoder to 128. The initial learning rate is set to 0.001 with continuous decay of 0.1 for every 100 epochs. We train our model for 300 epochs on PCN dataset (Yuan et al. 2018) and ShapeNet-55/34 (Yu et al. 2021) while 50 epochs on MVP dataset (Pan et al. 2021b). |
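The Experiment Setup row describes a standard step learning-rate schedule (initial rate 0.001, multiplied by 0.1 every 100 epochs). A minimal sketch of that schedule, for reproducers who want to verify their configuration matches the reported one (the function name `lr_at_epoch` is ours, not from the paper; in PyTorch this would typically be expressed with `torch.optim.lr_scheduler.StepLR`):

```python
def lr_at_epoch(epoch, base_lr=0.001, decay=0.1, step=100):
    """Learning rate under a step schedule.

    Mirrors the paper's reported setup: initial lr 0.001,
    decayed by a factor of 0.1 every 100 epochs, over 300
    epochs on PCN / ShapeNet-55/34 (50 epochs on MVP).
    """
    return base_lr * decay ** (epoch // step)


# Epochs 0-99 train at 1e-3, 100-199 at 1e-4, 200-299 at 1e-5.
print(lr_at_epoch(0), lr_at_epoch(150), lr_at_epoch(299))
```

Note that on the 50-epoch MVP schedule the decay step is never reached, so the rate stays at 0.001 throughout unless the paper's code adjusts the step size for that dataset.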