FINet: Dual Branches Feature Interaction for Partial-to-Partial Point Cloud Registration
Authors: Hao Xu, Nianjin Ye, Guanghui Liu, Bing Zeng, Shuaicheng Liu
AAAI 2022, pp. 2848–2856 | Conference PDF | Archive PDF | Plain Text | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Experiments demonstrate that our method performs higher precision and robustness compared to the state-of-the-art traditional and learning-based methods. |
| Researcher Affiliation | Collaboration | 1University of Electronic Science and Technology of China 2Megvii Technology |
| Pseudocode | No | The paper describes the methodology using textual descriptions and mathematical formulas, but does not include any explicit pseudocode or algorithm blocks. |
| Open Source Code | Yes | Code is available at https://github.com/megvii-research/FINet. |
| Open Datasets | Yes | ModelNet40 (Wu et al. 2015) includes CAD models from 40 object categories. We use the data from OMNet (Xu et al. 2021), where 8 axisymmetrical categories are removed to avoid the ill-posed problem. 7Scenes (Shotton et al. 2013) is a widely used benchmark where data is captured by a Kinect camera in 7 indoor scenes. |
| Dataset Splits | Yes | We use the official train/test splits, resulting in 4,196 training, 1,002 validation, and 1,146 test objects. |
| Hardware Specification | No | The paper does not provide specific hardware details (e.g., GPU/CPU models, memory) used for running the experiments. |
| Software Dependencies | No | The paper mentions the use of Adam optimizer but does not specify versions of any software dependencies like programming languages, libraries, or frameworks. |
| Experiment Setup | Yes | We run 4 iterations of alignment. Adam optimizer (Kingma and Ba 2015) is used with lr = 10^-4. The batch size is 64, and training for 260k steps. ... The dropout ratio is set to 0.3. ... the factor λ is empirically set to 4.0 in all our experiments. ... the factors β and γ are set to 10^-3. |
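The hyperparameters reported in the Experiment Setup row can be collected into a minimal configuration sketch. This is an illustration only: the key names and the `epochs_for` helper are assumptions for this report, not identifiers from the released FINet code; the values are the ones quoted from the paper.

```python
# Hypothetical configuration mirroring the reported FINet training setup.
# Key names are assumptions; numeric values are taken from the paper's text.
config = {
    "alignment_iterations": 4,   # iterations of alignment per forward pass
    "optimizer": "Adam",         # (Kingma and Ba 2015)
    "lr": 1e-4,
    "batch_size": 64,
    "train_steps": 260_000,
    "dropout": 0.3,
    "lambda": 4.0,               # loss balance factor λ
    "beta": 1e-3,                # factor β
    "gamma": 1e-3,               # factor γ
}

def epochs_for(steps: int, batch_size: int, n_train: int) -> float:
    """Approximate passes over the training set implied by a step budget."""
    return steps * batch_size / n_train

# Using the 4,196 ModelNet40 training objects from the Dataset Splits row:
approx_epochs = epochs_for(config["train_steps"], config["batch_size"], 4196)
print(f"~{approx_epochs:.0f} epochs")
```

Note that the paper reports a step budget rather than an epoch count, so the epoch figure above is only a back-of-the-envelope conversion.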