Notice: The reproducibility variables underlying each score are classified using an automated LLM-based pipeline, validated against a manually labeled dataset. LLM-based classification introduces uncertainty and potential bias; scores should be interpreted as estimates. Full accuracy metrics and methodology are described in [1].
Distribution Aware VoteNet for 3D Object Detection
Authors: Junxiong Liang, Pei An, Jie Ma (pp. 1583–1591)
AAAI 2021 | Venue PDF | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Extensive experiments on both ScanNet V2 and SUN RGB-D datasets demonstrate that the proposed DAVNet achieves significant improvement and outperforms state-of-the-art 3D detectors. |
| Researcher Affiliation | Academia | Huazhong University of Science and Technology EMAIL |
| Pseudocode | No | The paper does not contain any structured pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not provide any statements about releasing open-source code or links to a code repository. |
| Open Datasets | Yes | We evaluate our network on ScanNet V2 (Dai et al. 2017) and SUN RGB-D (Song, Lichtenberg, and Xiao 2015). |
| Dataset Splits | No | The paper mentions the 'SUN RGB-D validation set' and 'ScanNet V2 validation set' but does not provide specific details on the train/validation/test split percentages or sample counts. |
| Hardware Specification | Yes | We conduct all our training on one GTX1080Ti GPU. |
| Software Dependencies | No | The paper mentions using an 'Adam optimizer' but does not specify version numbers for any software, libraries, or frameworks used for implementation. |
| Experiment Setup | Yes | We use an Adam optimizer to train our model with batch size 8 for both datasets. For ScanNet V2, the network is trained for 180 epochs. The learning rate is initialized as 0.01 and decreased by 10× after 120 and 160 epochs. For SUN RGB-D, we train for 200 epochs with a learning rate initialized as 0.001. It is decreased by 10× after 120, 160, and 180 epochs. |
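The stepped learning-rate schedule quoted in the Experiment Setup row can be sketched in plain Python. This is a minimal illustration, assuming "decreased by 10" means a 0.1× decay at each milestone epoch (the function name `lr_at_epoch` and the list comprehensions are illustrative, not from the paper):

```python
def lr_at_epoch(epoch, base_lr, milestones, gamma=0.1):
    """Step schedule: multiply base_lr by gamma once for each
    milestone epoch that has already been reached."""
    lr = base_lr
    for m in milestones:
        if epoch >= m:
            lr *= gamma
    return lr

# ScanNet V2 per the table: lr 0.01 for 180 epochs, decayed at 120 and 160.
scannet_lrs = [lr_at_epoch(e, 0.01, [120, 160]) for e in range(180)]

# SUN RGB-D per the table: lr 0.001 for 200 epochs, decayed at 120, 160, 180.
sunrgbd_lrs = [lr_at_epoch(e, 0.001, [120, 160, 180]) for e in range(200)]
```

In a PyTorch implementation this would typically be expressed with `torch.optim.Adam` plus `torch.optim.lr_scheduler.MultiStepLR` using the same milestones and `gamma=0.1`.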