Depth Privileged Object Detection in Indoor Scenes via Deformation Hallucination
Authors: Zhijie Zhang, Yan Liu, Junjie Chen, Li Niu, Liqing Zhang
AAAI 2021, pp. 3456–3464 | Conference PDF | Archive PDF | Plain Text | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Experimental results on NYUDv2 and SUN RGB-D demonstrate the effectiveness of our method against the state-of-the-art methods for depth privileged object detection. |
| Researcher Affiliation | Academia | MoE Key Lab of Artificial Intelligence, Department of Computer Science and Engineering, Shanghai Jiao Tong University {zzj506506, loseover, chen.bys, ustcnewly}@sjtu.edu.cn, zhang-lq@cs.sjtu.edu.cn |
| Pseudocode | No | The paper describes its methods in text and diagrams, but does not include any structured pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not provide any explicit statement about releasing its source code or a link to a code repository. |
| Open Datasets | Yes | NYU Depth V2 (NYUDv2) (Silberman et al. 2012) consists of 1449 paired RGB-D images. SUN RGB-D (Song, Lichtenberg, and Xiao 2015) is composed of an official train/test split with 5285 and 5050 images, respectively. |
| Dataset Splits | Yes | The dataset [NYUDv2] is split into training (795 images) and test (654 images) sets. SUN RGB-D (Song, Lichtenberg, and Xiao 2015) is composed of an official train/test split with 5285 and 5050 images, respectively. |
| Hardware Specification | Yes | All experiments are conducted on Ubuntu 18.04 with two 8GB GeForce RTX 2080 SUPER GPUs, a 16GB Intel 9700K, and PyTorch 1.2.0 on Python 3.7. |
| Software Dependencies | Yes | All experiments are conducted on Ubuntu 18.04 with two 8GB GeForce RTX 2080 SUPER GPUs, a 16GB Intel 9700K, and PyTorch 1.2.0 on Python 3.7. |
| Experiment Setup | Yes | We train our model using the SGD optimizer for 50k iterations, both for D-branch pre-training and for training the whole model. The basic learning rate is initialized to 1×10⁻³ and reduced to 1×10⁻⁴ when the iterations reach 40k. The weight decay and momentum are set to 5×10⁻⁴ and 0.9, respectively. The random seed is set to 222. The two trade-off parameters α and β are set to 1.0 and 2.0, respectively. δ is a hyper-parameter controlling the intensity of avoiding negative transfer, set to 0.25 via cross-validation. µ is a trade-off parameter, set to 0.1 via cross-validation. (A PyTorch configuration sketch follows the table.) |
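
The experiment-setup row fully specifies the optimizer and schedule, so a minimal PyTorch sketch of that configuration is given below. Only the hyper-parameters (SGD, learning rate 1×10⁻³ dropped to 1×10⁻⁴ at 40k of 50k iterations, weight decay 5×10⁻⁴, momentum 0.9, random seed 222) come from the paper; the network, batch, and loss are placeholders standing in for the detection model, not the authors' implementation.

```python
# Sketch of the training configuration quoted above. The Conv2d "model",
# random batch, and mean() loss are placeholders; only the optimizer,
# schedule, and seed values are taken from the paper.
import random

import torch
import torch.nn as nn

SEED = 222
random.seed(SEED)
torch.manual_seed(SEED)

model = nn.Conv2d(3, 16, kernel_size=3, padding=1)  # placeholder network

optimizer = torch.optim.SGD(
    model.parameters(),
    lr=1e-3,            # basic learning rate
    momentum=0.9,
    weight_decay=5e-4,
)
# Reduce the learning rate from 1e-3 to 1e-4 once 40k iterations are reached.
scheduler = torch.optim.lr_scheduler.MultiStepLR(
    optimizer, milestones=[40_000], gamma=0.1
)

TOTAL_ITERS = 50_000
for it in range(TOTAL_ITERS):
    images = torch.randn(2, 3, 64, 64)   # stand-in for an RGB batch
    loss = model(images).mean()          # stand-in for the detection loss

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    scheduler.step()                     # stepped per iteration, not per epoch
    break  # remove this line to run the full 50k-iteration schedule
```

The scheduler is stepped per iteration because the paper states the schedule in iterations (40k of 50k), not epochs; the trade-off parameters α, β, δ, and µ belong to the loss terms of the method itself and are not represented in this placeholder loop.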