Learning Cross-Domain Neural Networks for Sketch-Based 3D Shape Retrieval
Authors: Fan Zhu, Jin Xie, Yi Fang
AAAI 2016
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We evaluate the effectiveness of both the CDNN and PCDNN approaches on the extended large-scale SHREC 2014 benchmark and compare them with other well-established methods. Experimental results suggest that both CDNN and PCDNN outperform the state of the art, and that PCDNN further improves on CDNN by employing a hierarchical structure. |
| Researcher Affiliation | Academia | Fan Zhu, Jin Xie and Yi Fang, NYU Multimedia and Visual Computing Lab, Electrical and Computer Engineering, New York University Abu Dhabi, Abu Dhabi, UAE, PO Box 129188 |
| Pseudocode | No | The paper describes the backpropagation algorithm and optimization details but does not present any structured pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not provide any explicit statement about releasing source code or a link to a code repository. |
| Open Datasets | Yes | The proposed methods are evaluated on the large-scale extended SHREC 2014 sketch-based 3D shape retrieval benchmark (Li et al. 2014a). |
| Dataset Splits | Yes | The sketches are further split into a training part and a testing part, which contain 8550 and 5130 sketches respectively. We strictly follow the experimental settings in (Li et al. 2014a) and report the performance of our proposed methods using the training dataset, the testing dataset and the complete benchmark as queries, respectively. The sketch data used for training does not overlap with the data used for testing, and the 3D shape data are the same in all runs. The results reported on the complete benchmark are obtained by 10-fold cross-validation, in which the sketch data are split into 10 partitions (i.e., 8 out of the 80 sketches per category are selected for each partition), and one partition is used for testing while the remaining nine are used for training in each of the ten rounds (a minimal split sketch follows the table). |
| Hardware Specification | No | The paper does not provide any specific hardware details used for running its experiments. |
| Software Dependencies | No | The paper mentions using a restricted Boltzmann machine (RBM) but does not provide specific version numbers for any software dependencies. |
| Experiment Setup | Yes | The numbers of iterations for RBM and backpropagation are set to 50 and 500, respectively, and the balancing parameter λ is set to 0.001. The feature dimensions of the hidden layer and the target layer are fixed at 1024. For the single-level CDNN approach, PCA is applied to reduce the concatenated 21504-dimensional features to 1024 dimensions (a configuration sketch follows the table). |
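
The 10-fold cross-validation split reported above can be illustrated with a short sketch. This is a minimal illustration under assumptions: sketches are represented as `(category, sketch_id)` pairs, partitioning is done per category (80 sketches per category, 8 per partition), and all function and variable names are hypothetical rather than taken from the paper.

```python
import random
from collections import defaultdict

def ten_fold_category_splits(sketches, folds=10, seed=0):
    """Illustrative 10-fold split following the paper's description:
    every category contributes 80 sketches, so each of the 10 partitions
    holds 8 sketches per category; one partition serves as the test set
    in each round. `sketches` is a list of (category, sketch_id) pairs."""
    rng = random.Random(seed)
    per_category = defaultdict(list)
    for category, sketch_id in sketches:
        per_category[category].append(sketch_id)

    # Build the 10 partitions, each holding 80 // 10 = 8 sketches per category.
    partitions = [[] for _ in range(folds)]
    for category, ids in per_category.items():
        rng.shuffle(ids)
        fold_size = len(ids) // folds
        for f in range(folds):
            partitions[f].extend(
                (category, i) for i in ids[f * fold_size:(f + 1) * fold_size]
            )

    # Yield (train, test) pairs: one partition as test, the other nine as training.
    for f in range(folds):
        test = partitions[f]
        train = [s for g in range(folds) if g != f for s in partitions[g]]
        yield train, test
```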
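
The training configuration quoted in the Experiment Setup row can likewise be collected in a small sketch. The dictionary keys, the helper function, and the use of scikit-learn's PCA are illustrative assumptions, not the authors' implementation; only the numeric values come from the paper.

```python
import numpy as np
from sklearn.decomposition import PCA

# Hypothetical container for the hyperparameter values reported in the paper.
HYPERPARAMS = {
    "rbm_iterations": 50,        # RBM pre-training iterations
    "backprop_iterations": 500,  # backpropagation fine-tuning iterations
    "lambda_balance": 1e-3,      # balancing parameter λ in the objective
    "hidden_dim": 1024,          # hidden-layer feature dimension
    "target_dim": 1024,          # target-layer feature dimension
}

def reduce_features(features_21504d: np.ndarray) -> np.ndarray:
    """Reduce the concatenated 21504-d shape features to 1024 dimensions
    with PCA, as described for the single-level CDNN input (sketch only;
    scikit-learn's PCA is an assumed stand-in for the authors' tooling)."""
    pca = PCA(n_components=HYPERPARAMS["hidden_dim"])
    return pca.fit_transform(features_21504d)
```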