Asymmetric Deep Supervised Hashing
Authors: Qing-Yuan Jiang, Wu-Jun Li
Venue: AAAI 2018
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Experiments on three large-scale datasets show that ADSH can achieve state-of-the-art performance in real applications. |
| Researcher Affiliation | Academia | Qing-Yuan Jiang, Wu-Jun Li; National Key Laboratory for Novel Software Technology; Collaborative Innovation Center of Novel Software Technology and Industrialization; Department of Computer Science and Technology, Nanjing University, China; jiangqy@lamda.nju.edu.cn, liwujun@nju.edu.cn |
| Pseudocode | Yes | Algorithm 1 The learning algorithm for ADSH (a hedged sketch of this loop appears after the table) |
| Open Source Code | No | The paper mentions that source code for some baselines is provided by their authors, but there is no statement or link indicating that the authors of ADSH are providing their code. |
| Open Datasets | Yes | We evaluate ADSH on three datasets: MS-COCO (Lin et al. 2014b), CIFAR-10 (Krizhevsky 2009) and NUS-WIDE (Chua et al. 2009). |
| Dataset Splits | Yes | tune the learning rate among [10^-6, 10^-2] by using a validation set. For the ADSH method, we set γ = 200, T_out = 50, T_in = 3, \|Ω\| = 2000 by using a validation strategy for all datasets. |
| Hardware Specification | Yes | We carry out experiments to evaluate our ADSH and baselines, which are implemented with the deep learning toolbox MatConvNet (Vedaldi and Lenc 2015), on an NVIDIA M40 GPU server. |
| Software Dependencies | No | The paper mentions the 'deep learning toolbox MatConvNet (Vedaldi and Lenc 2015)' but does not provide specific version numbers for software dependencies. |
| Experiment Setup | Yes | In order to avoid overfitting, we set weight decay as 5e-4. Following the suggestions of the authors, we set the mini-batch size to be 128 and tune the learning rate among [10^-6, 10^-2] by using a validation set. For the ADSH method, we set γ = 200, T_out = 50, T_in = 3, \|Ω\| = 2000 by using a validation strategy for all datasets. |
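
To ground the quoted hyperparameters, below is a minimal NumPy sketch of the alternating loop that Algorithm 1 describes: each outer iteration samples \|Ω\| = 2000 query points from the database, trains the network for T_in rounds against the fixed database codes B, then updates B in closed form. Everything framework-specific is elided: `adsh_sketch` is a hypothetical name, the network outputs `U` are faked with random tanh-bounded values in place of MatConvNet forward/backward passes, and the B-update is a simplified one-shot sign step standing in for the paper's bit-by-bit closed-form update.

```python
import numpy as np

# Hyperparameters quoted in the table: gamma = 200, T_out = 50, T_in = 3, |Omega| = 2000.
GAMMA, T_OUT, T_IN, OMEGA = 200, 50, 3, 2000


def adsh_sketch(S, code_len, seed=0):
    """Simplified ADSH alternating optimization (hypothetical helper).

    S : (n_database, n_database) similarity matrix in {-1, +1}; query
        points are sampled from the database, as in the paper.
    Returns database codes B in {-1, +1}^{n_database x code_len}.
    """
    n = S.shape[0]
    rng = np.random.default_rng(seed)
    B = np.sign(rng.standard_normal((n, code_len)))  # init database codes

    for _ in range(T_OUT):
        # Sample Omega query points (indices into the database).
        idx = rng.choice(n, size=min(OMEGA, n), replace=False)
        S_omega = S[idx]  # similarity between sampled queries and database

        # Placeholder for T_IN epochs of network training. In ADSH this is
        # backpropagation on the loss (u_i . b_j - c * S_ij)^2; here we fake
        # tanh-bounded outputs to keep the sketch self-contained.
        U = np.tanh(rng.standard_normal((len(idx), code_len)))
        for _ in range(T_IN):
            pass  # CNN parameter updates would go here

        # Simplified database-code update: a one-shot sign step standing in
        # for the paper's closed-form bit-by-bit update of B.
        U_bar = np.zeros_like(B)
        U_bar[idx] = U  # expand U to database size, zero elsewhere
        B = np.sign(code_len * S_omega.T @ U + GAMMA * U_bar)
        B[B == 0] = 1  # keep codes strictly binary
    return B
```

A faithful implementation would replace the placeholder `U` with CNN outputs trained under the loss (u_i . b_j - c * S_ij)^2 and perform the column-wise B update from Algorithm 1; only γ, T_out, T_in, and \|Ω\| above are taken from the paper.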