DARI: Distance Metric and Representation Integration for Person Verification

Authors: Guangrun Wang, Liang Lin, Shengyong Ding, Ya Li, Qing Wang

AAAI 2016

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | On several public datasets, DARI shows very promising performance on re-identifying individuals across cameras against various challenges, and outperforms other state-of-the-art approaches.
Researcher Affiliation | Academia | Guangrun Wang, Liang Lin, Shengyong Ding, Ya Li, and Qing Wang, School of Data and Computer Science, Sun Yat-sen University, Guangzhou 510006, China. Emails: wanggrun@mail2.sysu.edu.cn, linliang@ieee.org, marcding@163.com, liya@gzhu.edu.cn, ericwangqing@gmail.com
Pseudocode | Yes | Algorithm 1: Learning DARI with batch-process; Algorithm 2: Calculating gradients for optimization.
Open Source Code | No | The paper states that the implementation is based on the Caffe framework, but it does not provide a link or an explicit statement about the availability of the source code.
Open Datasets | Yes | We conduct our experiments using three challenging human verification datasets, i.e., CUHK03 (Li et al. 2014), CUHK01 (Li, Zhao, and Wang 2012), and iLIDS (Zheng, Gong, and Xiang 2013).
Dataset Splits | No | The paper describes training and testing splits for its datasets, but it does not mention a separate validation split used for hyperparameter tuning or early stopping.
Hardware Specification | Yes | We execute the code on a PC with a GTX 780 GPU and a quad-core CPU.
Software Dependencies | No | The paper mentions implementing the algorithm on the Caffe framework but does not give version numbers for Caffe or any other software dependency.
Experiment Setup | Yes | The weights of the filters and the fully connected parameters are initialized from two zero-mean Gaussian distributions with standard deviations 0.01 and 0.001, respectively. The bias terms are set to the constant 0. During training, we select 60 persons to construct 4,800 triplets in each iteration. (See the illustrative sketch below.)
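The paper reports these initialization and triplet-batching hyperparameters but not the corresponding code, so the following is a minimal sketch that only illustrates the stated numbers: filter weights drawn from a zero-mean Gaussian with standard deviation 0.01, fully connected weights with 0.001, zero biases, and 60 sampled identities yielding 4,800 triplets per iteration. The layer shapes, the label layout, and the `sample_triplets` helper are illustrative assumptions, not the authors' Caffe implementation.

```python
# Illustrative sketch of the reported setup (not the authors' Caffe code).
# Layer shapes and the triplet-sampling rule are assumptions for demonstration.
import numpy as np

rng = np.random.default_rng(0)

# Convolutional filter weights: zero-mean Gaussian, std 0.01; biases set to 0.
conv_weights = rng.normal(loc=0.0, scale=0.01, size=(32, 3, 5, 5))  # assumed shape
conv_bias = np.zeros(32)

# Fully connected weights: zero-mean Gaussian, std 0.001; biases set to 0.
fc_weights = rng.normal(loc=0.0, scale=0.001, size=(512, 400))      # assumed shape
fc_bias = np.zeros(512)

def sample_triplets(labels, n_persons=60, n_triplets=4800, rng=rng):
    """Pick n_persons identities and build n_triplets (anchor, positive, negative)
    index triples from them, mirroring the reported '60 persons -> 4,800 triplets'
    batch rule; the exact sampling scheme is an assumption."""
    persons = rng.choice(np.unique(labels), size=n_persons, replace=False)
    by_person = {p: np.flatnonzero(labels == p) for p in persons}
    triplets = []
    while len(triplets) < n_triplets:
        pos_id, neg_id = rng.choice(persons, size=2, replace=False)
        anchor, positive = rng.choice(by_person[pos_id], size=2, replace=False)
        negative = rng.choice(by_person[neg_id])
        triplets.append((anchor, positive, negative))
    return np.array(triplets)

# Example with synthetic labels: 300 identities, 10 images each.
labels = np.repeat(np.arange(300), 10)
batch = sample_triplets(labels)
print(batch.shape)  # (4800, 3)
```

Any rule that draws the anchor and positive from the same identity and the negative from a different one is consistent with the reported batch construction; the paper does not spell out the exact scheme, so the one above is only a placeholder.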