Zero-shot Metric Learning

Authors: Xinyi Xu, Huanhuan Cao, Yanhua Yang, Erkun Yang, Cheng Deng

IJCAI 2019 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Extensive experiments including intra-dataset transfer and inter-dataset transfer on four benchmark datasets demonstrate that ZSML can achieve state-of-the-art performance.
Researcher Affiliation | Academia | School of Electronic Engineering, Xidian University, Xi'an 710071, China. xyxu.xd@gmail.com, hhcao@stu.xidian.edu.cn, yanhyang@xidian.edu.cn, {erkunyang, chdeng.xd}@gmail.com
Pseudocode | Yes | Algorithm 1: Learning of the proposed ZSML model.
Open Source Code | No | The paper does not provide an explicit statement or link to open-source code for the described methodology.
Open Datasets | Yes | We conduct our experiments on Caltech-101 [Wah et al., 2011b], COIL-20 [Nene et al., 1996], ImageNet-201 [Deng et al., 2009], and MSRC-v1, of which the details are summarized in Table 2.
Dataset Splits | Yes | In intra-dataset ZSML task, we train our model on some categories and test the learned metric on the other categories which belong to the same dataset... For both of them, we randomly pick 7 classes (more than 1,000 data points) for testing and the remaining for training. (See the split sketch below the table.)
Hardware Specification | Yes | It takes about 6 minutes for training on an NVIDIA TITAN X GPU, since our model only requires 26.4 million FLOPS, as summarized in Table 1.
Software Dependencies | No | The paper mentions "The Caffe package [Jia et al., 2014] is used throughout the experiments." but does not specify a version number for Caffe or any other software dependencies.
Experiment Setup | Yes | The base learning rate is set to 0.002 and iteration times are set to 20,000... There are four hyper-parameters λ, k, α, and β and we set them to 0.1, 96, 0.3, and 0.7 respectively. (See the configuration sketch below the table.)
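
The Dataset Splits row above describes an intra-dataset zero-shot protocol: 7 classes are randomly held out for testing and the remaining classes are used for training. The snippet below is a minimal sketch of such a split; the function name make_zero_shot_split, the seed, and the synthetic labels are illustrative assumptions and are not taken from the paper or its code.

```python
import numpy as np

def make_zero_shot_split(labels, num_test_classes=7, seed=0):
    """Randomly hold out `num_test_classes` categories for zero-shot testing.

    The remaining categories are used for training the metric. This mirrors
    the intra-dataset protocol quoted above; the implementation details are
    assumptions, not the authors' code.
    """
    rng = np.random.default_rng(seed)
    classes = np.unique(labels)
    test_classes = rng.choice(classes, size=num_test_classes, replace=False)
    is_test = np.isin(labels, test_classes)
    return np.where(~is_test)[0], np.where(is_test)[0]

# Example with synthetic labels: 20 classes, 150 samples each.
labels = np.repeat(np.arange(20), 150)
train_idx, test_idx = make_zero_shot_split(labels)
```

Because the held-out classes never appear during training, evaluating on test_idx measures how well the learned metric transfers to unseen categories, which is the point of the zero-shot protocol.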
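
The Experiment Setup row reports the base learning rate, iteration count, and the four hyper-parameters λ, k, α, and β. The dictionary below simply collects those reported values in one place; the dictionary itself and its key names are assumptions for illustration, not part of the paper's Caffe configuration.

```python
# Training settings reported in the paper's experiment setup.
# The dictionary structure and key names are illustrative assumptions.
zsml_config = {
    "base_lr": 0.002,    # base learning rate
    "max_iter": 20_000,  # training iterations
    "lambda_": 0.1,      # hyper-parameter λ
    "k": 96,             # hyper-parameter k
    "alpha": 0.3,        # hyper-parameter α
    "beta": 0.7,         # hyper-parameter β
}
```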