Asymmetric Mutual Alignment for Unsupervised Zero-Shot Sketch-Based Image Retrieval

Authors: Zhihui Yin, Jiexi Yan, Chenghao Xu, Cheng Deng

AAAI 2024

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "Extensive experimental results on several benchmark datasets demonstrate the superiority of our method."
Researcher Affiliation | Academia | School of Electronic Engineering, Xidian University, China; School of Computer Science and Technology, Xidian University, China.
Pseudocode | No | The paper describes its method verbally and mathematically, but does not include structured pseudocode or algorithm blocks.
Open Source Code | No | The paper does not provide any statement about making its source code publicly available, nor a link to a code repository.
Open Datasets | Yes | "We validate our method on three commonly used benchmark datasets in ZS-SBIR: Sketchy Ext (Liu et al. 2017), TU-Berlin Ext (Zhang et al. 2016), and QuickDraw (Dey et al. 2019)."
Dataset Splits | No | The paper defines training and testing sets for each dataset (e.g., "In Sketchy Ext, we randomly selected 25 classes for testing and used the remaining 100 classes for training"), but does not explicitly mention a separate validation split.
Hardware Specification | No | The paper does not specify the hardware (e.g., GPU/CPU models, memory) used for running the experiments.
Software Dependencies | No | The paper states "Our method is implemented using the popular PyTorch toolbox," but does not provide version numbers for PyTorch or other software dependencies.
Experiment Setup | Yes | "The initial learning rate is set to 0.0002. We train the model for 50 epochs using the SGD optimizer with a momentum of 0.9 and a batch size of 32. The learning rate is gradually decayed to 0 using a cosine annealing schedule. The temperature τ is always 0.1. The filter parameter µ is 0.2. The number of multi-crops V and global views V1 were set to 10 and 2."
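The Dataset Splits row quotes a random class-disjoint split for Sketchy Ext: 25 of its categories held out for zero-shot testing, the remaining 100 used for training. A minimal sketch of such a split is below; the category names and the seed are placeholders, not the authors' actual split.

```python
import random

# Hypothetical sketch of the zero-shot class split described for
# Sketchy Ext: 25 categories held out for testing, 100 for training.
# Class labels here are stand-ins, not the real Sketchy class list.
def zero_shot_split(classes, n_test, seed=0):
    """Randomly hold out n_test classes; the rest are seen during training."""
    rng = random.Random(seed)
    test = set(rng.sample(classes, n_test))
    train = [c for c in classes if c not in test]
    return train, sorted(test)

all_classes = [f"class_{i:03d}" for i in range(125)]  # placeholder labels
train_classes, test_classes = zero_shot_split(all_classes, n_test=25)
print(len(train_classes), len(test_classes))  # 100 25
```

Because the split is class-disjoint, no test category is ever seen at training time, which is what makes the retrieval setting zero-shot.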
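The Experiment Setup row fully specifies the reported hyperparameters. A hedged sketch of that configuration follows; the schedule function implements standard cosine annealing (as in PyTorch's CosineAnnealingLR), since the authors' own code is not released.

```python
import math

# Hyperparameters as reported in the paper; the schedule below is a
# standard cosine annealing formula, not code from the authors.
CONFIG = {
    "initial_lr": 2e-4,   # initial learning rate 0.0002
    "epochs": 50,
    "optimizer": "SGD",
    "momentum": 0.9,
    "batch_size": 32,
    "temperature_tau": 0.1,
    "filter_mu": 0.2,
    "multi_crops_V": 10,
    "global_views_V1": 2,
}

def cosine_annealed_lr(epoch: int, initial_lr: float, total_epochs: int) -> float:
    """Learning rate at a given epoch, decayed from initial_lr to 0."""
    return 0.5 * initial_lr * (1.0 + math.cos(math.pi * epoch / total_epochs))

# The rate starts at 0.0002 and decays to 0 by the final epoch.
print(cosine_annealed_lr(0, CONFIG["initial_lr"], CONFIG["epochs"]))
print(cosine_annealed_lr(50, CONFIG["initial_lr"], CONFIG["epochs"]))
```

This makes the "gradually decayed to 0" claim concrete: the half-cosine reaches exactly zero at epoch 50, unlike step or exponential decay, which only approach it.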