Learning Intra-Batch Connections for Deep Metric Learning
Authors: Jenny Denise Seidenschwarz, Ismail Elezi, Laura Leal-Taixé
ICML 2021
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We achieve state-of-the-art results on clustering and image retrieval on the CUB-200-2011, Cars196, Stanford Online Products, and In-Shop Clothes datasets. To underline the effectiveness of our approach, we further present an extensive ablation study. |
| Researcher Affiliation | Academia | 1Department of Computer Science, Technical University of Munich, Munich, Germany. Correspondence to: Jenny Seidenschwarz <j.seidenschwarz@tum.de>. |
| Pseudocode | No | The paper describes its methodology using text and mathematical equations, and provides an overview figure, but does not include explicit pseudocode or a clearly labeled algorithm block. |
| Open Source Code | Yes | To facilitate further research, we make available the code and the models at https: //github.com/dvl-tum/intra_batch. |
| Open Datasets | Yes | We conduct experiments on 4 publicly available datasets using the conventional splitting protocols (Song et al., 2016): CUB-200-2011 (Wah et al., 2011)... Cars196 (Krause et al., 2013)... Stanford Online Products (SOP) (Song et al., 2016)... In-Shop Clothes (Liu et al., 2016). |
| Dataset Splits | Yes | CUB-200-2011 (Wah et al., 2011) ... For training, we use the first 100 classes and for testing the remaining classes. Cars196 (Krause et al., 2013) ... We use the first 98 classes for training and the remaining classes for testing. Stanford Online Products (SOP) (Song et al., 2016) ... We use 11,318 classes for training and the remaining 11,316 classes for testing. In-Shop Clothes (Liu et al., 2016) ... We use 3,997 classes for training, while the test set, containing 3,985 classes, is split into a query set and a gallery set. |
| Hardware Specification | Yes | All the training is done on a single Titan X GPU. |
| Software Dependencies | No | The paper states: 'We implement our method in PyTorch (Paszke et al., 2017) library.' While PyTorch is named, a specific version number is not provided (e.g., PyTorch 1.x). |
| Experiment Setup | Yes | We use embedding dimension of sizes 512 for all our experiments and low temperature scaling for the softmax cross-entropy loss function... We resize the cropped image to 227 × 227, followed by applying a random horizontal flip. During test time, we resize the images to 256 × 256 and take a center crop of size 227 × 227. We train all networks for 70 epochs using RAdam optimizer (Liu et al., 2020)... We use small mini-batches of size 50-100. |
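The experiment setup quotes "low temperature scaling for the softmax cross-entropy loss function". As a minimal sketch of what that means, the snippet below implements a temperature-scaled softmax cross-entropy in NumPy; the temperature value used here is purely illustrative, not a value reported by the paper.

```python
import numpy as np

def temperature_scaled_cross_entropy(logits, label, temperature=0.05):
    """Softmax cross-entropy with temperature scaling.

    Dividing logits by a low temperature sharpens the softmax
    distribution before taking the cross-entropy. The default
    temperature here is a hypothetical example, not the paper's value.
    """
    scaled = logits / temperature
    scaled = scaled - scaled.max()  # subtract max for numerical stability
    log_probs = scaled - np.log(np.exp(scaled).sum())
    return -log_probs[label]

# A lower temperature makes the loss more confident about the
# already-dominant logit, driving the loss for the top class down.
logits = np.array([2.0, 1.0])
loss_t1 = temperature_scaled_cross_entropy(logits, label=0, temperature=1.0)
loss_t01 = temperature_scaled_cross_entropy(logits, label=0, temperature=0.1)
```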