An Empirical Study of CLIP for Text-Based Person Search
Authors: Min Cao, Yang Bai, Ziyin Zeng, Mang Ye, Min Zhang
AAAI 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | This paper makes the first attempt to conduct a comprehensive empirical study of CLIP for TBPS and thus contributes a straightforward, incremental, yet strong TBPS-CLIP baseline to the TBPS community. |
| Researcher Affiliation | Academia | 1 School of Computer Science and Technology, Soochow University 2 School of Computer Science, Wuhan University 3 Harbin Institute of Technology, Shenzhen |
| Pseudocode | No | The paper describes methods and equations but does not include explicit pseudocode or algorithm blocks. |
| Open Source Code | Yes | The code is available at https://github.com/Flame-Chasers/TBPS-CLIP. |
| Open Datasets | Yes | Comparisons with other methods are carried out on three datasets: CUHK-PEDES (Li et al. 2017b), ICFG-PEDES (Ding et al. 2021), RSTPReid (Zhu et al. 2021). |
| Dataset Splits | No | The paper mentions "few-shot capabilities (5% training data)" but does not explicitly state the full training/validation/test dataset splits for its main experiments. Details are deferred to the Appendix, which is not part of the main text analysis. |
| Hardware Specification | No | No specific hardware details (GPU models, CPU types, or cloud platforms) are mentioned for the experiments. |
| Software Dependencies | No | The paper mentions PyTorch but does not specify its version or any other software dependencies with version numbers. |
| Experiment Setup | Yes | The paper discusses "Training Tricks" such as global gradients back-propagation, dropout, locking bottom layers, and soft label. It also mentions hyperparameters like "τs is a hyper-parameter and set to 0.1" and "training in just 5 epochs". |
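The "soft label" trick and the temperature noted in the Experiment Setup row can be illustrated with a smoothed symmetric image-text contrastive loss. This is a minimal pure-Python sketch under stated assumptions, not the paper's exact formulation: the temperature `tau_s = 0.1` comes from the paper, while the smoothing coefficient `alpha`, the toy embeddings, and the choice of the in-batch similarity distribution as the soft target are illustrative assumptions.

```python
import math

def l2_normalize(v):
    """Scale a vector to unit length so dot products become cosine similarities."""
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def log_softmax(row):
    """Numerically stable log-softmax over one row of logits."""
    m = max(row)
    lse = m + math.log(sum(math.exp(x - m) for x in row))
    return [x - lse for x in row]

def soft_label_clip_loss(img, txt, tau_s=0.1, alpha=0.9):
    """Symmetric contrastive loss with soft targets (illustrative sketch).

    Hard one-hot matches on the diagonal are blended with the in-batch
    similarity distribution; `alpha` controls the blend (assumption, not
    taken from the paper).
    """
    img = [l2_normalize(v) for v in img]
    txt = [l2_normalize(v) for v in txt]
    n = len(img)
    # (n, n) similarity logits scaled by the temperature tau_s.
    logits = [[sum(a * b for a, b in zip(img[i], txt[j])) / tau_s
               for j in range(n)] for i in range(n)]
    logits_t = [list(r) for r in zip(*logits)]  # text-to-image direction

    def directional(lg):
        loss = 0.0
        for i, row in enumerate(lg):
            lp = log_softmax(row)
            sm = [math.exp(x) for x in lp]
            # Soft target: smoothed one-hot over the matched pair.
            targets = [alpha * (1.0 if j == i else 0.0) + (1 - alpha) * sm[j]
                       for j in range(n)]
            loss += -sum(t, l) if False else -sum(t * l for t, l in zip(targets, lp))
        return loss / n

    return 0.5 * (directional(logits) + directional(logits_t))

# Toy batch: each text embedding roughly aligned with its paired image.
img = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
txt = [[0.9, 0.1], [0.1, 0.9], [0.8, 1.0]]
```

With aligned pairs the diagonal dominates and the loss is small; shuffling the text batch breaks the pairing and the loss rises, which is the behavior a sanity check on such a loss should show.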