Zero-Shot Learning from Adversarial Feature Residual to Compact Visual Feature
Authors: Bo Liu, Qiulei Dong, Zhanyi Hu (pp. 11547-11554)
AAAI 2020
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Extensive experimental results on six benchmark datasets demonstrate that our method could achieve a significantly better performance than existing state-of-the-art methods by 1.2-13.2% in most cases. |
| Researcher Affiliation | Academia | 1National Laboratory of Pattern Recognition, Institute of Automation, Chinese Academy of Sciences 2University of Chinese Academy of Sciences 3Center for Excellence in Brain Science and Intelligence Technology, Chinese Academy of Sciences |
| Pseudocode | No | The paper describes the methodology using text and mathematical equations, but it does not contain a structured pseudocode or algorithm block. |
| Open Source Code | No | The paper does not provide an explicit statement about releasing source code or a link to a code repository for the described methodology. |
| Open Datasets | Yes | The proposed method is evaluated on the following six public datasets: Caltech-UCSD Birds-2011 (CUB) (Wah et al. 2011), North America Birds (NAB) (Van Horn et al. 2015), aPascal-aYahoo (APY) (Farhadi et al. 2009), Animals with Attributes (AWA1) (Lampert, Nickisch, and Harmeling 2013), renewed Animals with Attributes (AWA2) (Xian et al. 2018a) and SUN Attributes (SUN) (Patterson and Hays 2012). |
| Dataset Splits | Yes | data split has a huge impact on performance. As suggested by (Elhoseiny et al. 2017), on CUB and NAB, we evaluate the proposed method via SCS-split and SCE-split. ... Following (Xian et al. 2018a), we evaluate the proposed method on the APY, AWA1, AWA2 and SUN datasets with PS-split. The detailed split information is reported in Table 1. |
| Hardware Specification | No | The paper does not provide any specific hardware details such as GPU or CPU models, memory, or cloud instance specifications used for running the experiments. |
| Software Dependencies | No | The paper mentions using SVR with RBF kernel, multi-layer perceptrons (MLP) with ReLU activation, and Wasserstein GAN (WGAN), but it does not provide specific version numbers for these software components or any other libraries. |
| Experiment Setup | Yes | In the proposed method, prototype prediction is implemented by SVR with RBF kernel. The generator and discriminator are both three-layer MLP with ReLU activation, which both employ 4096 units in hidden layer. Hyper-parameters in WGAN are set as they are suggested by the author. |
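
Since the paper releases no code (see the Open Source Code row), the following is a minimal sketch of how the setup described in the Experiment Setup row could look, assuming PyTorch for the WGAN generator/critic and scikit-learn for the SVR prototype predictor. Everything beyond what the row states is an assumption: the input/output dimensions, the SVR hyper-parameters (`C`, `gamma`), the one-regressor-per-feature-dimension arrangement, the conditioning scheme, and the reading of "three-layer MLP" as input, one 4096-unit hidden layer, and output.

```python
# Minimal sketch (not the authors' released code) of the setup in the
# "Experiment Setup" row. Assumptions: PyTorch for the WGAN networks,
# scikit-learn for SVR prototype prediction, "three-layer" MLP read as
# input -> 4096-unit hidden layer -> output. Dimensions and SVR
# hyper-parameters are illustrative placeholders.
import torch
import torch.nn as nn
from sklearn.svm import SVR


class Generator(nn.Module):
    """MLP with ReLU activation and a 4096-unit hidden layer, conditioned on
    class semantics (the exact conditioning used in the paper is assumed)."""

    def __init__(self, noise_dim: int, sem_dim: int, feat_dim: int, hidden: int = 4096):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(noise_dim + sem_dim, hidden),
            nn.ReLU(inplace=True),
            nn.Linear(hidden, feat_dim),
        )

    def forward(self, noise: torch.Tensor, semantics: torch.Tensor) -> torch.Tensor:
        # Concatenate noise with the class semantic vector, then map to features.
        return self.net(torch.cat([noise, semantics], dim=1))


class Discriminator(nn.Module):
    """WGAN critic: same MLP shape, single unbounded scalar score per sample."""

    def __init__(self, feat_dim: int, sem_dim: int, hidden: int = 4096):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(feat_dim + sem_dim, hidden),
            nn.ReLU(inplace=True),
            nn.Linear(hidden, 1),
        )

    def forward(self, features: torch.Tensor, semantics: torch.Tensor) -> torch.Tensor:
        return self.net(torch.cat([features, semantics], dim=1))


def fit_prototype_regressors(class_semantics, class_prototypes):
    """Prototype prediction via SVR with an RBF kernel, fitting one regressor
    per visual-feature dimension (a common convention; C/gamma are assumed)."""
    regressors = []
    for d in range(class_prototypes.shape[1]):
        svr = SVR(kernel="rbf", C=1.0, gamma="scale")
        svr.fit(class_semantics, class_prototypes[:, d])
        regressors.append(svr)
    return regressors
```

The sketch omits the WGAN training loop (critic iterations, weight constraint or gradient penalty), whose hyper-parameters the paper says are taken as suggested by the WGAN author.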