Hierarchical Class-Based Curriculum Loss
Authors: Palash Goyal, Divya Choudhary, Shalini Ghosh
IJCAI 2021
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We test our loss function on real world image data sets, and show that it significantly outperforms state-of-the-art baselines. |
| Researcher Affiliation | Industry | 1 Samsung Research America, 2 Amazon Alexa AI; {palash.goyal, d.choudhary}@samsung.com, ghoshsha@amazon.com |
| Pseudocode | Yes | Algorithm 1: Class Selection for Hierarchical Class Based Curriculum Learning. |
| Open Source Code | No | The paper does not provide any statement or link indicating the release of open-source code for the described methodology. |
| Open Datasets | Yes | We evaluate our loss function on four real world image data sets (i) IMCLEF [Dimitrovski et al., 2011], (ii) Wipo [Rousu et al., 2006], (iii) Reuters [Lewis et al., 2004], and (iv) iNaturalist [Van Horn et al., 2018]. |
| Dataset Splits | Yes | We select the hyperparameters of the neural network using evaluation on a validation set with binary cross entropy loss. |
| Hardware Specification | Yes | We performed our experiments on 2 Nvidia GeForce RTX 2080 Ti with 12 GB memory with 3.30 GHz CPU clock speed. |
| Software Dependencies | No | The paper mentions using a multi-layer perceptron and ResNet-18, and Adam optimizer, but does not provide specific version numbers for any software dependencies. |
| Experiment Setup | Yes | For evaluation on iNaturalist, we used a ResNet-18 architecture (pre-trained on ImageNet). We use Adam optimizer and a learning rate of 10^-5. ... Based on this, we get a structure with 800 hidden neurons and a dropout of 0.25. |
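
The pseudocode row above refers to the paper's Algorithm 1 (class selection for hierarchical class-based curriculum learning), whose details are not reproduced in this summary. Below is a minimal sketch of one plausible reading, in which coarser classes from the label hierarchy stay in the curriculum and finer classes are admitted once their per-class loss drops below a threshold; the function name, arguments, and selection rule are illustrative assumptions, not the paper's algorithm.

```python
# Hypothetical sketch of a class-selection step for a hierarchical
# class-based curriculum (NOT the paper's Algorithm 1; names and the
# selection rule are assumptions for illustration only).

def select_classes(class_losses, hierarchy_level, max_level, loss_threshold):
    """Return the set of class ids admitted into the current curriculum stage.

    class_losses    : dict class_id -> current per-class training loss
    hierarchy_level : dict class_id -> depth of the class in the label tree
                      (0 = root / coarsest)
    max_level       : deepest hierarchy level unlocked at this stage
    loss_threshold  : admit a class at the deepest level only once its
                      loss falls below this value
    """
    selected = set()
    for cls, loss in class_losses.items():
        level = hierarchy_level[cls]
        if level < max_level:
            # coarser classes from earlier stages stay in the curriculum
            selected.add(cls)
        elif level == max_level and loss <= loss_threshold:
            # admit a finer class once the model handles it well enough
            selected.add(cls)
    return selected
```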
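
For the experiment-setup row, the following is a minimal PyTorch sketch of the reported configuration: an ImageNet-pretrained ResNet-18 trained with Adam at a learning rate of 10^-5 for iNaturalist, and an MLP with 800 hidden neurons and dropout 0.25 for the other datasets. The input/output dimensions and the use of a binary cross-entropy criterion as the training objective here are assumptions filled in for illustration (the paper only reports binary cross-entropy for validation-set model selection).

```python
# Minimal sketch of the reported setup; num_classes and feature_dim are
# hypothetical placeholders, and BCEWithLogitsLoss is an assumed stand-in
# criterion, not necessarily the paper's training loss.
import torch
import torch.nn as nn
from torchvision import models

num_classes = 1000      # hypothetical: set to the dataset's label count
feature_dim = 80        # hypothetical: input feature size for the MLP datasets

# iNaturalist branch: ImageNet-pretrained ResNet-18 with a new classification head
resnet = models.resnet18(pretrained=True)
resnet.fc = nn.Linear(resnet.fc.in_features, num_classes)
optimizer = torch.optim.Adam(resnet.parameters(), lr=1e-5)

# MLP branch for the feature-based datasets: 800 hidden neurons, dropout 0.25
mlp = nn.Sequential(
    nn.Linear(feature_dim, 800),
    nn.ReLU(),
    nn.Dropout(p=0.25),
    nn.Linear(800, num_classes),
)

# Hyperparameters were selected on a validation set with binary cross-entropy,
# so a multi-label BCE criterion is used here as a reasonable stand-in.
criterion = nn.BCEWithLogitsLoss()
```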