Weak-shot Semantic Segmentation via Dual Similarity Transfer
Authors: Junjie Chen, Li Niu, Siyuan Zhou, Jianlou Si, Chen Qian, Liqing Zhang
NeurIPS 2022 | Conference PDF | Archive PDF | Plain Text | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Comprehensive experiments on the challenging COCO-Stuff-10K and ADE20K datasets demonstrate the effectiveness of our method. |
| Researcher Affiliation | Collaboration | Junjie Chen1, Li Niu1, Siyuan Zhou1, Jianlou Si2, Chen Qian2, Liqing Zhang1 1The MoE Key Lab of AI, CSE Department, Shanghai Jiao Tong University 2SenseTime Research, SenseTime |
| Pseudocode | No | The paper does not contain structured pseudocode or algorithm blocks. |
| Open Source Code | Yes | Codes are available at https://github.com/bcmi/SimFormer-Weak-Shot-Semantic-Segmentation. |
| Open Datasets | Yes | Comprehensive experiments on the challenging COCO-Stuff-10K and ADE20K datasets demonstrate the effectiveness of our method. [...] COCO-Stuff10K [3] contains 9k training images and 1k test images, covering 171 semantic classes. ADE20K [50] has 20k training images and 2k validating images, covering 150 semantic classes. |
| Dataset Splits | Yes | COCO-Stuff10K [3] contains 9k training images and 1k test images, covering 171 semantic classes. ADE20K [50] has 20k training images and 2k validating images, covering 150 semantic classes. |
| Hardware Specification | No | The paper does not provide specific hardware details (exact GPU/CPU models, processor types with speeds, memory amounts, or detailed computer specifications) used for running its experiments. |
| Software Dependencies | No | The paper does not provide specific ancillary software details with version numbers (e.g., library or solver names with version numbers like Python 3.8, CPLEX 12.4) needed to replicate the experiment. |
| Experiment Setup | Yes | We use α (0.1 by default) for balancing the distillation loss and β (0.2 by default) for balancing the complementary loss. |
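
The Experiment Setup row above specifies only the two balancing weights. As a rough illustration of how such weights are typically applied, the sketch below combines a base segmentation loss with a distillation loss and a complementary loss. The additive form and the placeholder loss values are assumptions made for illustration; only α = 0.1 and β = 0.2 come from the quoted setup.

```python
import torch

# Hedged sketch: the additive combination and the placeholder loss values are
# assumptions; only alpha = 0.1 and beta = 0.2 are taken from the paper's
# quoted experiment setup.
alpha = 0.1  # weight balancing the distillation loss (paper default)
beta = 0.2   # weight balancing the complementary loss (paper default)

# Placeholder scalar losses standing in for the actual training losses.
segmentation_loss = torch.tensor(1.25)
distillation_loss = torch.tensor(0.80)
complementary_loss = torch.tensor(0.55)

total_loss = segmentation_loss + alpha * distillation_loss + beta * complementary_loss
print(f"total loss: {total_loss.item():.3f}")  # this scalar would be backpropagated during training
```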