Weakly-Supervised Mirror Detection via Scribble Annotations
Authors: Mingfeng Zha, Yunqiang Pei, Guoqing Wang, Tianyu Li, Yang Yang, Wenbin Qian, Heng Tao Shen
AAAI 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Extensive experiments on three mirror datasets show that our network outperforms relevant state-of-the-art methods on all evaluation metrics and achieves performance comparable to fully supervised approaches. |
| Researcher Affiliation | Academia | Mingfeng Zha1, Yunqiang Pei1, Guoqing Wang1*, Tianyu Li1, Yang Yang1, Wenbin Qian2, Heng Tao Shen1 1University of Electronic Science and Technology of China 2Jiangxi Agricultural University |
| Pseudocode | No | The paper describes the proposed modules and their operations using mathematical formulas and text, but it does not include any pseudocode or clearly labeled algorithm blocks. |
| Open Source Code | Yes | The dataset and codes are available at https://github.com/winter-flow/WSMD. |
| Open Datasets | Yes | We collect training images from MSD, PMD, and Mirror-RGBD datasets, totaling 10,158 images, and relabel them as the training set of S-Mirror dataset. Models are evaluated using the testing sets of the above three datasets. |
| Dataset Splits | No | The paper specifies training and testing sets, but does not explicitly mention a separate validation set or its split details. |
| Hardware Specification | Yes | We implement our network using PyTorch and conduct experiments on an A100 GPU. |
| Software Dependencies | No | The paper only mentions "PyTorch" without specifying its version or any other software dependencies with their respective version numbers. |
| Experiment Setup | Yes | All images are resized to 352 × 352. During the training phase, the batch size is 16, the initial learning rate is 1e-4, the decay rate is 0.9, Adam is used as the optimizer, and training runs for 150 epochs. |
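The reported hyperparameters can be collected into a minimal sketch. This is an illustrative reconstruction, not the authors' released code: the interpretation of "decay rate 0.9" as the power of a polynomial learning-rate schedule is an assumption (poly decay with power 0.9 is a common choice in segmentation-style training), and the `poly_lr` helper is hypothetical.

```python
# Hyperparameters as reported in the paper's experiment setup.
IMG_SIZE = 352      # all images resized to 352 x 352
BATCH_SIZE = 16
INIT_LR = 1e-4      # initial learning rate for the Adam optimizer
DECAY_RATE = 0.9    # assumed here to be the power of a poly schedule
EPOCHS = 150

def poly_lr(epoch: int) -> float:
    """Polynomial decay schedule; assumes 'decay rate 0.9' is the poly power."""
    return INIT_LR * (1 - epoch / EPOCHS) ** DECAY_RATE

# Learning rate at the start, midway, and near the end of training.
print(f"{poly_lr(0):.2e}")    # 1.00e-04
print(f"{poly_lr(75):.2e}")   # 5.36e-05
print(f"{poly_lr(149):.2e}")  # 1.10e-06
```

Under this reading, the learning rate decays smoothly from 1e-4 to near zero over the 150 epochs; if the paper instead means a per-step multiplicative decay of 0.9, the schedule would differ.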