Robust Deep Co-Saliency Detection with Group Semantic

Authors: Chong Wang, Zheng-Jun Zha, Dong Liu, Hongtao Xie (pp. 8917-8924)

AAAI 2019

Reproducibility Variable Result LLM Response
Research Type Experimental Extensive experimental results on COCO-SEG and a widely used benchmark Cosal2015 have demonstrated the superiority of the proposed approach as compared to the state-of-the-art methods.
Researcher Affiliation Academia Institute of Intelligent Machines, Chinese Academy of Sciences; University of Science and Technology of China
Pseudocode No The paper describes its approach conceptually and through figures but does not provide pseudocode or a formal algorithm block.
Open Source Code No The paper does not provide any explicit statement about making its source code publicly available or a link to a code repository.
Open Datasets Yes we construct a new large-scale dataset, i.e., COCO-SEG, which is selected from the COCO2017 dataset (Lin et al. 2014)
Dataset Splits No The resultant COCO-SEG dataset contains 200K images belonging to 78 groups for training and 8K images of 78 groups for testing. All models are trained on the COCO-SEG training set and evaluated on the COCO-SEG validation set and Cosal2015; however, the size and exact composition of the validation split are not specified.
Hardware Specification No The paper does not provide specific hardware details (e.g., GPU/CPU models, memory amounts) used for running its experiments.
Software Dependencies No The paper mentions VGG19 and Adam Algorithm but does not provide specific version numbers for software dependencies or libraries.
Experiment Setup Yes All images and ground-truth maps are resized to 224 × 224. The proposed models are optimized by the Adam algorithm (Kingma and Ba 2014), in which the exponential decay rates for the first and second moment estimates are set to 0.9 and 0.999, respectively. The learning rate starts from 1e-5 and is halved every 10,000 steps until the model converges at about 50,000 steps.
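The reported training schedule can be sketched as a simple step-decay function. This is a minimal illustration of the hyperparameters quoted above (base rate 1e-5 halved every 10,000 steps, Adam betas 0.9/0.999); the function and dictionary names are hypothetical, since the paper provides no code.

```python
def learning_rate(step, base_lr=1e-5, decay_steps=10000, decay_rate=0.5):
    """Step-decay schedule described in the paper: the learning rate
    starts at 1e-5 and is halved every 10,000 steps."""
    return base_lr * decay_rate ** (step // decay_steps)

# Adam exponential decay rates for the first and second moment estimates,
# as reported in the paper (hypothetical config dict for illustration).
adam_config = {"beta1": 0.9, "beta2": 0.999}

print(learning_rate(0))      # initial rate, 1e-5
print(learning_rate(10000))  # halved once
print(learning_rate(45000))  # halved four times, near the ~50,000-step convergence point
```

Under this schedule the rate has been halved four times by step 40,000, which is consistent with the paper's note that convergence occurs at roughly 50,000 steps.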