Hybrid Mamba for Few-Shot Segmentation
Authors: Qianxiong Xu, Xuanyi Liu, Lanyun Zhu, Guosheng Lin, Cheng Long, Ziyue Li, Rui Zhao
NeurIPS 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Extensive experiments have been conducted on two public benchmarks, showing the superiority of HMNet. ... 5 Experiments |
| Researcher Affiliation | Collaboration | Qianxiong Xu1, Xuanyi Liu2, Lanyun Zhu3, Guosheng Lin1, Cheng Long1, Ziyue Li4, Rui Zhao5 1S-Lab, Nanyang Technological University 2Peking University 3Singapore University of Technology and Design 4University of Cologne 5SenseTime Research |
| Pseudocode | No | The paper describes its methods using equations and diagrams (e.g., Figure 2, 3, 6) but does not include explicit pseudocode or algorithm blocks. |
| Open Source Code | Yes | The code is available at https://github.com/Sam1224/HMNet. |
| Open Datasets | Yes | The methods are evaluated on two benchmark datasets, including PASCAL-5i [34] and COCO-20i [28]. |
| Dataset Splits | Yes | Both of them are evenly split into four folds based on the classes, and each fold consists of 5 and 20 classes for PASCAL-5i and COCO-20i, respectively. Then, cross validations are carried out, with each fold being taken as the test set once, while the union of the other folds is adopted for training. |
| Hardware Specification | Yes | We enable DDP for model training, e.g., use 4 and 8 NVIDIA V100 GPUs for two datasets. |
| Software Dependencies | No | The paper mentions using 'AdamW' and 'SGD' optimizers but does not specify version numbers for any software dependencies like programming languages or libraries. |
| Experiment Setup | Yes | We use AdamW to optimize Mamba-related parameters [5], and SGD to optimize the remaining parameters (e.g., decoder), with their learning rates initialized as 6e-5 and 5e-3. ... the model is trained for 300 epochs on PASCAL-5i, and 75 epochs on COCO-20i, with batch size set as 8 and 16, respectively. ... all images are randomly cropped to 473×473 and 633×633 for PASCAL-5i and COCO-20i, ... We employ 8 Mamba blocks (i.e., 4 self and hybrid Mamba pairs), and set the hidden dimension as 256. |
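The Experiment Setup row describes a two-optimizer scheme: AdamW (lr 6e-5) for Mamba-related parameters and SGD (lr 5e-3) for the rest. A minimal sketch of how such a split might be wired is shown below; the parameter names (`mamba_blocks`, `decoder`, etc.) and the keyword-based partition are illustrative assumptions, not taken from the released HMNet code.

```python
# Sketch of the two-optimizer setup described in the table. The partition
# rule (substring match on parameter names) is an assumption for illustration.

def split_params(named_params, mamba_keyword="mamba"):
    """Partition (name, param) pairs into Mamba-related and remaining params."""
    mamba, rest = [], []
    for name, p in named_params:
        (mamba if mamba_keyword in name else rest).append(p)
    return mamba, rest

# Toy stand-in for model.named_parameters(); strings stand in for tensors.
named = [
    ("backbone.layer1.weight", "p0"),
    ("mamba_blocks.0.A_log", "p1"),
    ("mamba_blocks.1.A_log", "p2"),
    ("decoder.conv.weight", "p3"),
]
mamba_params, other_params = split_params(named)

# With real tensors, the paper's setup would then become:
#   torch.optim.AdamW(mamba_params, lr=6e-5)
#   torch.optim.SGD(other_params, lr=5e-3)
print(len(mamba_params), len(other_params))  # 2 2
```

Keeping two optimizer instances (rather than one optimizer with parameter groups) lets the Mamba blocks and the decoder use entirely different update rules, which matches the AdamW/SGD pairing quoted above.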