AMSP-UOD: When Vortex Convolution and Stochastic Perturbation Meet Underwater Object Detection
Authors: Jingchun Zhou, Zongxin He, Kin-Man Lam, Yudong Wang, Weishi Zhang, Chunle Guo, Chongyi Li
AAAI 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Extensive experiments on the URPC and RUOD datasets demonstrate that our method outperforms existing state-of-the-art methods in terms of accuracy and noise immunity. |
| Researcher Affiliation | Academia | (1) School of Information Science and Technology, Dalian Maritime University; (2) School of Computer Science and Engineering, Huizhou University; (3) Department of Electrical and Electronic Engineering, Hong Kong Polytechnic University; (4) School of Electrical and Information Engineering, Tianjin University, China; (5) VCIP, CS, Nankai University |
| Pseudocode | No | The paper describes network architectures and processes using diagrams and mathematical equations, but it does not include formal pseudocode or algorithm blocks. |
| Open Source Code | Yes | Our code is available at: https://github.com/zhoujingchun03/AMSP-UOD. |
| Open Datasets | Yes | To showcase the generalizability of our network, we trained it on the URPC (Zhanjiang) dataset (Liu et al. 2021), from the 2020 National Underwater Robotics Professional Competition, and on the extensive RUOD dataset (Fu et al. 2023). |
| Dataset Splits | Yes | The URPC dataset contains 5,543 training images across five categories, with 1,200 images from its B-list answers serving as the test set. The RUOD dataset (Fu et al. 2023) contains various underwater scenarios and consists of 10 categories. It includes 9,800 training images and 4,200 test images. |
| Hardware Specification | Yes | Our experiments run on an Intel Xeon E5-2650 v4 @ 2.20 GHz CPU and an Nvidia Tesla V100-PCIE-16GB GPU. |
| Software Dependencies | Yes | The experiments use the Ubuntu 20.04 LTS operating system and a Python 3.10 environment built on Anaconda, with the network implemented in PyTorch 2.0.1. |
| Experiment Setup | Yes | The hyperparameters are shown in Table 1: image size 640, weights None, batch size 16, seed 0, optimizer SGD, learning rate 0.01, epochs 300, early stopping enabled. A configuration sketch based on these settings follows the table. |
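
For context, the Table 1 settings can be expressed as a small configuration sketch like the one below. This is a hypothetical illustration only: the dictionary keys, the placeholder model, and the use of `torch.optim.SGD` are assumptions for readability, not the authors' training script; the official implementation is at https://github.com/zhoujingchun03/AMSP-UOD.

```python
# Hypothetical sketch of the training configuration reported in Table 1.
# Key names and the placeholder model are assumptions; only the values
# (image size, batch size, seed, optimizer, LR, epochs) come from the paper.
import torch
import torch.nn as nn

hyperparams = {
    "img_size": 640,      # input image size
    "weights": None,      # no pretrained weights
    "batch_size": 16,
    "seed": 0,
    "optimizer": "SGD",
    "lr": 0.01,
    "epochs": 300,
    "early_stop": True,
}

# Fix the random seed as reported (seed = 0).
torch.manual_seed(hyperparams["seed"])

# Placeholder module standing in for the AMSP-UOD detector.
model = nn.Linear(10, 5)
optimizer = torch.optim.SGD(model.parameters(), lr=hyperparams["lr"])
```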