IRPruneDet: Efficient Infrared Small Target Detection via Wavelet Structure-Regularized Soft Channel Pruning

Authors: Mingjin Zhang, Handi Yang, Jie Guo, Yunsong Li, Xinbo Gao, Jing Zhang

AAAI 2024 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Through extensive experiments on two widely-used benchmarks, our IRPruneDet method surpasses established techniques in both model complexity and accuracy.
Researcher Affiliation | Academia | Mingjin Zhang¹, Handi Yang¹*, Jie Guo¹, Yunsong Li¹, Xinbo Gao¹, Jing Zhang²* (¹Xidian University, ²The University of Sydney)
Pseudocode | No | The paper does not contain any structured pseudocode or algorithm blocks clearly labeled as such.
Open Source Code | Yes | The code is available at https://github.com/hd0013/IRPruneDet.
Open Datasets | Yes | We adopt the NUAA-SIRST (Dai et al. 2021a) and IRSTD-1k (Zhang et al. 2022c) datasets for evaluation.
Dataset Splits | Yes | For each dataset, we divide IR images into three disjoint subsets: 50% for training, 30% for validation, and 20% for testing.
Hardware Specification | No | The paper does not provide specific hardware details (e.g., GPU/CPU models, memory amounts) used for running its experiments. It only mentions general terms like 'platforms with limited resources' or 'general-purpose hardware' in a theoretical context.
Software Dependencies | No | The paper mentions 'AdaGrad as the optimizer' but does not provide specific version numbers for any software dependencies or libraries used for the experiments.
Experiment Setup | Yes | We resize each IR image in the NUAA-SIRST and IRSTD-1k datasets to 512 × 512... For the pruning and training process, we utilize AdaGrad as the optimizer with a learning rate of 0.01. The training process lasts for 500 epochs with a weight decay of 10⁻⁴ and a batch size of 16. By default, we set β_SCR to 0.5π and β_0 to 1. We apply IRPruneDet only to a U-Net18 baseline model.
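
The 50/30/20 split reported in the Dataset Splits row could be reproduced along the lines of the minimal sketch below; the file layout, glob pattern, random seed, and function name are illustrative assumptions, not details taken from the paper or the released code.

```python
# Hypothetical sketch of a disjoint 50/30/20 train/val/test split of IR images.
# Directory layout, file extension, and seed are assumptions for illustration.
import random
from pathlib import Path

def split_dataset(image_dir: str, seed: int = 0):
    """Shuffle image paths and split them 50% / 30% / 20%."""
    paths = sorted(Path(image_dir).glob("*.png"))
    rng = random.Random(seed)
    rng.shuffle(paths)

    n = len(paths)
    n_train = int(0.5 * n)
    n_val = int(0.3 * n)

    train = paths[:n_train]
    val = paths[n_train:n_train + n_val]
    test = paths[n_train + n_val:]
    return train, val, test
```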
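
Similarly, the hyperparameters quoted in the Experiment Setup row (AdaGrad, learning rate 0.01, weight decay 10⁻⁴, batch size 16, 500 epochs, 512 × 512 inputs) map onto a standard PyTorch training loop roughly as sketched below. The tiny stand-in model and random tensors are placeholders for the authors' U-Net18 baseline and the SIRST-style data pipeline, which are not reproduced here; the binary cross-entropy loss is likewise an assumption.

```python
# Minimal sketch of the reported optimization setup; model, data, and loss
# are placeholders, not the authors' implementation.
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

# Stand-in for the U-Net18 baseline: a single conv producing a 1-channel mask.
model = nn.Conv2d(1, 1, kernel_size=3, padding=1)

# Stand-in for resized 512 x 512 IR images and their ground-truth masks.
images = torch.randn(32, 1, 512, 512)
masks = torch.randint(0, 2, (32, 1, 512, 512)).float()
loader = DataLoader(TensorDataset(images, masks), batch_size=16, shuffle=True)

# AdaGrad with the quoted learning rate and weight decay.
optimizer = torch.optim.Adagrad(model.parameters(), lr=0.01, weight_decay=1e-4)
criterion = nn.BCEWithLogitsLoss()  # assumed segmentation loss

for epoch in range(500):  # training reportedly lasts 500 epochs
    for x, y in loader:
        optimizer.zero_grad()
        loss = criterion(model(x), y)
        loss.backward()
        optimizer.step()
```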