Implicit Prompt Learning for Image Denoising

Authors: Yao Lu, Bo Jiang, Guangming Lu, Bob Zhang

IJCAI 2024 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Experiments on multiple benchmarks showed that the proposed IPLID achieves competitive results with only 1% of the pre-trained backbone parameters, outperforming classical denoising methods in both efficiency and quality of restored images.
Researcher Affiliation | Academia | (1) Department of Computer Science, Harbin Institute of Technology at Shenzhen, China; (2) College of Mechanical and Electronic Engineering, Northwest A&F University, China; (3) Department of Computer and Information Science, University of Macau, China
Pseudocode | No | The paper presents figures of block structures and mathematical equations, but it includes no section or figure explicitly labeled 'Pseudocode' or 'Algorithm'.
Open Source Code | No | The paper does not provide an explicit statement about releasing source code, nor a link to a code repository.
Open Datasets | Yes | Table 1 presents the denoising results of the proposed IPLID on real noisy images from the SIDD [Abdelhamed et al., 2018], PolyU [Xu et al., 2018], and Nam [Nam et al., 2016] datasets.
Dataset Splits | No | The paper mentions using the SIDD, PolyU, and Nam datasets and refers to DIV2K and BSD400 for fine-tuning, but it does not provide explicit training/validation/test splits (e.g., percentages or sample counts) for the experiments conducted.
Hardware Specification | Yes | "To ensure equitable comparisons of efficiency, we utilize FLOPs, inference time, and trainable parameters as metrics. In particular, we perform the comparisons on identical computer equipment (i.e., an Nvidia RTX Titan GPU) for efficiency." (A minimal measurement sketch follows the table.)
Software Dependencies | No | The paper mentions using the Adam optimizer but does not provide version numbers for software dependencies or libraries (e.g., PyTorch, TensorFlow, or CUDA versions).
Experiment Setup | Yes | "We use the Adam [Loshchilov and Hutter, 2017] optimizer to train our IPLID, setting β1 and β2 to 0.9 and 0.999, respectively. The learning rate is set to 1 × 10^-4 in training." (An optimizer sketch follows below.)
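
The efficiency metrics quoted in the Hardware Specification row (trainable parameters, GPU inference time) correspond to standard framework calls. The sketch below is a minimal illustration assuming PyTorch and a hypothetical stand-in model, since the IPLID code is not released; FLOPs would additionally require a profiler such as fvcore's FlopCountAnalysis or thop.

    import torch

    def count_trainable_params(model: torch.nn.Module) -> int:
        # Trainable parameters only (requires_grad=True), the quantity
        # typically reported in efficiency tables.
        return sum(p.numel() for p in model.parameters() if p.requires_grad)

    @torch.no_grad()
    def gpu_inference_time_ms(model, x, warmup=10, runs=50):
        # CUDA kernels launch asynchronously, so CUDA events plus an explicit
        # synchronize are needed for an honest wall-clock measurement.
        model.eval()
        for _ in range(warmup):
            model(x)
        start = torch.cuda.Event(enable_timing=True)
        end = torch.cuda.Event(enable_timing=True)
        start.record()
        for _ in range(runs):
            model(x)
        end.record()
        torch.cuda.synchronize()
        return start.elapsed_time(end) / runs  # average ms per forward pass

    # Hypothetical usage with a stand-in denoiser and a 256x256 RGB input.
    model = torch.nn.Conv2d(3, 3, kernel_size=3, padding=1).cuda()
    x = torch.randn(1, 3, 256, 256, device="cuda")
    print(count_trainable_params(model), gpu_inference_time_ms(model, x))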
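
Likewise, the experiment setup quoted in the last row reduces to a one-line optimizer configuration. This is a hedged sketch with a hypothetical placeholder model, not the authors' implementation:

    import torch

    # Hypothetical stand-in for IPLID, whose implementation is not public.
    model = torch.nn.Conv2d(3, 3, kernel_size=3, padding=1)

    # Adam with beta1 = 0.9, beta2 = 0.999 and learning rate 1e-4, matching
    # the hyperparameters quoted from the paper.
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4, betas=(0.9, 0.999))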