Incremental Image De-raining via Associative Memory

Authors: Yi Gu, Chao Wang, Jie Li

AAAI 2023

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Experiments demonstrate that our method can achieve better performance than existing approaches on both inhomogeneous and incremental datasets.
Researcher Affiliation | Collaboration | 1 Alibaba Cloud Computing Ltd.; 2 Department of Computer Science and Engineering, Shanghai Jiao Tong University. Emails: luoyi.gy@alibaba-inc.com, lijiecs@sjtu.edu.cn
Pseudocode | No | The paper describes its method textually and with mathematical formulations but does not include any explicitly labeled pseudocode or algorithm blocks. (A generic illustrative sketch follows the table.)
Open Source Code | No | The paper does not contain any statement or link indicating that the source code for the described methodology is publicly available.
Open Datasets | Yes | We evaluate all incremental de-raining methods on four benchmark datasets: Rain100L (Yang et al. 2017), Rain100H (Yang et al. 2017), Rain800 (Zhang, Sindagi, and Patel 2019) and Rain1400 (Fu et al. 2017c). (See the sequential-task loading sketch after the table.)
Dataset Splits | No | The paper states, 'Following the previous work, we partition training and testing samples of each dataset according to the existing split.' However, it does not specify a validation split (percentages, counts, or an explicit reference to where such a split is defined if it is not standard). (See the validation-split sketch after the table.)
Hardware Specification | Yes | All experiments are conducted on NVIDIA Tesla V100 GPUs.
Software Dependencies | No | The paper does not name specific software dependencies with version numbers, such as the programming language, libraries, or frameworks used.
Experiment Setup | No | The paper mentions keeping 'all training techniques and parameters setting consistent with implementations in original papers for a fair comparison,' but it does not list concrete hyperparameter values or training configurations in its own text. (A hypothetical configuration template follows the table.)
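
Because the paper ships no pseudocode (Pseudocode row), the following is a minimal, generic key-value associative-memory read in PyTorch, sketched only to illustrate the kind of module the title refers to. The class name, slot count, and dimensions are all hypothetical; none of this is taken from the paper's actual architecture.

```python
# Generic key-value associative memory: an illustrative sketch,
# NOT the paper's method. All names and shapes are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class AssociativeMemory(nn.Module):
    """Learnable key-value memory: queries softly attend over stored slots."""
    def __init__(self, num_slots: int = 64, dim: int = 128):
        super().__init__()
        self.keys = nn.Parameter(torch.randn(num_slots, dim))    # addressing keys
        self.values = nn.Parameter(torch.randn(num_slots, dim))  # stored content

    def forward(self, query: torch.Tensor) -> torch.Tensor:
        # query: (batch, dim); scaled similarity against every memory key
        attn = F.softmax(query @ self.keys.t() / self.keys.size(1) ** 0.5, dim=-1)
        return attn @ self.values  # (batch, dim) soft read-out

memory = AssociativeMemory()
readout = memory(torch.randn(4, 128))  # e.g. features from a de-raining backbone
print(readout.shape)                   # torch.Size([4, 128])
```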
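Since the paper evaluates incremental de-raining across the four open benchmarks (Open Datasets row), a reproducer would present them to the model one at a time. The sketch below assumes a parallel rainy/clean folder layout and a task order; both are assumptions, not details given in the paper.

```python
# Hypothetical sequential-task setup for the four named benchmarks.
# Directory layout, class name, and task order are assumptions.
import os
from PIL import Image
from torch.utils.data import Dataset

class PairedRainDataset(Dataset):
    """Loads (rainy, clean) image pairs from parallel folders."""
    def __init__(self, root: str):
        self.rainy_dir = os.path.join(root, "rainy")
        self.clean_dir = os.path.join(root, "clean")
        self.names = sorted(os.listdir(self.rainy_dir))

    def __len__(self):
        return len(self.names)

    def __getitem__(self, i):
        name = self.names[i]
        rainy = Image.open(os.path.join(self.rainy_dir, name)).convert("RGB")
        clean = Image.open(os.path.join(self.clean_dir, name)).convert("RGB")
        return rainy, clean

# Incremental setting: the model sees one benchmark (task) at a time.
TASK_ORDER = ["Rain100L", "Rain100H", "Rain800", "Rain1400"]  # assumed order
tasks = [PairedRainDataset(os.path.join("data", t)) for t in TASK_ORDER]
for task_id, task in enumerate(tasks):
    print(f"task {task_id}: {TASK_ORDER[task_id]}, {len(task)} pairs")
    # train_on(task)  # one incremental step per benchmark (hypothetical helper)
```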
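The paper defines only train/test partitions (Dataset Splits row), so any validation split is a reproducer's own choice. A common, purely hypothetical option is to hold out a fixed fraction of the training pairs with a seeded split:

```python
# The paper does not define a validation split; the 10% fraction and
# seed below are placeholder choices, not values from the paper.
import torch
from torch.utils.data import random_split

def make_splits(train_set, val_fraction: float = 0.1, seed: int = 0):
    n_val = int(len(train_set) * val_fraction)
    n_train = len(train_set) - n_val
    generator = torch.Generator().manual_seed(seed)  # fixed seed for reproducibility
    return random_split(train_set, [n_train, n_val], generator=generator)

# train_subset, val_subset = make_splits(tasks[0])
```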
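Finally, because the paper defers hyperparameters to the baselines' original implementations (Experiment Setup row), reproducing it requires filling in a configuration like the template below. Every value is a placeholder guess and explicitly not taken from the paper; only the V100 hardware is sourced from the text.

```python
# Hypothetical experiment-configuration template. All values are
# placeholders to be replaced from the baselines' original code.
config = {
    "optimizer": "Adam",        # placeholder
    "learning_rate": 1e-4,      # placeholder
    "batch_size": 16,           # placeholder
    "patch_size": 128,          # placeholder crop size
    "epochs_per_task": 100,     # placeholder
    "task_order": ["Rain100L", "Rain100H", "Rain800", "Rain1400"],
    "device": "cuda",           # paper reports NVIDIA Tesla V100 GPUs
}
```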