Transition-constant Normalization for Image Enhancement

Authors: Jie Huang, Man Zhou, Jinghao Zhang, Gang Yang, Mingde Yao, Chongyi Li, Zhiwei Xiong, Feng Zhao

NeurIPS 2023

Reproducibility variables, results, and LLM responses:

Research Type: Experimental
    "Through extensive experiments on multiple image enhancement tasks, like low-light enhancement, exposure correction, SDR2HDR translation, and image dehazing, our TCN consistently demonstrates performance improvements."

Researcher Affiliation: Academia
    Jie Huang (1), Man Zhou (1,2), Jinghao Zhang (1), Gang Yang (1), Mingde Yao (1), Chongyi Li (3), Zhiwei Xiong (1), Feng Zhao (1). (1) University of Science and Technology of China, China; (2) Nanyang Technological University, Singapore; (3) Nankai University, China.

Pseudocode: No
    The paper presents operational descriptions with mathematical equations and diagrams (Fig. 2, Fig. 3) but does not include a dedicated pseudocode or algorithm block.

Open Source Code: Yes
    The code is available at https://github.com/huangkevinj/TCNorm.

Open Datasets: Yes
    "Following previous works [60, 61], we employ three widely used datasets for evaluation, including LOL dataset [7], Huawei dataset [60] and MIT-FiveK dataset [59]."

Dataset Splits: No
    The paper states that training uses 1,000 samples from the MIT-FiveK dataset and testing uses 100 samples from the same dataset, but it does not define a separate validation set or its size, for this dataset or the others, as needed for reproducibility.

Hardware Specification: No
    The acknowledgements mention "the support of GPU cluster built by MCC Lab of Information Science and Technology Institution, USTC," but no specific GPU models, CPU types, or other hardware details used for the experiments are given in the paper.

Software Dependencies: No
    The paper does not pin software versions for its dependencies (e.g., a specific Python or PyTorch release) in the main text.

Experiment Setup: Yes
    "We train all baselines and their integrated formats following the original settings, and our TCN-Net until it converges. More implementation details are provided in the supplementary."
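The missing validation split flagged in the Dataset Splits row can be worked around on the user side by fixing a deterministic split before training. A minimal sketch, assuming a FiveK-style pool of 1,000 indexed samples (the fractions, seed, and helper name are illustrative assumptions, not taken from the paper):

```python
import random

def split_dataset(indices, train_frac=0.8, val_frac=0.1, seed=0):
    """Deterministically shuffle and split a list of sample indices
    into disjoint train / validation / test subsets."""
    rng = random.Random(seed)          # fixed seed => reproducible split
    shuffled = indices[:]              # copy so the caller's list is untouched
    rng.shuffle(shuffled)
    n = len(shuffled)
    n_train = int(n * train_frac)
    n_val = int(n * val_frac)
    train = shuffled[:n_train]
    val = shuffled[n_train:n_train + n_val]
    test = shuffled[n_train + n_val:]
    return train, val, test

# Example: split 1,000 sample ids (sizes here follow the 80/10/10 fractions above)
train, val, test = split_dataset(list(range(1000)))
print(len(train), len(val), len(test))  # 800 100 100
```

Recording the seed alongside the split sizes is enough for another group to regenerate exactly the same partition.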