KUNet: Imaging Knowledge-Inspired Single HDR Image Reconstruction
Authors: Hu Wang, Mao Ye, Xiatian Zhu, Shuai Li, Ce Zhu, Xue Li
IJCAI 2022
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Experimental results demonstrate that the proposed KUNet achieves superior performance compared with the state-of-the-art methods. |
| Researcher Affiliation | Academia | (1) School of CSE, University of Electronic Science and Technology of China, Chengdu, China; (2) Surrey Institute for People-Centred Artificial Intelligence, CVSSP, University of Surrey, Guildford, UK; (3) School of Control Science and Engineering, Shandong University, Jinan, China; (4) School of ICE, University of Electronic Science and Technology of China, Chengdu, China; (5) School of ITEE, The University of Queensland, Brisbane, Australia |
| Pseudocode | No | The paper does not contain any structured pseudocode or algorithm blocks. |
| Open Source Code | Yes | The code, dataset and appendix materials are available at https://github.com/wanghu178/KUNet.git. |
| Open Datasets | Yes | The code, dataset and appendix materials are available at https://github.com/wanghu178/KUNet.git. For the image task, following the works in [Chen et al., 2021a; Liu et al., 2021], the NTIRE 2021 dataset is used, which was proposed in the NTIRE 2021 HDR Challenge [Pérez-Pellitero et al., 2021] and selected from the HdM HDR dataset [Froehlich et al., 2014]. |
| Dataset Splits | No | Since the ground truth of the test and validation images is not available, following a similar operation in [Chen et al., 2021a], the original training set is decomposed into two parts for training and testing: 1,416 paired training images and 78 test images. The text does not explicitly state a validation split. |
| Hardware Specification | No | The paper does not provide specific hardware details (e.g., GPU/CPU models, memory specifications) used for running the experiments. |
| Software Dependencies | No | The paper states 'All models are built on the PyTorch framework' but does not provide specific version numbers for PyTorch or any other software dependencies. |
| Experiment Setup | No | The paper states 'Due to space limitations, more details can be obtained from the Appendix.' and does not provide specific hyperparameters or system-level training settings in the main text. |