Notice: The reproducibility variables underlying each score are classified using an automated LLM-based pipeline, validated against a manually labeled dataset. LLM-based classification introduces uncertainty and potential bias; scores should be interpreted as estimates. Full accuracy metrics and methodology are described in [1].
MGQFormer: Mask-Guided Query-Based Transformer for Image Manipulation Localization
Authors: Kunlun Zeng, Ri Cheng, Weimin Tan, Bo Yan
AAAI 2024 | Venue PDF | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | "Extensive experiments on multiple benchmarks show that our method significantly improves over state-of-the-art methods." Experiment Setup, Testing Datasets: "We first pre-train our model with the dataset synthesized by PSCC-Net (Liu et al. 2022). Then we evaluate our model on the CASIA dataset (Dong, Wang, and Tan 2013), Columbia dataset (Hsu and Chang 2006), NIST16 dataset (Guan et al. 2019) and IMD20 dataset (Novozamsky, Mahdian, and Saic 2020)." |
| Researcher Affiliation | Academia | Kunlun Zeng*, Ri Cheng*, Weimin Tan, Bo Yan — School of Computer Science, Shanghai Key Laboratory of Intelligent Information Processing, Fudan University |
| Pseudocode | No | No pseudocode or clearly labeled algorithm block was found in the paper. |
| Open Source Code | No | The paper does not provide any statement or link indicating that the source code for the methodology is openly available. |
| Open Datasets | Yes | Experiment Setup, Testing Datasets: "We first pre-train our model with the dataset synthesized by PSCC-Net (Liu et al. 2022). Then we evaluate our model on the CASIA dataset (Dong, Wang, and Tan 2013), Columbia dataset (Hsu and Chang 2006), NIST16 dataset (Guan et al. 2019) and IMD20 dataset (Novozamsky, Mahdian, and Saic 2020)." |
| Dataset Splits | Yes | As shown in Figure 3, we display the AUC (%) scores on the validation split of the synthesized dataset during training. |
| Hardware Specification | Yes | The MGQFormer is implemented in PyTorch on an NVIDIA GTX 1080 Ti GPU. |
| Software Dependencies | No | The paper mentions 'Pytorch' but does not specify a version number or other software dependencies with specific versions. |
| Experiment Setup | Yes | Implementation Details: The MGQFormer is implemented in PyTorch on an NVIDIA GTX 1080 Ti GPU. All input images are resized to 384 × 384. We use Adam as the optimizer, and the learning rate decays from 2.5e-7 to 1.5e-8 with a batch size of 2. |
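The Experiment Setup row reports only the endpoints of the learning-rate decay (2.5e-7 down to 1.5e-8) without naming the schedule shape. A minimal sketch, assuming a simple linear decay over a hypothetical number of training steps (`total_steps` is an assumption, not from the paper):

```python
def linear_lr(step, total_steps, lr_start=2.5e-7, lr_end=1.5e-8):
    """Linearly interpolate the learning rate between the two endpoints
    reported in the paper. The linear shape and total_steps are
    assumptions; the paper only states the start and end values."""
    frac = min(step / total_steps, 1.0)  # clamp so lr stays at lr_end after decay
    return lr_start + frac * (lr_end - lr_start)

# Endpoints match the reported values: 2.5e-7 at step 0, 1.5e-8 at the end.
start = linear_lr(0, 100_000)
end = linear_lr(100_000, 100_000)
```

Any monotone schedule (e.g. cosine or exponential) with the same endpoints would also be consistent with the quoted text; this sketch only pins down the reported boundary values.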