Shadow Generation with Decomposed Mask Prediction and Attentive Shadow Filling
Authors: Xinhao Tao, Junyan Cao, Yan Hong, Li Niu
AAAI 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Abundant experiments prove that our DMASNet achieves better visual effects and generalizes well to real composite images. Extensive experiments demonstrate that our method achieves better visual effects and generalizes well to real composite images. |
| Researcher Affiliation | Collaboration | ¹Shanghai Jiao Tong University; ²Tiansuan Lab, Ant Group |
| Pseudocode | No | The paper describes the proposed method in text and diagrams, but does not include structured pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not include an unambiguous statement about releasing code or a link to a source-code repository for the described methodology. |
| Open Datasets | Yes | To supplement the DESOBA dataset, we create a large-scale dataset RdSOBA using rendering techniques. SGRNet (Hong, Niu, and Zhang 2022) released the first synthetic dataset DESOBA for real-world scenes. |
| Dataset Splits | No | The paper mentions 'DESOBA dataset has 2792 training tuples and 580 testing tuples' but does not specify a separate validation split, nor does it provide split details for the RdSOBA dataset. |
| Hardware Specification | Yes | We implement our model using PyTorch and train our model on 4×RTX 3090 with batch size being 16. |
| Software Dependencies | No | The paper mentions implementing the model using 'PyTorch' but does not specify a version number for PyTorch or any other software dependencies. |
| Experiment Setup | Yes | All images are resized to 256×256. We implement our model using PyTorch and train our model on 4×RTX 3090 with batch size being 16. We use the Adam optimizer with the learning rate being 0.0001 and β set to (0.5, 0.999). We train on RdSOBA for 50 epochs and DESOBA for 1000 epochs without using data augmentation. |
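
The quoted setup row is concrete enough to reconstruct the training configuration. Below is a minimal PyTorch sketch of that configuration, assuming the hyperparameters as stated in the paper; since the DMASNet code is not released, the `DMASNet` module here is a hypothetical placeholder, and only the image size, optimizer, and batch size come from the paper.

```python
# Hedged sketch of the paper's stated training configuration.
# `DMASNet` is a placeholder: the real architecture is not publicly released.
import torch
from torch import nn, optim
from torchvision import transforms


class DMASNet(nn.Module):
    """Hypothetical stand-in for the unreleased DMASNet architecture."""

    def __init__(self):
        super().__init__()
        # Placeholder body; the actual network decomposes mask prediction
        # and performs attentive shadow filling (per the paper's title).
        self.body = nn.Conv2d(3, 3, kernel_size=3, padding=1)

    def forward(self, x):
        return self.body(x)


# Paper: "All images are resized to 256x256."
preprocess = transforms.Compose([
    transforms.Resize((256, 256)),
    transforms.ToTensor(),
])

model = DMASNet()

# Paper: Adam with learning rate 0.0001 and betas (0.5, 0.999).
optimizer = optim.Adam(model.parameters(), lr=1e-4, betas=(0.5, 0.999))

# Paper: batch size 16 on 4x RTX 3090; epochs: 50 on RdSOBA, 1000 on DESOBA,
# with no data augmentation. The version of PyTorch is unspecified.
batch_size = 16
epochs_rdsoba, epochs_desoba = 50, 1000
```

Note that this sketch only pins down what the setup row makes explicit; the loss functions, learning-rate schedule (if any), and the actual network remain unspecified in the paper's text.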