LLCP: Learning Latent Causal Processes for Reasoning-based Video Question Answer
Authors: Guangyi Chen, Yuke Li, Xiao Liu, Zijian Li, Eman Al Suradi, Donglai Wei, Kun Zhang
ICLR 2024 | Conference PDF | Archive PDF | Plain Text | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We assess the efficacy of LLCP on both synthetic and real-world data, demonstrating comparable performance to supervised methods despite our framework using no paired textual annotations. |
| Researcher Affiliation | Academia | Mohamed bin Zayed University of Artificial Intelligence, Abu Dhabi, UAE; Carnegie Mellon University, Pittsburgh, PA, USA; Boston College, Massachusetts, USA |
| Pseudocode | No | The paper includes figures describing model architectures and data flow, but it does not provide any pseudocode or algorithm blocks. |
| Open Source Code | Yes | The code is available at https://github.com/CHENGY12/LLCP. |
| Open Datasets | Yes | We assess the efficacy of LLCP on both synthetic and real-world data. It consists of SUTD-TrafficQA Xu et al. (2021) for accident attribution and Causal-VidQA Li et al. (2022a) for counterfactual prediction. |
| Dataset Splits | Yes | For simulating counterfactual inference, we generate paired validation data comprising both factual evidence and counterfactual events (which share identical noise). |
| Hardware Specification | Yes | The model is implemented in PyTorch 1.13.1 and trained on a single Tesla V100 GPU. |
| Software Dependencies | Yes | The model is implemented in PyTorch 1.13.1 and trained on a single Tesla V100 GPU. |
| Experiment Setup | Yes | For the loss function, we set β = 0.5 to balance the reconstruction and KLD terms. We apply Adam Kingma & Ba (2014) as the optimizer with an initial learning rate of 1e-4 and a batch size of 16. The learning rate decay is set to 0.5, halving the learning rate every 10 epochs. The total number of epochs is set to 50. (A hedged code sketch of this configuration follows the table.) |
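
The Experiment Setup row maps onto a standard PyTorch optimizer/scheduler configuration. The sketch below is illustrative only: `ToyLatentModel` and the random tensor dataset are hypothetical placeholders, not the authors' architecture or data (the real model is in the linked repository); only β = 0.5, Adam with learning rate 1e-4, batch size 16, halving the learning rate every 10 epochs, and 50 total epochs are taken from the paper.

```python
import torch
from torch import nn, optim
from torch.utils.data import DataLoader, TensorDataset

# Hypothetical stand-in for the LLCP generative model: a toy VAE-style module
# that returns reconstruction and KL-divergence terms. The actual architecture
# is in the authors' repository (https://github.com/CHENGY12/LLCP).
class ToyLatentModel(nn.Module):
    def __init__(self, dim=128, latent=32):
        super().__init__()
        self.enc = nn.Linear(dim, 2 * latent)
        self.dec = nn.Linear(latent, dim)

    def forward(self, x):
        mu, logvar = self.enc(x).chunk(2, dim=-1)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterization
        recon = self.dec(z)
        recon_loss = ((recon - x) ** 2).mean()
        kld = (-0.5 * (1 + logvar - mu.pow(2) - logvar.exp())).mean()
        return recon_loss, kld

model = ToyLatentModel()
# Placeholder data; batch size 16 as reported in the paper.
loader = DataLoader(TensorDataset(torch.randn(256, 128)), batch_size=16, shuffle=True)

beta = 0.5                                            # balances reconstruction vs. KLD
optimizer = optim.Adam(model.parameters(), lr=1e-4)   # Adam with initial lr 1e-4
scheduler = optim.lr_scheduler.StepLR(optimizer, step_size=10, gamma=0.5)  # halve lr every 10 epochs

for epoch in range(50):                               # 50 total epochs
    for (x,) in loader:
        recon_loss, kld = model(x)
        loss = recon_loss + beta * kld                # beta-weighted objective
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    scheduler.step()
```

Substituting the released LLCP model and the SUTD-TrafficQA / Causal-VidQA loaders from the authors' repository in place of the toy components should recover the quoted training regime.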