Multi-Interactive Memory Network for Aspect Based Multimodal Sentiment Analysis
Authors: Nan Xu, Wenji Mao, Guandan Chen
AAAI 2019
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We conduct experiment on this dataset, and the results show that our proposed model outperforms the baseline methods... The experimental results on our constructed dataset show the effectiveness of our proposed model. |
| Researcher Affiliation | Academia | Nan Xu, Wenji Mao, Guandan Chen Institute of Automation, Chinese Academy of Sciences University of Chinese Academy of Sciences, Beijing, China {xunan2015, wenji.mao, chenguandan2014}@ia.ac.cn |
| Pseudocode | No | The paper does not contain any sections explicitly labeled 'Pseudocode' or 'Algorithm', nor are there any structured code-like blocks illustrating a procedure. |
| Open Source Code | Yes | The dataset is available at https://github.com/xunan0812/MIMN. |
| Open Datasets | Yes | Hence, we provide a new publicly available multimodal aspect-level sentiment dataset... The dataset is available at https://github.com/xunan0812/MIMN. |
| Dataset Splits | Yes | We randomly divide this dataset into training set (80%), development set (10%) and test set (10%). (A split sketch follows the table.) |
| Hardware Specification | Yes | It takes about 40 seconds to train it for each epoch with one Titan X GPU. |
| Software Dependencies | No | The paper mentions the 'Jieba Chinese Word segmentation tool', the use of 'SGNS' for word embeddings, and 'Adam' for optimization, but it does not specify version numbers for these or for other software libraries (e.g., Python, PyTorch/TensorFlow versions) that would be needed for a reproducible setup. (A preprocessing sketch follows the table.) |
| Experiment Setup | Yes | We set the max padding length of textual content L to 320, the max padding length of aspect words N to 4... The max padding number of images K is 5. We set the dimension of the LSTM hidden representation Dh to 100, the probability dropout to 0.5, the learning rate to 0.005 and the batch size to 128. (These values are collected in the config sketch after the table.) |
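
The Dataset Splits row reports an 80%/10%/10% random split but the paper does not describe how it was drawn. Below is a minimal sketch of one plausible reading; the function name and the fixed `seed` are illustrative assumptions, not details from the paper.

```python
import random

def split_dataset(samples, train_frac=0.8, dev_frac=0.1, seed=42):
    """Shuffle samples and split into train/dev/test by the reported 80/10/10 ratios."""
    samples = list(samples)
    random.Random(seed).shuffle(samples)  # assumed seed; the paper gives none
    n_train = int(len(samples) * train_frac)
    n_dev = int(len(samples) * dev_frac)
    train = samples[:n_train]
    dev = samples[n_train:n_train + n_dev]
    test = samples[n_train + n_dev:]  # remaining ~10%
    return train, dev, test
```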
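The Software Dependencies row names Jieba segmentation and pretrained SGNS word embeddings without versions or file formats. The sketch below shows one common way to wire these together; the plain-text word2vec file format, the 300-dimensional vector size, and the helper names are assumptions.

```python
import jieba
import numpy as np

def load_sgns_embeddings(path, dim=300):
    """Load word vectors from a whitespace-separated text file (word v1 ... vN)."""
    vectors = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            parts = line.rstrip().split(" ")
            if len(parts) != dim + 1:
                continue  # skip a possible header line or malformed rows
            vectors[parts[0]] = np.asarray(parts[1:], dtype=np.float32)
    return vectors

def segment(text):
    """Tokenize Chinese text with Jieba, as the paper reports doing."""
    return jieba.lcut(text)
```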
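Finally, the hyperparameters quoted in the Experiment Setup row can be collected into a single config for a reimplementation attempt. The paper does not state its deep learning framework, so the PyTorch optimizer line is an assumption for illustration; only the numeric values come from the paper.

```python
import torch

CONFIG = {
    "max_text_len": 320,   # L: max padding length of textual content
    "max_aspect_len": 4,   # N: max padding length of aspect words
    "max_num_images": 5,   # K: max padding number of images
    "hidden_dim": 100,     # Dh: LSTM hidden representation size
    "dropout": 0.5,
    "learning_rate": 0.005,
    "batch_size": 128,
}

def make_optimizer(model):
    """Adam, as reported; all other Adam settings are framework defaults (assumed)."""
    return torch.optim.Adam(model.parameters(), lr=CONFIG["learning_rate"])
```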