Earlier Attention? Aspect-Aware LSTM for Aspect-Based Sentiment Analysis

Authors: Bowen Xing, Lejian Liao, Dandan Song, Jingang Wang, Fuzheng Zhang, Zhongyuan Wang, Heyan Huang

IJCAI 2019

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Experimental results on SemEval-2014 datasets demonstrate the effectiveness of AA-LSTM.
Researcher Affiliation | Collaboration | Bowen Xing1, Lejian Liao1, Dandan Song1, Jingang Wang2, Fuzheng Zhang2, Zhongyuan Wang2 and Heyan Huang1. 1Lab of High Volume Language Information Processing & Cloud Computing, Beijing Lab of Intelligent Information Technology, School of Computer Science & Technology, Beijing Institute of Technology; 2Meituan-Dianping Group.
Pseudocode | No | The paper provides mathematical equations (1-9) for the AA-LSTM network but does not include structured pseudocode or algorithm blocks.
Open Source Code | No | The paper does not provide any concrete access (a link or an explicit release statement) to open-source code for the described methodology.
Open Datasets | Yes | We experiment on SemEval 2014 [Pontiki et al., 2014] task 4 datasets, which consist of laptop and restaurant reviews and are widely used benchmarks in many previous works.
Dataset Splits | Yes | 20% of the training data is used as the development set.
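The reported split (20% of the training data held out for development) can be sketched as follows. This is a minimal illustration, not the authors' code; the function name, seed, and shuffling strategy are assumptions, since the paper does not describe how the split was drawn.

```python
import random

def train_dev_split(samples, dev_ratio=0.2, seed=42):
    """Hold out a fraction of the training data as a development set.

    The paper reports using 20% of the training data for development;
    the shuffle and fixed seed here are illustrative assumptions.
    """
    samples = list(samples)
    random.Random(seed).shuffle(samples)
    n_dev = int(len(samples) * dev_ratio)
    return samples[n_dev:], samples[:n_dev]

# Example: 100 training samples -> 80 train / 20 dev
train, dev = train_dev_split(range(100))
```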
Hardware Specification | No | The paper does not describe the specific hardware (e.g., CPU or GPU models) used to run its experiments.
Software Dependencies | No | The paper mentions TensorFlow, GloVe, and the Adam optimizer, but does not give version numbers for these dependencies, which are needed for a reproducible setup.
Experiment Setup | Yes | All embedding dimensions are set to 300 and the batch size is set to 16. We minimize the loss function to train our models using the Adam optimizer [Diederik and Jimmy, 2014] with the learning rate set to 0.001. To avoid overfitting, we adopt the dropout strategy with p = 0.5, and the coefficient of L2 normalization in the loss function is set to 0.01.
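The hyperparameters reported above can be collected into a configuration sketch. This is an illustrative summary under the paper's stated values, not the authors' training code; the `CONFIG` dict and `l2_penalty` helper are hypothetical names, and the L2 term shown is the standard coefficient-times-sum-of-squares form the paper's description implies.

```python
# Hyperparameters as reported in the paper's experiment setup.
CONFIG = {
    "embedding_dim": 300,    # all embedding dimensions set to 300
    "batch_size": 16,
    "optimizer": "adam",
    "learning_rate": 0.001,
    "dropout": 0.5,          # dropout probability p = 0.5
    "l2_coefficient": 0.01,  # coefficient of the L2 term in the loss
}

def l2_penalty(weights, coeff=CONFIG["l2_coefficient"]):
    """Illustrative L2 regularization term added to the loss:
    coeff * sum(w^2) over the model weights."""
    return coeff * sum(w * w for w in weights)

# Example: weights [1.0, 2.0] give 0.01 * (1 + 4) = 0.05
penalty = l2_penalty([1.0, 2.0])
```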