Adaptive Multi-Compositionality for Recursive Neural Models with Applications to Sentiment Analysis
Authors: Li Dong, Furu Wei, Ming Zhou, Ke Xu
AAAI 2014
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We integrate AdaMC into existing recursive neural models and conduct extensive experiments on the Stanford Sentiment Treebank. The results illustrate that AdaMC significantly outperforms state-of-the-art sentiment classification methods. |
| Researcher Affiliation | Collaboration | State Key Lab of Software Development Environment, Beihang University, Beijing, China; Microsoft Research, Beijing, China |
| Pseudocode | No | The paper does not contain structured pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not include an unambiguous statement about releasing source code for the described methodology or a direct link to a code repository. |
| Open Datasets | Yes | We evaluate the models on the Stanford Sentiment Treebank (http://nlp.stanford.edu/sentiment/treebank.html). This corpus contains the labels of syntactically plausible phrases... |
| Dataset Splits | Yes | We use the standard dataset splits (train: 8,544, dev: 1,101, test: 2,210) in all the experiments. |
| Hardware Specification | No | The paper does not provide specific hardware details (e.g., GPU/CPU models, memory) used for running experiments. |
| Software Dependencies | No | The paper mentions AdaGrad as an optimization algorithm, but does not provide specific software dependencies with version numbers. |
| Experiment Setup | Yes | We use the mini-batch version of AdaGrad in our experiments with a batch size between 20 and 30. We employ f = tanh as the nonlinearity function. To initialize the parameters, we randomly sample values from a uniform distribution U(−ε, +ε), where ε is a small value. |
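The experiment-setup row above can be sketched in code. This is a minimal, hedged illustration only: the paper does not specify the initialization range ε, the learning rate, or parameter shapes, so `EPS_INIT`, `LEARNING_RATE`, `BATCH_SIZE`, and all array shapes below are assumptions for demonstration, not the authors' actual configuration.

```python
# Sketch of the reported setup: uniform init U(-eps, +eps), f = tanh,
# and a mini-batch AdaGrad parameter update. All constants are assumed.
import numpy as np

rng = np.random.default_rng(0)
EPS_INIT = 0.01       # assumed "small value" for the init range
LEARNING_RATE = 0.05  # assumed; not reported in the paper
BATCH_SIZE = 25       # paper reports a batch size between 20 and 30

def init_params(shape, eps=EPS_INIT):
    """Sample parameters from a uniform distribution U(-eps, +eps)."""
    return rng.uniform(-eps, eps, size=shape)

def adagrad_update(param, grad, hist, lr=LEARNING_RATE, delta=1e-8):
    """One AdaGrad step: per-coordinate lr scaled by accumulated
    squared gradients (hist); delta avoids division by zero."""
    hist += grad ** 2
    param -= lr * grad / (np.sqrt(hist) + delta)
    return param, hist

# Toy usage: one composition with f = tanh and one update step.
W = init_params((5, 10))
hist = np.zeros_like(W)
x = rng.standard_normal(10)
h = np.tanh(W @ x)            # nonlinearity f = tanh
grad = np.outer(h - 1.0, x)   # dummy gradient, for illustration only
W, hist = adagrad_update(W, grad, hist)
```

AdaGrad's per-coordinate step sizes shrink for frequently updated parameters, which is why it is a common choice for sparse-gradient models such as recursive neural networks over parse trees.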