Ensemble Neural Relation Extraction with Adaptive Boosting
Authors: Dongdong Yang, Senzhang Wang, Zhoujun Li
IJCAI 2018
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Experiment results on real dataset demonstrate the superior performance of the proposed model, improving F1-score by about 8% compared to the state-of-the-art models. |
| Researcher Affiliation | Academia | 1 University of Southern California, 2 Nanjing University of Aeronautics and Astronautics, 3 Beihang University |
| Pseudocode | Yes | The pseudocode of the Ada-LSTMs model is given in Algorithm 1. |
| Open Source Code | No | The paper mentions using 'word2vec' and provides a link to its project page (http://code.google.com/p/word2vec), but it does not provide a link or statement about the open-source availability of the code for its own proposed methodology (Ada-LSTMs). |
| Open Datasets | Yes | We evaluate our model on the public dataset, which is developed by [Riedel et al., 2010]. http://iesl.cs.umass.edu/riedel/ecml/ |
| Dataset Splits | Yes | After filtering part of NA negative data, the training part is gained by aligning the sentences from 2005 to 2006 in NYT and contains 176,662 non-repeated sentences, among which there are 156,662 positive samples and 20,000 NA negative samples. |
| Hardware Specification | No | The paper does not mention any specific hardware specifications (e.g., GPU models, CPU types, memory) used for running the experiments. |
| Software Dependencies | No | The paper mentions using 'word2vec' but does not specify its version number or any other software dependencies with version details. |
| Experiment Setup | Yes | Table 2: Parameter Settings. This table explicitly lists: 'Number of epochs 40', 'LSTMs unit size 350', 'Dropout probability 0.5', 'Batch size 50', 'Position dimension 5', 'Word dimension 50', 'Unrolled steps of LSTMs 70', 'Number of neural networks 20', 'Initial learning rate 10^-3', 'L2 regularization coefficient 10^-4'. |
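For reference, the settings quoted from Table 2 can be collected into a single configuration mapping. This is a hypothetical sketch: the key names are our own shorthand, not identifiers from the paper's code (which is not released); only the values come from Table 2.

```python
# Hyperparameters reported in Table 2 of the Ada-LSTMs paper (IJCAI 2018).
# Key names are illustrative shorthand; values are quoted from the table.
ADA_LSTMS_CONFIG = {
    "num_epochs": 40,            # Number of epochs
    "lstm_unit_size": 350,       # LSTMs unit size
    "dropout_prob": 0.5,         # Dropout probability
    "batch_size": 50,            # Batch size
    "position_dim": 5,           # Position embedding dimension
    "word_dim": 50,              # Word embedding dimension
    "lstm_unrolled_steps": 70,   # Unrolled steps of LSTMs (max sentence length)
    "num_networks": 20,          # Number of neural networks in the boosted ensemble
    "initial_learning_rate": 1e-3,
    "l2_coefficient": 1e-4,      # L2 regularization coefficient
}
```

Collecting the values this way makes it easy to check that a reimplementation matches the reported setup before training.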