Encoder-Decoder Based Unified Semantic Role Labeling with Label-Aware Syntax
Authors: Hao Fei, Fei Li, Bobo Li, Donghong Ji (pp. 12794-12802)
AAAI 2021
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Empirical experiments show that our framework significantly outperforms all existing graph-based methods on the CoNLL09 and Universal Proposition Bank datasets. |
| Researcher Affiliation | Academia | Hao Fei, Fei Li, Bobo Li, Donghong Ji* Key Laboratory of Aerospace Information Security and Trusted Computing, Ministry of Education, School of Cyber Science and Engineering, Wuhan University, Wuhan, China {hao.fei, boboli, dhji}@whu.edu.cn, foxlf823@gmail.com |
| Pseudocode | No | The paper does not contain structured pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not provide concrete access to source code for the methodology described in this paper. It only links to the BERT repository, which is a third-party tool. |
| Open Datasets | Yes | We train and evaluate all models on two SRL benchmarks, including CoNLL09 (English), and Universal Proposition Bank (eight languages). |
| Dataset Splits | Yes | We employ the official training, development and test sets in each dataset. |
| Hardware Specification | No | The paper does not provide specific hardware details (e.g., GPU/CPU models, memory amounts) used for running its experiments. |
| Software Dependencies | No | The paper mentions using BERT (base-cased version) and an Adam optimizer, but does not provide specific software dependencies like programming language or library versions (e.g., Python 3.x, PyTorch 1.x). |
| Experiment Setup | Yes | In terms of hyper-parameters, since BERT is used, the size of word representations is 768. The size of POS tag embeddings is 50. We use a 3-layer LA-GCN with 350 hidden units, and the output size of LSTM decoder is 300. We adopt the Adam optimizer with an initial learning rate 2e-5, mini-batch size 16 and regularization weight 0.12. (A hedged configuration sketch follows the table.) |
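
Since the paper releases no code, the following is a minimal PyTorch sketch assembling only the hyper-parameters quoted above into a runnable skeleton. The `SRLModelSkeleton` class, the linear-layer stand-in for the LA-GCN, and the `num_pos_tags`/`num_labels` arguments are all hypothetical placeholders, and mapping the paper's "regularization weight 0.12" onto Adam's `weight_decay` is an interpretation, not a confirmed detail of the authors' implementation.

```python
import torch
import torch.nn as nn

# Hyper-parameters reported in the paper; everything else in this file is assumed.
CONFIG = {
    "word_repr_size": 768,   # BERT base-cased hidden size
    "pos_emb_size": 50,      # POS tag embedding size
    "gcn_layers": 3,         # LA-GCN depth
    "gcn_hidden": 350,       # LA-GCN hidden units
    "decoder_out": 300,      # LSTM decoder output size
    "lr": 2e-5,              # Adam initial learning rate
    "batch_size": 16,        # mini-batch size
    "weight_decay": 0.12,    # "regularization weight" (interpreted here as L2 weight decay)
}

class SRLModelSkeleton(nn.Module):
    """Hypothetical skeleton: shapes follow the reported sizes, not the authors' code."""

    def __init__(self, cfg, num_pos_tags=50, num_labels=30):
        super().__init__()
        self.pos_emb = nn.Embedding(num_pos_tags, cfg["pos_emb_size"])
        in_size = cfg["word_repr_size"] + cfg["pos_emb_size"]
        # Plain linear layers approximate the 3-layer LA-GCN for illustration only;
        # the actual label-aware graph convolution is not specified as code in the paper.
        self.gcn = nn.ModuleList(
            nn.Linear(in_size if i == 0 else cfg["gcn_hidden"], cfg["gcn_hidden"])
            for i in range(cfg["gcn_layers"])
        )
        self.decoder = nn.LSTM(cfg["gcn_hidden"], cfg["decoder_out"], batch_first=True)
        self.classifier = nn.Linear(cfg["decoder_out"], num_labels)

    def forward(self, word_repr, pos_ids):
        # word_repr: (batch, seq_len, 768) BERT outputs; pos_ids: (batch, seq_len)
        x = torch.cat([word_repr, self.pos_emb(pos_ids)], dim=-1)
        for layer in self.gcn:
            x = torch.relu(layer(x))
        out, _ = self.decoder(x)
        return self.classifier(out)

model = SRLModelSkeleton(CONFIG)
optimizer = torch.optim.Adam(
    model.parameters(), lr=CONFIG["lr"], weight_decay=CONFIG["weight_decay"]
)
```

This sketch is only meant to make the reported setup concrete enough to sanity-check tensor sizes; reproducing the paper's results would require the actual LA-GCN encoder, decoder, and training loop, none of which are publicly available.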