Chinese Overt Pronoun Resolution: A Bilingual Approach
Authors: Chen Chen, Vincent Ng
AAAI 2014
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Experiments on the OntoNotes corpus demonstrate that our bilingual approach to Chinese pronoun resolution significantly surpasses the performance of state-of-the-art monolingual approaches. |
| Researcher Affiliation | Academia | Chen Chen and Vincent Ng, Human Language Technology Research Institute, University of Texas at Dallas, Richardson, TX. {cchen,vincen}@hlt.utdallas.edu |
| Pseudocode | No | The paper describes the approach steps in prose but does not include structured pseudocode or algorithm blocks. |
| Open Source Code | Yes | The complete list of features can be found in the source code of the resolver. See http://www.ims.uni-stuttgart.de/forschung/ressourcen/werkzeuge/IMSCoref.en.html |
| Open Datasets | Yes | We use the OntoNotes corpus that we obtained from the CoNLL shared task organizers for evaluating our bilingual approach to Chinese pronoun resolution. |
| Dataset Splits | Yes | We follow the shared task’s train/test partition of the documents, performing training and parameter tuning on the training and development documents, and reserving the test documents solely for evaluation purposes. Specifically, when Method 4 is employed, which requires parameter tuning, we train the resolvers on the training set and tune the parameters on the development set. |
| Hardware Specification | No | No specific hardware details (e.g., GPU models, CPU specifications, or memory) used for running the experiments are mentioned in the paper. |
| Software Dependencies | No | No specific version numbers for software dependencies are provided. The paper mentions using the LIBSVM software package and BerkeleyAligner, but without version details. |
| Experiment Setup | No | While the paper mentions tunable parameters for Method 4 and a hill-climbing local search algorithm for tuning them, it does not report the specific parameter values chosen or other detailed experimental setup configurations (e.g., classifier hyperparameters or tuning search ranges). |
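The hill-climbing local search mentioned in the Experiment Setup row could look roughly like the sketch below. This is a generic illustration, not the authors' implementation: the scoring function and neighborhood structure are hypothetical stand-ins, since the paper does not specify them.

```python
# Generic hill-climbing local search for parameter tuning.
# `score` would be a dev-set evaluation metric in the paper's setting;
# here it is a toy function so the sketch is runnable on its own.

def hill_climb(score, params, neighbors, max_iters=100):
    """Greedy local search: move to the best-scoring neighbor
    until no neighbor improves the current score."""
    best, best_score = params, score(params)
    for _ in range(max_iters):
        candidates = [(p, score(p)) for p in neighbors(best)]
        cand, cand_score = max(candidates, key=lambda t: t[1])
        if cand_score <= best_score:
            break  # local optimum reached
        best, best_score = cand, cand_score
    return best, best_score

# Toy example: maximize a concave function over an integer grid.
def score(p):
    x, y = p
    return -(x - 3) ** 2 - (y - 5) ** 2

def neighbors(p):
    x, y = p
    return [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]

best, best_score = hill_climb(score, (0, 0), neighbors)
# converges to the optimum (3, 5) with score 0
```

Note that plain hill climbing stops at the first local optimum, which is one reason reporting the tuned values matters for reproducibility: rerunning the search from a different starting point may yield different parameters.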