Neural Enquirer: Learning to Query Tables in Natural Language
Authors: Pengcheng Yin, Zhengdong Lu, Hang Li, Ben Kao
IJCAI 2016
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | As a proof-of-concept, we conduct experiments on a synthetic QA task, and demonstrate that the model can learn to execute reasonably complex NL queries on small-scale KB tables. |
| Researcher Affiliation | Collaboration | 1 Department of Computer Science, The University of Hong Kong; 2 Noah's Ark Lab, Huawei Technologies |
| Pseudocode | No | The paper does not contain any clearly labeled pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not provide any concrete access information (e.g., repository link, explicit release statement) for its source code. |
| Open Datasets | No | The paper describes generating a 'synthetic dataset' but does not provide any specific link, DOI, or formal citation for accessing it as a publicly available resource. |
| Dataset Splits | No | The paper mentions training and testing sets ('25K and 100K training examples', '20K examples' for testing) but does not explicitly specify a validation dataset split. |
| Hardware Specification | Yes | We train the model using ADADELTA [Zeiler, 2012] on a Tesla K40 GPU. |
| Software Dependencies | No | The paper mentions using 'ADADELTA' but does not provide specific version numbers for any libraries, frameworks, or other software components. |
| Experiment Setup | Yes | The lengths of hidden states for GRU and DNNs are 150, 50. The numbers of layers for DNN(1), DNN(2) and DNN(3) are 2, 3, 3. The length of word embeddings and annotations is 20; a further hyperparameter is set to 0.2. |
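For reference, the hyperparameters reported in the Experiment Setup and Hardware Specification rows can be collected into a single configuration object. The sketch below is a minimal illustration: all field names are assumptions introduced here (the paper does not name a config structure), and `misc_coef` stands in for a hyperparameter whose symbol was lost in text extraction.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class NeuralEnquirerConfig:
    """Hypothetical container for the hyperparameters reported in the paper."""
    gru_hidden_size: int = 150          # hidden-state length for the GRU encoder
    dnn_hidden_size: int = 50           # hidden-state length for the DNNs
    dnn_num_layers: tuple = (2, 3, 3)   # layer counts for DNN(1), DNN(2), DNN(3)
    embedding_size: int = 20            # length of word embeddings and annotations
    misc_coef: float = 0.2              # reported as 0.2; original symbol not recoverable
    optimizer: str = "ADADELTA"         # trained with ADADELTA on a Tesla K40 GPU

config = NeuralEnquirerConfig()
```

Bundling the reported values this way makes it easy to spot what a replication would still need to fill in (e.g., learning-rate schedule and software versions, which the paper does not specify).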