Towards Automatic Composition of ASP Programs from Natural Language Specifications
Authors: Manuel Borroto Santana, Irfan Kareem, Francesco Ricca
IJCAI 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | An experiment confirms the viability of the approach. |
| Researcher Affiliation | Academia | Department of Mathematics and Computer Science, University of Calabria, Rende CS, 87036, Italy {manuel.borroto,irfan.kareem,francesco.ricca}@unical.it |
| Pseudocode | No | The paper describes algorithms and architectures in prose and figures but does not include any formal pseudocode or algorithm blocks. |
| Open Source Code | Yes | Datasets and tools are available at https://github.com/IrfanKareem/nl2asp. |
| Open Datasets | Yes | Datasets and tools are available at https://github.com/IrfanKareem/nl2asp. |
| Dataset Splits | Yes | In the first experiment, we performed k-fold cross-validation: the T5 and BART models are trained k times, each time holding out one fold as the test set and using the remaining k-1 folds as the training set. ... The value of k was set to 5. (See the cross-validation sketch after the table.) |
| Hardware Specification | Yes | We used an Ubuntu 20.04 server with 500 GB of RAM, a 16 GB NVIDIA Tesla V100 PCIe GPU card, and CUDA 11.8. |
| Software Dependencies | Yes | The models were implemented using Keras 2.12.0 on top of TensorFlow and the Transformers 4.30.2 library. ... The code was run with Python 3.9.17 in a Jupyter Notebook environment. |
| Experiment Setup | Yes | The batch size was set to 16. The AdamW (Adam with Weight Decay) optimizer is used, with learning rate = 2 × 10⁻⁵ and weight decay = 0.01. ... We ran the training of the models for 200 epochs for each split and applied early stopping, monitoring the validation loss in minimum mode with patience 20. (See the training-configuration sketch after the table.) |
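
The dataset-splits row can be made concrete with a short sketch of the reported 5-fold protocol. This is a minimal illustration using scikit-learn's `KFold`; the helpers `load_nl2asp_pairs`, `fine_tune`, and `evaluate` are hypothetical placeholders, and `shuffle`/`random_state` are assumptions not stated in the paper, so this should not be read as the authors' actual code.

```python
# Minimal 5-fold cross-validation sketch; helper names are hypothetical.
import numpy as np
from sklearn.model_selection import KFold

examples = np.array(load_nl2asp_pairs(), dtype=object)  # hypothetical loader for NL/ASP pairs
kfold = KFold(n_splits=5, shuffle=True, random_state=42)  # k = 5, as reported

for fold, (train_idx, test_idx) in enumerate(kfold.split(examples)):
    train_set, test_set = examples[train_idx], examples[test_idx]
    model = fine_tune(train_set)       # train T5 or BART on the k-1 training folds
    score = evaluate(model, test_set)  # score on the held-out fold
    print(f"fold {fold}: {score:.3f}")
```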
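Similarly, the experiment-setup row maps directly onto standard Keras/TensorFlow calls. The sketch below mirrors the reported hyperparameters (batch size 16, AdamW with learning rate 2 × 10⁻⁵ and weight decay 0.01, up to 200 epochs, early stopping on validation loss in minimum mode with patience 20); `model`, `train_ds`, and `val_ds` are assumed to exist, and this is a reconstruction of the stated configuration, not the authors' actual script.

```python
# Training-configuration sketch matching the reported hyperparameters.
# `model`, `train_ds`, and `val_ds` are assumed to be defined elsewhere.
import tensorflow as tf

optimizer = tf.keras.optimizers.AdamW(learning_rate=2e-5, weight_decay=0.01)
early_stopping = tf.keras.callbacks.EarlyStopping(
    monitor="val_loss",  # watch the validation loss...
    mode="min",          # ...and treat lower as better
    patience=20,         # stop after 20 epochs without improvement
)

model.compile(optimizer=optimizer)  # Hugging Face TF models compute their loss internally
model.fit(
    train_ds.batch(16),               # batch size 16, as reported
    validation_data=val_ds.batch(16),
    epochs=200,                       # upper bound; early stopping usually halts sooner
    callbacks=[early_stopping],
)
```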