Semantic Attachments for HTN Planning
Authors: Maurício Cecílio Magnaguagno, Felipe Meneguzzi (pp. 9933-9940)
AAAI 2020
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We show empirically that our planner outperforms the state-of-the-art numeric planners in a number of domains using minimal extra domain knowledge. |
| Researcher Affiliation | Academia | Maurício Cecílio Magnaguagno, Felipe Meneguzzi, School of Computer Science (FACIN), Pontifical Catholic University of Rio Grande do Sul (PUCRS), Porto Alegre, RS, Brazil |
| Pseudocode | Yes | Algorithm 1 corresponds to the Total-order Forward Decomposition (TFD) (Ghallab, Nau, and Traverso 2004, chapter 11). This is a recursive planner that selects one task with ordering constraints satisfied and either updates the state for primitive tasks or decomposes non-primitive tasks. A sketch of TFD follows this table. |
| Open Source Code | Yes | github.com/Maumagnaguagno/HyperTensioN_U |
| Open Datasets | Yes | In the Plant Watering domain (Frances and Geffner 2015)... In the Car Linear domain (Bryce et al. 2015)... |
| Dataset Splits | No | The paper describes the datasets used (Plant Watering and Car Linear domains) and performs empirical tests, but does not explicitly mention training, validation, or test splits for its experiments. It compares its planner against other existing planners on these domains. |
| Hardware Specification | Yes | We conducted empirical tests with our own HTN planner2 in a machine with Dual 6-core Xeon CPUs @2GHz / 48GB memory, repeating experiments three times to obtain an average. |
| Software Dependencies | No | The paper mentions software like UJSHOP, HYPE, Ruby code, ENHSP, and Metric-FF, but does not provide specific version numbers for these components to ensure reproducibility of the software environment. |
| Experiment Setup | No | The paper describes the domains, the functionality of Semantic Attachments, and how they are integrated into the planner, but it does not provide specific hyperparameters, training configurations, or detailed system-level settings for its own planner beyond the general design and the configurations used for the compared planners. |
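For context on the Pseudocode row: the sketch below is a minimal, hypothetical Python rendition of textbook Total-order Forward Decomposition (TFD) as described by Ghallab, Nau, and Traverso (2004, chapter 11). It is not the authors' Ruby implementation, and all names (`tfd`, `operators`, `methods`, the tiny example domain) are illustrative assumptions.

```python
def tfd(state, tasks, operators, methods):
    """Return a plan (list of ground primitive tasks) or None on failure.

    state     : frozenset of ground atoms
    tasks     : list of tasks (tuples), decomposed left to right (total order)
    operators : primitive task name -> (applicable(state, *args), apply(state, *args))
    methods   : compound task name -> list of decompose(state, *args) generators,
                each yielding candidate subtask lists
    """
    if not tasks:
        return []  # every task accomplished: empty remainder of the plan
    task, rest = tasks[0], tasks[1:]
    name, args = task[0], task[1:]
    if name in operators:
        applicable, apply_effects = operators[name]
        if applicable(state, *args):  # primitive task: check preconditions
            plan = tfd(apply_effects(state, *args), rest, operators, methods)
            if plan is not None:
                return [task] + plan
        return None
    # Non-primitive task: try each method, replacing the task by its subtasks.
    for decompose in methods.get(name, []):
        for subtasks in decompose(state, *args):
            plan = tfd(state, list(subtasks) + rest, operators, methods)
            if plan is not None:
                return plan
    return None  # no applicable operator or successful decomposition


if __name__ == "__main__":
    # Tiny illustrative domain: a single "deliver" task decomposes into one "drive".
    operators = {
        "drive": (lambda s, x, y: ("at", x) in s,
                  lambda s, x, y: (s - {("at", x)}) | {("at", y)}),
    }
    methods = {
        "deliver": [lambda s, x, y: [[("drive", x, y)]]],
    }
    print(tfd(frozenset({("at", "a")}), [("deliver", "a", "b")], operators, methods))
    # -> [('drive', 'a', 'b')]
```

In the paper, semantic attachments extend precondition and effect evaluation with calls to external procedures (e.g., over numeric state variables); the purely symbolic sketch above omits that hook.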