Primal Grammars Driven Automated Induction
Authors: Adel Bouhoula, Miki Hermann
IJCAI 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Our method has been implemented in C++ and successfully proved over fifty complex examples that fail with well-known theorem provers (e.g., ACL2, Isabelle, PVS, SPIKE) and related methods for handling divergence in proofs by induction. |
| Researcher Affiliation | Academia | Adel Bouhoula, Arabian Gulf University, Bahrain; Miki Hermann, LIX, CNRS, École Polytechnique, Institut Polytechnique de Paris, France |
| Pseudocode | Yes | Algorithm 1 Production of a primal grammar from a finite prefix of an infinite divergent sequence. |
| Open Source Code | Yes | More examples can be found at the GitHub page https://github.com/BouhoulaHermann/IJCAI-2024. |
| Open Datasets | No | The paper discusses evaluating its method on over fifty complex examples, but it does not reference a publicly available benchmark dataset. |
| Dataset Splits | No | The paper does not explicitly mention train, validation, or test dataset splits. |
| Hardware Specification | Yes | All times are measured on a Dell Precision 3630 Tower with an Intel Core i7-9700 8-core processor and 16GB memory, running under Fedora Linux 39. |
| Software Dependencies | No | The paper states, "Our method has been implemented in C++" and mentions "running under Fedora Linux 39," but it does not specify any version numbers for compilers, libraries, or other software dependencies crucial for replication. |
| Experiment Setup | No | The paper provides examples of divergent sequences and how the method constructs primal grammars and lemmas, but it does not detail specific experimental setup parameters such as hyperparameters, learning rates, or model-specific training configurations. |
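The Pseudocode row above quotes only the title of the paper's Algorithm 1, which produces a primal grammar from a finite prefix of an infinite divergent sequence. The C++ sketch below is not that algorithm; it is a minimal, hypothetical illustration of the underlying idea, conjecturing a wrapping-context schema t_n = P · L^n · M · R^n · S from the first two terms of a divergent sequence (represented here as plain strings) and then checking it against the rest of the prefix. All names in it (Schema, conjecture, the toy sequence f(x), f(s(x)), ...) are assumptions made for this example.

```cpp
// Hypothetical sketch only; NOT the paper's Algorithm 1. It conjectures a
// string-level schema t_n = P . L^n . M . R^n . S from a finite prefix of a
// divergent sequence and verifies the conjecture on every given term.
#include <iostream>
#include <optional>
#include <string>
#include <vector>

struct Schema {
    std::string P;  // common prefix of all terms
    std::string L;  // left part of the context added at each step
    std::string M;  // fixed middle (the "seed" subterm)
    std::string R;  // right part of the context added at each step
    std::string S;  // common suffix of all terms
};

static std::string repeat(const std::string& s, std::size_t n) {
    std::string out;
    for (std::size_t i = 0; i < n; ++i) out += s;
    return out;
}

// Conjecture a schema from the first two terms, then verify it on all terms.
std::optional<Schema> conjecture(const std::vector<std::string>& terms) {
    if (terms.size() < 3) return std::nullopt;  // too little evidence
    const std::string& t0 = terms[0];
    const std::string& t1 = terms[1];

    // Longest common prefix, then longest common suffix of the remainders.
    std::size_t p = 0;
    while (p < t0.size() && p < t1.size() && t0[p] == t1[p]) ++p;
    std::size_t q = 0;
    while (q < t0.size() - p && q < t1.size() - p &&
           t0[t0.size() - 1 - q] == t1[t1.size() - 1 - q]) ++q;

    std::string M  = t0.substr(p, t0.size() - p - q);  // middle of t0
    std::string M1 = t1.substr(p, t1.size() - p - q);  // middle of t1

    // Expect M1 = L . M . R, i.e. one extra layer of context around M.
    std::size_t pos = M1.find(M);
    if (pos == std::string::npos) return std::nullopt;

    Schema s{t0.substr(0, p), M1.substr(0, pos), M,
             M1.substr(pos + M.size()), t0.substr(t0.size() - q)};

    // Verify the conjecture against every term of the given prefix.
    for (std::size_t k = 0; k < terms.size(); ++k) {
        std::string expected =
            s.P + repeat(s.L, k) + s.M + repeat(s.R, k) + s.S;
        if (expected != terms[k]) return std::nullopt;
    }
    return s;
}

int main() {
    // A toy divergent sequence of subgoals: f(x), f(s(x)), f(s(s(x))), ...
    std::vector<std::string> prefix = {"f(x)", "f(s(x))", "f(s(s(x)))",
                                       "f(s(s(s(x))))"};
    if (auto s = conjecture(prefix)) {
        std::cout << "t_n = " << s->P << " (" << s->L << ")^n " << s->M
                  << " (" << s->R << ")^n " << s->S << "\n";
    } else {
        std::cout << "no uniform wrapping pattern found\n";
    }
}
```

A real schematization of this kind operates on terms and counter variables rather than flat strings; the string-level check above is only meant to make the notion of a finitely described divergent sequence concrete.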