Delivering Inflated Explanations
Authors: Yacine Izza, Alexey Ignatiev, Peter J. Stuckey, Joao Marques-Silva
AAAI 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Experiments This section presents a summary of the empirical assessment of computing inflated abductive and contrastive explanations for the case study of random forests (RFs) trained on widely studied datasets. ... Results. Summary results of computing iAXps for RFs on the selected datasets are reported in Table 1. |
| Researcher Affiliation | Academia | Yacine Izza1, Alexey Ignatiev2, Peter J. Stuckey2,3, Joao Marques-Silva4 1CREATE, National University of Singapore, Singapore 2Monash University, Melbourne, Australia 3OPTIMA ARC Industrial Training and Transformation Centre, Melbourne, Australia 4IRIT, CNRS, Toulouse, France |
| Pseudocode | Yes | Algorithm 1: Computing inflated explanations ... Algorithm 2: Inflate categorical feature ... Algorithm 3: Inflate ordinal feature ... Algorithm 4: Inflating the supremum with linear search ... Algorithm 5: Inflating the infimum with linear search |
| Open Source Code | Yes | Prototype implementation. A prototype implementation of the outlined algorithms (Algorithms 1 to 5) was developed as a Python script. It builds on RFxpl (Izza and Marques-Silva 2021) and makes heavy use of the latest version of the PySAT toolkit (Ignatiev, Morgado, and Marques-Silva 2018) to generate the CNF formulas of the RF encodings and afterwards instruments incremental SAT oracle calls to compute explanations. ... Available at https://github.com/izzayacine/RFxpl. |
| Open Datasets | Yes | The assessment is performed on the benchmarks of (Izza and Marques-Silva 2021), where we selected 35 RF models trained on well-known tabular datasets (all publicly available, originating from the UCI repository (Markelle Kelly 2020) and PMLB (Olson et al. 2017)). |
| Dataset Splits | No | Our formal explainers are set to compute a single AXp and then apply the expansion method per data instance from the selected set of instances; 200 samples are randomly selected for testing from each dataset. |
| Hardware Specification | Yes | The experiments are conducted on a MacBook Pro with a Dual-Core Intel Core i5 2.3GHz CPU and 8 GB RAM, running macOS Ventura. |
| Software Dependencies | No | A prototype implementation of the outlined algorithms (Algorithms 1 to 5) was developed as a Python script. It builds on RFxpl (Izza and Marques-Silva 2021) and makes heavy use of the latest version of the PySAT toolkit (Ignatiev, Morgado, and Marques-Silva 2018) to generate the CNF formulas of the RF encodings and afterwards instruments incremental SAT oracle calls to compute explanations. |
| Experiment Setup | Yes | The number of trees in each RF model is set to 100, while tree depth varies between 3 and 10. |
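The pseudocode listed above includes Algorithms 4 and 5, which inflate the interval of an ordinal feature by linear search: the bound is widened one domain value at a time as long as an entailment oracle confirms the prediction is unchanged. A minimal sketch of that linear-search pattern, assuming a caller-supplied `still_valid` predicate standing in for the paper's incremental SAT oracle call (the function names, the index-based interval representation, and the oracle signature are all hypothetical illustrations, not the authors' implementation):

```python
def inflate_supremum(domain, ub, still_valid):
    """Widen the upper bound of an ordinal feature by linear search.

    domain      -- sorted list of the feature's values
    ub          -- index of the current upper bound in domain
    still_valid -- oracle: True iff the explanation still entails the
                   prediction when the interval extends to domain[i]
                   (stands in for the incremental SAT call)
    """
    while ub + 1 < len(domain) and still_valid(ub + 1):
        ub += 1
    return ub


def inflate_infimum(domain, lb, still_valid):
    """Symmetric linear search downward for the lower bound."""
    while lb - 1 >= 0 and still_valid(lb - 1):
        lb -= 1
    return lb
```

For example, with `domain = list(range(10))` and an oracle that accepts values up to 7, `inflate_supremum(domain, 5, lambda i: domain[i] <= 7)` returns index 7; the infimum search is symmetric. Each oracle call corresponds to one SAT query, so the linear search makes at most one query per domain value skipped over.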