Auditing Local Explanations is Hard
Authors: Robi Bhattacharjee, Ulrike von Luxburg
NeurIPS 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | We prove upper and lower bounds on the number of queries needed for an auditor to succeed within this framework. Our results show that successful auditing requires a potentially exorbitant number of queries, particularly in high-dimensional cases. Our analysis also reveals that a key property is the locality of the provided explanations, a quantity that has so far received little attention in the explainability literature. |
| Researcher Affiliation | Academia | Robi Bhattacharjee, University of Tübingen and Tübingen AI Center, robi.bhattacharjee@wsii.uni-tuebingen.de; Ulrike von Luxburg, University of Tübingen and Tübingen AI Center, ulrike.luxburg@uni-tuebingen.de |
| Pseudocode | Yes | Algorithm 1: simple_audit(X, f(X), E(f, X), ϵ1, ϵ2, γ, δ) (see the sketch after this table) |
| Open Source Code | No | This is a theory paper, and therefore does not provide open-source code for a practical methodology. |
| Open Datasets | No | This is a theory paper and does not conduct experiments with real-world datasets; data enters only through an abstract distribution µ. |
| Dataset Splits | No | This is a theory paper and does not conduct experiments with data splits. |
| Hardware Specification | No | This is a theory paper and does not conduct experiments, so no hardware specification is given. |
| Software Dependencies | No | This is a theory paper and does not conduct experiments requiring specific software dependencies with version numbers. |
| Experiment Setup | No | This is a theory paper and does not describe an experimental setup with hyperparameters or system-level training settings. |
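
The Pseudocode row above refers to the paper's Algorithm 1, simple_audit(X, f(X), E(f, X), ϵ1, ϵ2, γ, δ). As a rough illustration of that interface only, here is a minimal Python sketch, assuming LIME-style local linear explanations (w, b) at each sample point and a pairwise consistency check within the locality radius γ; the acceptance rule and the use of the confidence parameter δ below are simplified stand-ins, not the paper's exact procedure.

```python
import numpy as np

def simple_audit(X, fX, explanations, eps1, eps2, gamma, delta):
    """Illustrative sketch only, not the paper's Algorithm 1.

    Assumptions (all hypothetical): X is an (n, d) array of sample
    points, fX holds the model's values at those points, and
    explanations[i] is a local linear model (w, b) claimed to
    describe f near X[i].
    """
    n = len(X)
    checked, failures = 0, 0
    for i in range(n):
        w, b = explanations[i]  # local linear explanation at X[i]
        for j in range(n):
            if i == j:
                continue
            # only neighbors inside the locality radius gamma are informative
            if np.linalg.norm(X[i] - X[j]) <= gamma:
                checked += 1
                # does the explanation at X[i] predict f at the neighbor
                # to within tolerance eps1?
                if abs(np.dot(w, X[j]) + b - fX[j]) > eps1:
                    failures += 1
    if checked == 0:
        # no sample pair falls in the same locality ball, so the audit
        # has no evidence either way; delta would govern how many more
        # queries are needed for a confident verdict
        return True
    # accept if the empirical disagreement rate is at most eps2
    return failures / checked <= eps2
```

The sketch also makes the abstract's point concrete: when explanations are very local (small γ) and the dimension is high, hardly any pair of queries lands in the same locality ball, so the auditor may need an exorbitant number of queries before the check above sees any evidence at all.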