Understanding Inter-Concept Relationships in Concept-Based Models
Authors: Naveen Janaki Raman, Mateo Espinosa Zarlenga, Mateja Jamnik
ICML 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | First, we empirically demonstrate that state-of-the-art concept-based models produce representations that lack stability and robustness, and such methods fail to capture inter-concept relationships. Then, we develop a novel algorithm which leverages inter-concept relationships to improve concept intervention accuracy, demonstrating how correctly capturing inter-concept relationships can improve downstream tasks. |
| Researcher Affiliation | Academia | Naveen Raman¹, Mateo Espinosa Zarlenga², Mateja Jamnik²; ¹Carnegie Mellon University, ²University of Cambridge. |
| Pseudocode | Yes | Algorithm 1 Basis Aided Concept Intervention |
| Open Source Code | Yes | Our code is available here: https://github.com/naveenr414/Concept-Learning. |
| Open Datasets | Yes | Coloured MNIST (Arjovsky et al., 2019), dSprites (Matthey et al., 2017), CUB (Wah et al., 2011), CheXpert (Irvin et al., 2019) |
| Dataset Splits | Yes | For the dSprites and CheXpert datasets, we use 2,500 data points for training, and 750 each for validation and testing. For the MNIST dataset, we use 60,000 data points for training and 10,000 data points for validation. For CUB, we use 4,796 data points for training, 1,198 for validation, and 5,794 data points for testing. |
| Hardware Specification | Yes | We run our GPU experiments on either an NVIDIA TITAN Xp with 12 GB of GPU RAM on Ubuntu 20.04, or an NVIDIA A100-SXM, using at most 8 GB of GPU memory with Red Hat Linux 8. |
| Software Dependencies | No | For concept intervention experiments we use the PyTorch library (Paszke et al., 2019). (Does not include version number) |
| Experiment Setup | Yes | We train models for 25, 50, and 100 epochs and measure the impact of label bases on concept intervention accuracy. |
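The split sizes quoted in the Dataset Splits row can be collected into a small configuration sketch for sanity-checking totals. This is a hypothetical layout for illustration only, not taken from the authors' repository; the `DATASET_SPLITS` dict and `total_points` helper are our own names.

```python
# Dataset split sizes as reported in the paper's reproducibility excerpt.
# Hypothetical config structure, not the authors' code.
DATASET_SPLITS = {
    "dSprites": {"train": 2_500, "val": 750, "test": 750},
    "CheXpert": {"train": 2_500, "val": 750, "test": 750},
    "MNIST":    {"train": 60_000, "val": 10_000},  # no test split reported
    "CUB":      {"train": 4_796, "val": 1_198, "test": 5_794},
}

def total_points(name: str) -> int:
    """Sum all reported split sizes for a dataset."""
    return sum(DATASET_SPLITS[name].values())
```

For example, `total_points("CUB")` returns 11,788, which matches the full CUB benchmark size (5,994 train/val + 5,794 test images in the original dataset release is close but not identical, so the paper's counts should be checked against its Appendix rather than assumed).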