Learning Causal Graphs with Small Interventions
Authors: Karthikeyan Shanmugam, Murat Kocaoglu, Alexandros G. Dimakis, Sriram Vishwanath
NeurIPS 2015
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We simulate our new heuristic, namely Algorithm 1, on randomly generated chordal graphs and compare it with a naive algorithm that follows the intervention sets given by our (n, k) separating system as in Theorem 1. Both algorithms apply R0 and Meek rules after each intervention according to (1). We plot the following lower bounds: a) Information Theoretic LB of χ/2k b) Max. Clique Sep. Sys. Entropic LB, which is the chromatic-number-based lower bound of Theorem 6. Moreover, we use two known (χ, k) separating system constructions for the maximum clique size as references: the best known (χ, k) separating system is shown by the label Max. Clique Sep. Sys. Achievable LB, and our new simpler separating system construction (Theorem 1) is shown by Our Construction Clique Sep. Sys. LB. As an upper bound, we use the size of the best known (n, k) separating system (without any Meek rules), denoted Separating System UB. ... The result points to this: for random chordal graphs, the structured tree search allows us to learn the edges in a number of experiments quite close to the lower bound based only on the maximum clique size and not n. (A separating-system sketch appears after this table.) |
| Researcher Affiliation | Academia | Karthikeyan Shanmugam, Murat Kocaoglu, Alexandros G. Dimakis, Sriram Vishwanath; Department of Electrical and Computer Engineering, The University of Texas at Austin, USA; karthiksh@utexas.edu, mkocaoglu@utexas.edu, dimakis@austin.utexas.edu, sriram@ece.utexas.edu |
| Pseudocode | Yes | Algorithm 1 Hybrid Algorithm using Meek rules with separating system |
| Open Source Code | No | The paper does not provide any statement or link indicating that the source code for the described methodology is publicly available. |
| Open Datasets | No | Random generation of chordal graphs: Start with a random ordering σ on the vertices. Consider every vertex starting from σ(n). For each vertex i, (j, i) ∈ E with probability inversely proportional to σ(i) for every j ∈ Si, where Si = {v : σ⁻¹(v) < σ⁻¹(i)}. The proportionality constant is changed to adjust sparsity of the graph. After all such j are considered, make Si ∩ ne(i) a clique by adding edges respecting the ordering σ, where ne(i) is the neighborhood of i. The resultant graph is a DAG and the corresponding skeleton is chordal. Also, σ is a perfect elimination ordering. (A code sketch of this generation procedure appears after this table.) |
| Dataset Splits | No | The paper describes generating random chordal graphs for simulation but does not specify how these graphs are split into training, validation, or test sets. No explicit percentages, sample counts, or citations to predefined splits are given. |
| Hardware Specification | No | The paper does not provide any specific hardware details (like GPU/CPU models, processor types, or memory amounts) used for running its experiments. |
| Software Dependencies | No | The paper does not provide specific software dependencies, libraries, or solver names with version numbers needed to replicate the experiment. |
| Experiment Setup | No | The paper describes the algorithms and how chordal graphs are generated for simulation, but it does not provide specific experimental setup details such as hyperparameter values (e.g., learning rate, batch size) or other system-level training configurations. |
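
The experiments quoted in the Research Type row compare Algorithm 1 against a naive strategy that simply runs through the interventions of an (n, k) separating system. As a concrete reference point, here is a minimal Python sketch of the separating-system property itself; the function name `is_separating_system` and the trivial singleton baseline are illustrative choices for this report, not the Theorem 1 construction from the paper.

```python
from itertools import combinations

def is_separating_system(n, k, sets):
    """Return True if `sets` is an (n, k) separating system over vertices 0..n-1:
    every set has at most k elements, and for every pair of distinct vertices
    some set contains exactly one of the two (so intervening on that set can
    orient an edge between them)."""
    if any(len(s) > k for s in sets):
        return False
    return all(
        any((i in s) != (j in s) for s in sets)
        for i, j in combinations(range(n), 2)
    )

# Trivial baseline (not the paper's construction): the n - 1 singletons
# {0}, ..., {n-2} already separate every pair, at the cost of far more
# interventions than the constructions compared in the paper's plots.
n, k = 8, 3
baseline = [{v} for v in range(n - 1)]
assert is_separating_system(n, k, baseline)
```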
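The Open Datasets row quotes the paper's recipe for generating random chordal graphs. The sketch below is one reading of that recipe, assuming that "probability inversely proportional to σ(i)" refers to i's position in the random ordering; the function name `random_chordal_dag`, the sparsity constant `c`, and the default values are hypothetical and not the authors' code.

```python
import random
from itertools import combinations

def random_chordal_dag(n, c=2.0, seed=None):
    """Sketch of the random chordal DAG generator described above.
    `c` is a hypothetical sparsity constant: an earlier vertex j is attached
    to vertex i with probability min(1, c / rank(i))."""
    rng = random.Random(seed)
    sigma = list(range(n))                       # sigma[r] = vertex at rank r
    rng.shuffle(sigma)
    rank = {v: r for r, v in enumerate(sigma)}   # sigma^{-1}

    parents = {v: set() for v in range(n)}       # edge j -> v with rank(j) < rank(v)
    for r in range(n - 1, 0, -1):                # consider vertices from sigma(n) downwards
        i = sigma[r]
        for j in sigma[:r]:                      # S_i: vertices earlier in the ordering
            if rng.random() < min(1.0, c / (r + 1)):
                parents[i].add(j)
        # Make the earlier neighbours of i a clique, adding edges that respect
        # the ordering, so the skeleton is chordal and sigma is a perfect
        # elimination ordering.
        nbrs = sorted(parents[i], key=rank.get)
        for a, b in combinations(nbrs, 2):       # rank(a) < rank(b), edge a -> b
            parents[b].add(a)
    return parents, sigma

dag, order = random_chordal_dag(20, c=3.0, seed=0)
```

The returned `parents` map encodes a DAG whose skeleton is chordal by construction, since each vertex attaches to a clique of vertices that precede it in the ordering.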