Notice: The reproducibility variables underlying each score are classified using an automated LLM-based pipeline, validated against a manually labeled dataset. LLM-based classification introduces uncertainty and potential bias; scores should be interpreted as estimates. Full accuracy metrics and methodology are described in [1].

Analytic DAG Constraints for Differentiable DAG Learning

Authors: Zhen Zhang, Ignavier Ng, Dong Gong, Yuhang Liu, Mingming Gong, Biwei Huang, Kun Zhang, Anton van den Hengel, Javen Qinfeng Shi

ICLR 2025

Reproducibility Variables (each entry lists the variable, its classified result, and the supporting excerpt from the LLM response)
Research Type: Experimental
"Experiments in various settings demonstrate that our DAG constraints outperform previous state-of-the-art comparators. Our implementation is available at https://github.com/zzhang1987/AnalyticDAGLearning. ... EXPERIMENTS: In the experiment, we compared the performance of different analytic DAG constraints in the same path-following optimization framework. ... The results on ER2, ER3, and ER4 graphs are shown in Table 1."
Researcher Affiliation: Academia
"1 Australian Institute for Machine Learning, The University of Adelaide; 2 Department of Philosophy, Carnegie Mellon University; 3 School of Computer Science and Engineering, The University of New South Wales; 4 School of Mathematics and Statistics, The University of Melbourne; 5 Department of Machine Learning, Mohamed bin Zayed University of Artificial Intelligence; 6 Halıcıoğlu Data Science Institute (HDSI), UC San Diego"
Pseudocode: Yes
"Algorithm 1: Efficient Evaluation of Gradients ... Algorithm 2: Path-following algorithm"
Open Source Code: Yes
"Our implementation is available at https://github.com/zzhang1987/AnalyticDAGLearning."
Open Datasets: Yes
"Our DAG constraints can also be extended to continuous nonlinear DAG learning approaches by replacing their original DAG constraints. We incorporated our DAG constraints into Lachapelle et al. (2020) to model nonlinear Structural Equation Models (SEMs) and conducted experiments using Sachs et al. (2005)'s dataset pre-processed by Lachapelle et al. (2020) (available at https://github.com/kurowasan/GraN-DAG)."
Dataset Splits: No
"We generated two different types of random graphs: ER (Erdős–Rényi) and SF (Scale-Free) graphs with different numbers of expected edges. ... Then, n samples are generated from the linear SEM x = Bᵀx + e to form an n × d data matrix X, where the noise e is i.i.d. sampled from a Gaussian, Exponential, or Gumbel distribution. ... We set the sample size n = 1000 and consider 3 different numbers of nodes: d = 500, 1000, 2000. For each setting, we conducted 10 random simulations to obtain an average performance."
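The quoted data-generation protocol can be sketched as below. This is an illustrative reconstruction, not the authors' code: the function name `simulate_er_sem`, the edge-weight range, and the use of Gaussian noise are assumptions (the excerpt also mentions Exponential and Gumbel noise and SF graphs), and the SEM is taken in the convention x = Bᵀx + e.

```python
import numpy as np

def simulate_er_sem(d=20, edges_per_node=2, n=1000, seed=0):
    """Hypothetical sketch: sample an ER-k DAG (expected k*d edges),
    assign random edge weights, then draw n samples from the linear SEM."""
    rng = np.random.default_rng(seed)
    # Edge probability so that the expected number of edges is edges_per_node * d.
    p = 2.0 * edges_per_node / (d - 1)
    # Strictly lower-triangular adjacency mask => the graph is acyclic by construction.
    mask = np.tril(rng.random((d, d)) < p, k=-1)
    # Illustrative weight range: magnitudes in [0.5, 2] with random signs.
    signs = rng.choice([-1.0, 1.0], size=(d, d))
    B = mask * signs * rng.uniform(0.5, 2.0, size=(d, d))
    # Gaussian noise (the paper also uses Exponential and Gumbel noise).
    E = rng.standard_normal((n, d))
    # x = B^T x + e  per sample  =>  X = E (I - B)^{-1}  for the stacked n x d matrix.
    X = E @ np.linalg.inv(np.eye(d) - B)
    return B, X
```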
Hardware Specification: Yes
"All these experiments were performed using an A100 GPU, and all computations were done in double precision."
Software Dependencies: No
"We implemented the path-following algorithm (provided in Algorithm 2) using cupy and numpy, based on the path-following optimizer in Bello et al. (2022). For analytic DAG constraints with infinite convergence radius, we consider the exponential-based DAG constraints."
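For concreteness, the exponential-based acyclicity constraint referenced in the excerpt is, in the form introduced by Zheng et al. (2018) for NOTEARS, h(B) = tr(exp(B ∘ B)) − d, which vanishes exactly when B is the weighted adjacency matrix of a DAG. The sketch below uses plain NumPy/SciPy rather than the cupy implementation the excerpt mentions, and is illustrative only:

```python
import numpy as np
from scipy.linalg import expm

def h_exp(B):
    """Exponential acyclicity constraint h(B) = tr(exp(B o B)) - d
    (Zheng et al., 2018); h(B) = 0 iff B encodes a DAG."""
    return np.trace(expm(B * B)) - B.shape[0]

def h_exp_grad(B):
    """Closed-form gradient: grad h(B) = exp(B o B)^T o (2 B)."""
    return expm(B * B).T * (2.0 * B)
```

A 2-cycle gives h > 0 (for B with unit weights on a 2-cycle, h = 2 cosh(1) − 2), while any strictly triangular B gives h = 0, since exp of a nilpotent matrix has unit diagonal.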
Experiment Setup: Yes
"In terms of hyper-parameters, our selection involves α = 0.1, λ1 = 0.1, and T = 5. For s we use the same annealing approach as Bello et al. (2022), but with our strategy to reset s when the candidate graph goes out of the desired region."