Global Optimality in Bivariate Gradient-based DAG Learning

Authors: Chang Deng, Kevin Bello, Pradeep Ravikumar, Bryon Aragam

NeurIPS 2023

Reproducibility variables, results, and supporting LLM responses:
Research Type: Experimental
    "Experimental results verify our theory, consistently recovering the global minimum of (1), regardless of initialization or initial penalty value. We show that our algorithm converges to the global minimum while naïve approaches can get stuck. We conducted experiments to verify that Algorithms 2 and 4 both converge to the global minimum of (7)."
Researcher Affiliation: Academia
    Booth School of Business, University of Chicago, Chicago, IL 60637; Machine Learning Department, Carnegie Mellon University, Pittsburgh, PA 15213.
Pseudocode: Yes
    Algorithm 1: Gradient Flow(f, z0); Algorithm 2: Homotopy algorithm for solving (1); Algorithm 3: Practical (i.e., independent of a and W) homotopy algorithm for solving (1); Algorithm 4: Find a path {Wµk} via a particular scheduling for µk when a is unknown.
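Algorithm 1's gradient flow, ż = -∇f(z) started from z0, is conventionally discretized by forward Euler, which reduces to plain gradient descent. The following is a minimal sketch of that idea, not the paper's implementation; the step size, tolerance, and quadratic test function are illustrative assumptions:

```python
import numpy as np

def gradient_flow(grad_f, z0, step=1e-2, tol=1e-8, max_iter=100_000):
    """Forward-Euler discretization of the gradient flow dz/dt = -grad_f(z)."""
    z = np.asarray(z0, dtype=float)
    for _ in range(max_iter):
        g = grad_f(z)
        if np.linalg.norm(g) < tol:  # (approximately) stationary point reached
            break
        z = z - step * g
    return z

# Toy example: f(z) = ||z||^2 / 2 has gradient z, and its flow
# contracts any starting point toward the origin.
z_star = gradient_flow(lambda z: z, z0=[1.0, -2.0])
```

Here `gradient_flow` only needs the gradient oracle `grad_f`, mirroring how the pseudocode takes the objective f and start point z0 as inputs.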
Open Source Code: No
    The paper does not provide any concrete statement or link regarding the public availability of its source code.
Open Datasets: No
    The paper's theoretical analysis is based on the population loss, and its experiments use synthetic data; no access information for a publicly available dataset is given. It notes that "the population loss can be substituted with empirical loss" but does not name any public dataset.
Dataset Splits: No
    The paper analyzes the population loss theoretically and uses simulated data in its experiments, but it does not report training, validation, or test dataset splits.
Hardware Specification: No
    The paper acknowledges "the University of Chicago Research Computing Center for assistance with the calculations" but does not list specific hardware, such as the GPU or CPU models used for the experiments.
Software Dependencies: No
    The paper does not list supporting software with version numbers, such as the programming languages, libraries, or solvers used in the experiments.
Experiment Setup: Yes
    Appendix F, "Experiments Details," and related figures (e.g., Figure 9) specify parameters such as a = 2, ε = 0.01, δ = 0.4, and µ0 = 0.2034 for Algorithm 6.
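The homotopy algorithms above solve a sequence of penalized problems along a decreasing penalty path µ0 > µ1 > …, warm-starting each solve from the previous solution. The sketch below illustrates that pattern only; the geometric decay factor, step count, and `solve_at` interface are hypothetical assumptions, not the paper's scheduling (the reported µ0 = 0.2034 is reused for flavor):

```python
def homotopy_path(solve_at, mu0=0.2034, decay=0.5, n_steps=10, w_init=0.0):
    """Follow a solution path by solving a sequence of mu-penalized problems,
    warm-starting each solve at the previous solution.

    `solve_at(mu, w)` is a user-supplied solver for the mu-penalized problem,
    warm-started at w (hypothetical interface).
    """
    w = w_init
    mu = mu0
    path = []
    for _ in range(n_steps):
        w = solve_at(mu, w)
        path.append((mu, w))
        mu *= decay  # geometric decay schedule (an assumption)
    return path

# Toy check: the scalar problem min_w (w - 1)^2 + mu * w^2 has the
# closed-form minimizer w = 1 / (1 + mu), so as mu -> 0 the path
# approaches the unpenalized minimizer w = 1.
path = homotopy_path(lambda mu, w: 1.0 / (1.0 + mu))
```

Warm-starting is what makes the path useful: each solve begins near the solution of the slightly more heavily penalized problem, so the iterates track a continuous solution curve rather than jumping between basins.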