Iterative Structural Inference of Directed Graphs
Authors: Aoran Wang, Jun Pang
NeurIPS 2022 | Conference PDF | Archive PDF | Plain Text | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We evaluate iSIDG on various datasets including biological networks, simulated fMRI data, and physical simulations to demonstrate that our model is able to precisely infer the existence of interactions, and is significantly superior to baseline models. |
| Researcher Affiliation | Academia | Aoran Wang University of Luxembourg aoran.wang@uni.lu Jun Pang University of Luxembourg jun.pang@uni.lu |
| Pseudocode | No | The paper describes its methods through mathematical equations and textual explanations, but it does not include any explicitly labeled 'Pseudocode' or 'Algorithm' blocks or figures. |
| Open Source Code | Yes | We attach the link to our code and data in the supplementary documents. |
| Open Datasets | Yes | We test our model on the six directed synthetic biological networks [35], namely Linear (LI), Linear Long (LL), Cycle (CY), Bifurcating (BF), Trifurcating (TF), and Bifurcating Converging (BFCV) networks... We also test our model on NetSim datasets [43] of simulated fMRI data... The last datasets to be mentioned here are three physical simulations mentioned in [22]... Furthermore, we also test our model and baselines on two gene regulatory network (GRN) datasets: ESC dataset [4] and HSC dataset [32]. |
| Dataset Splits | No | The paper describes using 'training samples' and running experiments 'several rounds' to obtain results, but it does not provide specific details on how the datasets were split into training, validation, or test sets, nor does it specify exact percentages or sample counts for these splits. |
| Hardware Specification | Yes | The experiments are conducted on an NVIDIA RTX 3090 GPU with 24 GB memory. |
| Software Dependencies | Yes | The proposed model is implemented in Python 3.8 based on PyTorch 1.10.0 [33] and scikit-learn 1.0.1 [34]. |
| Experiment Setup | Yes | For our iSIDG, the hidden dimension of the GIN layers in the encoder is set to 64, with two layers in total. The output dimension of the encoder is 20 for the biological networks, and 30 for the rest of the datasets. The learning rate for the training is 0.001 with Adam as optimizer. The models are trained for 100 epochs, with batch size of 64. For the NetSim datasets and physical simulations, the number of epochs is 200, and the batch size is 128. We found that the performance is insensitive to the settings of hyperparameters {µ, α, β, γ}. We set them to {0.1, 0.01, 0.01, 0.01} for all the experiments. The value of threshold ξ is 0.5. The combination coefficient λ is set to 0.5. The iterative process runs 100 steps in total. The stop condition is δ = 0.0001. |
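
The Hardware Specification and Software Dependencies rows fix a concrete environment (Python 3.8, PyTorch 1.10.0, scikit-learn 1.0.1, NVIDIA RTX 3090). Below is a minimal environment-check sketch a reproducer might run before training; only the version strings and GPU model come from the paper, the check itself is our own assumption.

```python
# Minimal environment check for the reported setup; version strings are quoted
# from the paper, the checking code is an illustrative addition.
import sklearn
import torch

EXPECTED = {"torch": "1.10.0", "scikit-learn": "1.0.1"}

def check_environment() -> None:
    print(f"torch {torch.__version__} (expected {EXPECTED['torch']})")
    print(f"scikit-learn {sklearn.__version__} (expected {EXPECTED['scikit-learn']})")
    if torch.cuda.is_available():
        # The paper reports an NVIDIA RTX 3090 GPU with 24 GB of memory.
        print(f"CUDA device: {torch.cuda.get_device_name(0)}")
    else:
        print("No CUDA device found; the reported runs used an RTX 3090 GPU.")

if __name__ == "__main__":
    check_environment()
```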
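
The Experiment Setup row condenses the paper's hyperparameters. The sketch below gathers them into a single configuration dictionary for reference; the key names and the `make_optimizer` helper are hypothetical, not the authors' API, and only the numeric values are taken from the paper.

```python
# Hyperparameters quoted from the paper's experiment setup, collected in one
# place; key names and the helper below are illustrative, not the authors' code.
import torch
from torch import nn

ISIDG_CONFIG = {
    "gin_hidden_dim": 64,       # hidden dimension of the encoder's GIN layers
    "gin_num_layers": 2,
    "encoder_out_dim": 20,      # 20 for biological networks, 30 for the other datasets
    "lr": 1e-3,                 # Adam optimizer
    "epochs": 100,              # 200 for NetSim and physical-simulation datasets
    "batch_size": 64,           # 128 for NetSim and physical-simulation datasets
    "loss_weights": {"mu": 0.1, "alpha": 0.01, "beta": 0.01, "gamma": 0.01},
    "threshold_xi": 0.5,        # edge-existence threshold
    "lambda_combination": 0.5,  # combination coefficient
    "iterative_steps": 100,     # iterative refinement steps
    "stop_delta": 1e-4,         # stop condition
}

def make_optimizer(model: nn.Module, cfg: dict = ISIDG_CONFIG) -> torch.optim.Optimizer:
    """Adam with the learning rate reported in the paper."""
    return torch.optim.Adam(model.parameters(), lr=cfg["lr"])
```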