Spacetime Representation Learning

Authors: Marc T. Law, James Lucas

ICLR 2023

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "5 EXPERIMENTS: We show how our framework can represent graphs with directed cycles and predict links effectively in Section 5.1. We also show how the causal interpretation of our model can be used to represent hierarchical graphs with cycles in Section 5.2."
Researcher Affiliation | Industry | "Marc T. Law & James Lucas, NVIDIA"
Pseudocode | Yes | "Algorithm 1: Pseudo-Riemannian optimization"
Open Source Code | No | The paper thanks Aaron Sim for sharing his source code, but it neither states that the authors' own code is released nor provides a link to it.
Open Datasets | Yes | "We now consider the link prediction task on the Saccharomyces cerevisiae, in silico and Escherichia coli DREAM5 datasets (Marbach et al., 2012); Zachary's karate club dataset (Zachary, 1977); NIPS from 1988 to 2003 (Globerson et al., 2007); Arxiv High-energy physics theory (HEP-TH) citation network (Gehrke et al., 2003)."
Dataset Splits | Yes | "Each network is randomly split into train and test sets, following 85/15 splits, and a part of the training set is used for validation."
Hardware Specification | Yes | "We ran all our experiments on a single desktop with 64 GB of RAM, a 6-core Intel i7-7800X CPU and an NVIDIA GeForce RTX 3090 GPU."
Software Dependencies | No | The paper does not provide version numbers for the software dependencies used in its experiments.
Experiment Setup | Yes | "We use the following hyperparameters on the DREAM5 datasets to define equation 6: if M = S^{d,1}(r): r = 1, θ1 = 0.15, exponent m = 1, θ2 = 0.03, learning rate = 10^{-5}, number of epochs = 2000; if M = L^{d,1}(C): C = 8, θ1 = 0.15, θ2 = 0.03, exponent m = 1, learning rate = 10^{-3}, number of epochs = 2000; if M = R^{d,1}: θ1 = 0.15, θ2 = 0.03, exponent m = 1, learning rate = 10^{-3}, number of epochs = 2000."
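The paper's Algorithm 1 (pseudo-Riemannian optimization) is not reproduced in this report. As a rough illustration of the core idea only, the sketch below shows a single descent step in flat spacetime R^{d,1}, assuming a diagonal metric with one time dimension; it is not the authors' implementation, and the curved manifolds (S^{d,1}, L^{d,1}) would additionally need a tangent-space projection and a retraction, which are omitted here.

```python
import numpy as np

def minkowski_metric(space_dims, time_dims=1):
    """Diagonal metric of signature (space_dims, time_dims):
    +1 on the space axes, -1 on the time axes."""
    g = np.ones(space_dims + time_dims)
    g[-time_dims:] = -1.0
    return g

def pseudo_riemannian_step(x, euclidean_grad, lr=1e-3, time_dims=1):
    """One gradient step in flat spacetime R^{d,1} (illustrative sketch).

    The pseudo-Riemannian gradient is obtained by applying the inverse
    metric tensor to the Euclidean gradient; for a diagonal metric of
    signature (+...+, -) this simply flips the sign of the time
    component (G^{-1} = G).  On curved spacetimes a projection onto the
    tangent space and a retraction back to the manifold would follow.
    """
    g = minkowski_metric(x.shape[-1] - time_dims, time_dims)
    pr_grad = g * euclidean_grad
    return x - lr * pr_grad
```

For example, starting from the origin with a Euclidean gradient of all ones, the update moves the space coordinates in the negative direction but the time coordinate in the positive direction, which is the sign flip that distinguishes pseudo-Riemannian from Riemannian descent.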
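The dataset-splits row quotes an 85/15 random train/test split with part of the training set held out for validation. A minimal sketch of such an edge split follows; the 5% validation fraction and the helper name `split_edges` are assumptions for illustration, since the paper does not specify the validation size.

```python
import numpy as np

def split_edges(edges, test_frac=0.15, val_frac=0.05, seed=0):
    """Randomly split graph edges into train/val/test sets.

    Follows the 85/15 train/test split quoted above; val_frac is carved
    out of the training portion (its exact size is an assumption, as
    the paper does not state it).
    """
    rng = np.random.default_rng(seed)
    edges = np.asarray(edges)
    idx = rng.permutation(len(edges))
    n_test = int(round(test_frac * len(edges)))
    n_val = int(round(val_frac * len(edges)))
    test = edges[idx[:n_test]]
    val = edges[idx[n_test:n_test + n_val]]
    train = edges[idx[n_test + n_val:]]
    return train, val, test
```

With 100 edges this yields 80 training, 5 validation, and 15 test edges, and the three sets are disjoint by construction.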