Position: Relational Deep Learning - Graph Representation Learning on Relational Databases

Authors: Matthias Fey, Weihua Hu, Kexin Huang, Jan Eric Lenssen, Rishabh Ranjan, Joshua Robinson, Rex Ying, Jiaxuan You, Jure Leskovec

ICML 2024

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | This position paper introduces Relational Deep Learning (RDL), a blueprint for end-to-end learning on relational databases. ... We also introduce RELBENCH, a benchmark and testing suite, demonstrating strong initial results.
Researcher Affiliation | Collaboration | Kumo.AI; Stanford University; Max Planck Institute for Informatics; Yale University; University of Illinois at Urbana-Champaign.
Pseudocode | Yes | Algorithm 1: Time-Consistent Computation Graph
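The idea behind a time-consistent computation graph is that a computation graph rooted at a seed entity at time t may only draw on rows whose timestamps are at or before t, which prevents temporal leakage during training. A minimal sketch of this constraint (illustrative row dictionaries and function names, not the paper's implementation):

```python
from datetime import datetime

def sample_neighbors(rows, seed_time, fanout=2):
    """Return up to `fanout` linked rows that already existed at `seed_time`.

    Rows with a timestamp after `seed_time` lie in the "future" relative to
    the seed entity and are excluded from the computation graph.
    """
    visible = [r for r in rows if r["timestamp"] <= seed_time]
    # Prefer the most recent interactions, a common sampling heuristic.
    visible.sort(key=lambda r: r["timestamp"], reverse=True)
    return visible[:fanout]

# Hypothetical order rows linked to one customer entity.
orders = [
    {"id": 1, "timestamp": datetime(2021, 1, 5)},
    {"id": 2, "timestamp": datetime(2021, 3, 1)},
    {"id": 3, "timestamp": datetime(2021, 6, 9)},
]

picked = sample_neighbors(orders, seed_time=datetime(2021, 4, 1))
# Orders 1 and 2 are visible at the seed time; order 3 is excluded.
```

Recursing this sampling step over foreign-key links yields the nested, time-filtered computation graph the algorithm describes.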
Open Source Code | Yes | Prototype implementation: RELBENCH, an open-source implementation of RDL based on PyTorch Frame for tabular learning (Hu et al., 2024) and PyTorch Geometric for graph neural networks (Fey & Lenssen, 2019). ... Website: https://relbench.stanford.edu/
Open Datasets | Yes | RELBENCH includes: (1) a general data loading library intended to make it easy to load relational databases ready for model training; (2) two initial databases, Amazon review records and Stack Exchange, with two predictive tasks for each database. ... Website: https://relbench.stanford.edu/
Dataset Splits | Yes | Every dataset in RELBENCH has a validation timestamp tval and a test timestamp ttest, shared across all tasks in the dataset. ... RELBENCH also provides default train and validation tables; the default validation table is constructed like the test table, but with the time window spanning tval to tval + δ.
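The temporal split described above can be sketched as a simple window assignment: labels before tval are training data, labels in [tval, tval + δ) are validation, and labels in [ttest, ttest + δ) are test. The variable names and concrete dates below are our own illustration, not values from the paper:

```python
from datetime import datetime, timedelta

def assign_split(label_time, t_val, t_test, delta):
    """Assign a label timestamp to a temporal split window."""
    if label_time < t_val:
        return "train"
    if t_val <= label_time < t_val + delta:
        return "val"
    if t_test <= label_time < t_test + delta:
        return "test"
    # Timestamps between the windows (or past the test window) are unused.
    return "excluded"

# Illustrative cutoffs and window size.
t_val = datetime(2021, 1, 1)
t_test = datetime(2021, 7, 1)
delta = timedelta(days=90)

assign_split(datetime(2020, 5, 1), t_val, t_test, delta)  # "train"
assign_split(datetime(2021, 2, 1), t_val, t_test, delta)  # "val"
assign_split(datetime(2021, 8, 1), t_val, t_test, delta)  # "test"
```

Because tval and ttest are fixed per database, this assignment is shared across every predictive task defined on it, as the review notes.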
Hardware Specification | No | The paper mentions 'mini-batching on GPUs' but does not specify GPU models, CPU models, or other detailed hardware used for experiments.
Software Dependencies | No | The paper states RELBENCH is 'based on PyTorch Frame for tabular learning (Hu et al., 2024) and PyTorch Geometric for graph neural networks (Fey & Lenssen, 2019)', citing the relevant papers, but it does not give version numbers for these or other software dependencies.
Experiment Setup | No | The paper describes the overall architecture and data-processing pipeline but does not report specific hyperparameter values (e.g., learning rate, batch size, number of epochs) or detailed system-level training settings in the main text.