Newton–Cotes Graph Neural Networks: On the Time Evolution of Dynamic Systems

Authors: Lingbing Guo, Weiqing Wang, Zhuo Chen, Ningyu Zhang, Zequn Sun, Yixuan Lai, Qiang Zhang, Huajun Chen

NeurIPS 2023

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Extensive experiments on several benchmarks empirically demonstrate consistent and significant improvement compared with the state-of-the-art methods. We conduct experiments on several datasets ranging from N-body systems to molecular dynamics and human motions [27, 29], with state-of-the-art methods as baselines [17, 19].
Researcher Affiliation | Collaboration | 1. College of Computer Science and Technology, Zhejiang University; 2. Zhejiang University - Ant Group Joint Laboratory of Knowledge Graph; 3. ZJU-Hangzhou Global Scientific and Technological Innovation Center; 4. Department of Data Science & AI, Monash University; 5. State Key Laboratory for Novel Software Technology, Nanjing University
Pseudocode | Yes | Algorithm 1: Newton-Cotes Graph Neural Network
Open Source Code | Yes | The source code and datasets are available at https://github.com/zjukg/NCGNN. We have also released the source code of NC (EqMotion) in our GitHub repository.
Open Datasets | Yes | We used the N-body simulation benchmark proposed in [19]. For molecular dynamics, we used MD17 [28], which consists of molecular dynamics simulations of eight different molecules. We also considered human motion prediction. Following [19], the dataset was constructed from 23 trials in the CMU Motion Capture Database [27].
Dataset Splits | No | The paper mentions training until "the loss on the validation set converges" (Algorithm 1) and tracking errors on "valid datasets" (Section 4.5). However, it does not explicitly specify the proportion or count of data used for the validation split, nor does it reference a source that defines the validation split.
Hardware Specification | Yes | Experiments were "conducted on a V100."
Software Dependencies | No | The paper mentions using the "Adam optimizer [53]" and adopting "layer normalization [54] and ReLU activation [55]". However, it does not provide specific version numbers for core software dependencies such as Python, PyTorch/TensorFlow, or other libraries used for implementation.
Experiment Setup | Yes | We list the main hyper-parameter setting of NC with different EGNN models on different datasets in Table 6. (Table 6 specifies details such as velocity regularization, epochs, batch size, learning rate, and optimizer.)
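For context on the method's namesake: Newton-Cotes formulas are classical quadrature rules that integrate a function from equispaced samples, which is the numerical scheme the paper builds its multi-step velocity integration on. The sketch below is purely illustrative and is not taken from the NCGNN repository; the helper names `newton_cotes_weights` and `integrate` are hypothetical.

```python
import numpy as np

def newton_cotes_weights(n):
    """Closed Newton-Cotes weights on n+1 equispaced nodes over [0, 1].

    Solved from the moment conditions: the weights must integrate the
    monomials 1, x, ..., x^n exactly, giving a Vandermonde linear system.
    """
    nodes = np.linspace(0.0, 1.0, n + 1)
    # Row k of V holds nodes**k, so (V @ w)[k] = sum_j w_j * nodes_j**k.
    V = np.vander(nodes, increasing=True).T
    # Exact integrals of x**k over [0, 1] are 1/(k+1).
    moments = 1.0 / np.arange(1, n + 2)
    return np.linalg.solve(V, moments)

def integrate(samples, dt):
    """Approximate the integral of equispaced samples over [0, n*dt]."""
    n = len(samples) - 1
    w = newton_cotes_weights(n)
    return n * dt * float(w @ np.asarray(samples))

# Sanity check: Simpson's rule (n = 2) is exact for v(t) = t**2,
# so integrating it over [0, 1] recovers 1/3.
t = np.linspace(0.0, 1.0, 3)
print(integrate(t**2, dt=0.5))  # ~0.3333
```

With `n = 1` this recovers the trapezoidal rule (weights 1/2, 1/2), which corresponds to the single-estimate integration that the paper's multi-point scheme generalizes.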