Inferring Relational Potentials in Interacting Systems
Authors: Armand Comas, Yilun Du, Christian Fernandez Lopez, Sandesh Ghimire, Mario Sznaier, Joshua B. Tenenbaum, Octavia Camps
ICML 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | In this section we first describe our datasets (Section 4.1) and baselines (Section 4.2). Next, in Section 4.3, we discuss experiments on (i.) recombination of interaction types across datasets and (ii.) the contribution of the potentials. In Section 4.4, we describe out-of-distribution sample detection experiments. We show how to incorporate test-time potentials in Section 4.5. Finally, we report quantitative results for trajectory forecasting in Section 4.6. |
| Researcher Affiliation | Academia | 1Northeastern University 2Massachusetts Institute of Technology. |
| Pseudocode | Yes | Algorithm 1 Training algorithm for NIIP. |
| Open Source Code | Yes | Website: https://energy-based-model.github.io/interaction-potentials. |
| Open Datasets | Yes | We test our model in three different domains. First, we carry out experiments in two simulated environments: (i.) particles connected by springs, and (ii.) particles with charges. Next, we test several properties of our model on (iii.) the NBA SportVU motion dataset, which contains real motion of tracked basketball players across several NBA games. Finally, we test our performance on (iv.) JPL Horizons, a physics-based realistic dataset. ... Following the experimental setting described in (Kipf et al., 2018), we generate states (position and velocity) of a dynamical system for N = 5 particles for 70 time-steps. |
| Dataset Splits | Yes | Simulated data. ... We generate 50k training samples and 10k for validation and test splits. ... NBA SportVU ... The dataset is composed of 50k samples for training and 1k samples for validation and test. ... JPL Horizons ... We gather 1880 trajectories of 43 timesteps split as 1504/188/188 for train, validation and test. |
| Hardware Specification | Yes | Hardware: For each of our experiments we used 1 GPU RTX 2080 Ti (Blower Edition) with 12.8GB of memory. |
| Software Dependencies | Yes | Software: We implemented this method using Ubuntu 18.04, Python 3.6, PyTorch 1.10, CUDA 11.2 and several additional libraries, which will be provided as an environment requirements file. |
| Experiment Setup | Yes | In all cases, NIIP uses the Adam optimizer and a learning rate of LR = 3e-4 with a scheduled decay of γ = 0.5 every 100k iterations. ... NIIP is trained with 2 energy functions and a latent size per edge potential of Dz = 64. We use hidden layers of size 256. We encode 49 time-steps into a set of potentials, fix 1 time-step from the ground-truth and predict the following 20. We use a number of sampling steps M = 5 and a step-size of λ = 0.4. We use a batch size of 40 and train for 500 epochs. |
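The learning-rate schedule quoted in the Experiment Setup row can be made concrete. The sketch below is illustrative only, assuming a standard step decay (LR multiplied by γ = 0.5 every 100k iterations, as the quoted text states); the helper name `lr_at` and the iteration counts are not from the paper.

```python
# Hedged sketch of the reported NIIP learning-rate schedule:
# base LR 3e-4, halved every 100k iterations (assumed multiplicative step decay,
# equivalent to torch.optim.lr_scheduler.StepLR with step_size=100_000, gamma=0.5).
BASE_LR = 3e-4
GAMMA = 0.5            # scheduled decay factor
DECAY_EVERY = 100_000  # iterations between decays

def lr_at(iteration: int) -> float:
    """Learning rate in effect after `iteration` training steps."""
    return BASE_LR * GAMMA ** (iteration // DECAY_EVERY)

print(lr_at(0))        # 3e-4 at the start of training
print(lr_at(100_000))  # halved after the first decay step
```

With the quoted batch size of 40 and 50k training samples, one epoch is 1,250 iterations, so the reported 500 epochs correspond to roughly 625k iterations and the schedule would fire about six times.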