Multiple Futures Prediction

Authors: Yichuan Charlie Tang, Ruslan Salakhutdinov

NeurIPS 2019 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | We demonstrate our algorithms by predicting vehicle trajectories of both simulated and real data, demonstrating the state-of-the-art results on several vehicle trajectory datasets.
Researcher Affiliation | Industry | Yichuan Charlie Tang, Apple Inc. (yichuan_tang@apple.com); Ruslan Salakhutdinov, Apple Inc. (rsalakhutdinov@apple.com)
Pseudocode | Yes | We provide a detailed training algorithm pseudocode in the supplementary materials.
Open Source Code | No | The paper does not provide a direct link or explicit statement about the public release of its source code for the described methodology.
Open Datasets | Yes | We demonstrate our algorithms by predicting vehicle trajectories of both simulated and real data, demonstrating the state-of-the-art results on several vehicle trajectory datasets. ... First, we first generate simulated trajectory data from the CARLA simulator [17]... We then experiment on a widely known standard dataset of real vehicle trajectories, the NGSIM [12] dataset. ... Finally, we also benchmark MFP with previously published results on the more recent large scale Argoverse motion forecasting dataset [9].
Dataset Splits | Yes | We experiment with the US-101 and I-80 datasets, and follow the experimental protocol of [16], where the datasets are split into 70% training, 10% validation, and 20% testing. (See the split sketch after the table.)
Hardware Specification | No | The paper does not provide specific details about the hardware (e.g., CPU, GPU models, memory) used for running the experiments.
Software Dependencies | No | The paper mentions using GRUs and the CARLA simulator, but does not provide specific version numbers for these or other software dependencies.
Experiment Setup | Yes | We extract 8 seconds trajectories, using the first 3 seconds as history to predict 5 seconds into the future. ... We trained MFP (with 1 to 5 modes) on the Town01 training set for 200K updates, with minibatch size 8. (See the windowing sketch after the table.)
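The Dataset Splits row quotes a 70% / 10% / 20% train / validation / test protocol. The following is a minimal sketch of such a split, not the authors' code: the shuffling, the random seed, and the list-of-arrays representation of trajectories are illustrative assumptions.

```python
import numpy as np

def split_trajectories(trajectories, train_frac=0.7, val_frac=0.1, seed=0):
    """Shuffle trajectories and split them into 70% train / 10% val / 20% test."""
    rng = np.random.default_rng(seed)          # assumed seed, for reproducibility of the sketch only
    order = rng.permutation(len(trajectories))
    n_train = int(train_frac * len(trajectories))
    n_val = int(val_frac * len(trajectories))
    train = [trajectories[i] for i in order[:n_train]]
    val = [trajectories[i] for i in order[n_train:n_train + n_val]]
    test = [trajectories[i] for i in order[n_train + n_val:]]
    return train, val, test
```

The paper follows the protocol of [16] for US-101 and I-80; whether that protocol shuffles at the trajectory level or splits by time segment is not stated in the quoted excerpt, so the shuffling here is only one plausible reading.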
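The Experiment Setup row describes 8-second trajectories split into a 3-second history and a 5-second future. The sketch below shows one way to window a trajectory accordingly; the 10 Hz sampling rate, the (T, 2) x/y array layout, and the function name are assumptions not specified in the quoted excerpt.

```python
import numpy as np

def make_history_future_pairs(trajectory, hz=10, history_sec=3, future_sec=5):
    """Slide an 8-second window over a (T, 2) array of positions and split
    each window into a 3-second history and a 5-second future."""
    hist_len = history_sec * hz     # e.g. 30 frames of history at an assumed 10 Hz
    fut_len = future_sec * hz       # e.g. 50 frames of future at an assumed 10 Hz
    window = hist_len + fut_len     # 8-second window in total
    pairs = []
    for start in range(trajectory.shape[0] - window + 1):
        history = trajectory[start:start + hist_len]
        future = trajectory[start + hist_len:start + window]
        pairs.append((history, future))
    return pairs
```

The remaining training details quoted in that row (1 to 5 modes, 200K updates, minibatch size 8 on the CARLA Town01 set) concern the optimization loop and are not covered by this windowing sketch.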