Generative Attention Networks for Multi-Agent Behavioral Modeling

Authors: Guangyu Li, Bo Jiang, Hao Zhu, Zhengping Che, Yan Liu
Pages: 7195-7202

AAAI 2020

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | We conduct extensive quantitative and qualitative analysis of GAMAN on three inherently different multi-agent systems, namely: a spring-ball system, basketball games, and traffic on a university campus, illustrated in Figure 1. Experiments show that GAMAN supports accurate predictions as well as system identification, in which hidden properties are abducted from observations, in all three domains.
Researcher Affiliation | Collaboration | (1) University of Southern California, Los Angeles, US; (2) AI Labs, Didi Chuxing, Beijing, China; (3) Peking University, Beijing, China
Pseudocode | Yes | Algorithm 1: Generation Network of GAMAN in a K-agent system with a mixture-of-Gaussians emission distribution. Algorithm 2: Learning GAMAN with stochastic backpropagation and the Adam optimizer (Kingma and Ba 2014).
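The algorithms themselves are not reproduced in this report. As a rough illustration of the mixture-of-Gaussians emission step the table mentions, here is a minimal one-dimensional sketch; the function name and parameters are hypothetical and not taken from the paper:

```python
import random

def sample_mog(weights, means, stds, rng=None):
    """Draw one sample from a 1-D mixture of Gaussians:
    first pick a component index by mixture weight,
    then sample from that component's Gaussian."""
    rng = rng or random.Random()
    k = rng.choices(range(len(weights)), weights=weights)[0]
    return rng.gauss(means[k], stds[k])

# Example: two well-separated components with equal weight.
rng = random.Random(0)
samples = [sample_mog([0.5, 0.5], [-2.0, 2.0], [0.1, 0.1], rng)
           for _ in range(500)]
```

In GAMAN the mixture parameters would be produced by the generation network per agent and time step; here they are fixed constants purely for illustration.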
Open Source Code | No | The paper does not explicitly state that source code for the described methodology is provided, nor does it include a link to a code repository.
Open Datasets | Yes | We conduct experiments on one simulated physical dataset, Spring-Ball, and two real-world large-scale multi-agent datasets, Stanford Drone (Robicquet et al. 2016) and NBA (Linou 2016).
Dataset Splits | Yes | We randomly split each dataset into the training/validation/test set with the ratio of 7:1:2, choose the best model weights on the validation set, and report the performance on the test set.
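The quoted 7:1:2 random split can be sketched as follows. This is a minimal illustration only; the paper does not publish its splitting code or random seed, so the function name, seed, and exact procedure here are assumptions:

```python
import random

def split_dataset(samples, ratios=(0.7, 0.1, 0.2), seed=0):
    """Randomly partition samples into train/val/test at the given ratios."""
    idx = list(range(len(samples)))
    random.Random(seed).shuffle(idx)
    n_train = int(len(samples) * ratios[0])
    n_val = int(len(samples) * ratios[1])
    train = [samples[i] for i in idx[:n_train]]
    val = [samples[i] for i in idx[n_train:n_train + n_val]]
    test = [samples[i] for i in idx[n_train + n_val:]]
    return train, val, test

# Example: 100 samples split 70/10/20.
train, val, test = split_dataset(list(range(100)))
```

Fixing the seed makes the split reproducible across runs, which is the property the reviewed criterion is checking for.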
Hardware Specification | No | The paper does not specify the hardware used for the experiments (e.g., GPU models, CPU types, memory).
Software Dependencies | No | The paper mentions pykalman (Duckworth 2013) and Adam (Kingma and Ba 2014) but does not provide version numbers for these or other key software components or libraries (e.g., Python, PyTorch, TensorFlow).
Experiment Setup | Yes | All sizes of hidden layers are 32 and the model is optimized with Adam (Kingma and Ba 2014).
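The cited optimizer is the standard Adam update of Kingma and Ba (2014). A minimal scalar sketch of that update rule, using Adam's published default hyperparameters (the paper does not state which values it used):

```python
import math

def adam_step(w, g, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update (Kingma and Ba 2014) for a single scalar weight w
    with gradient g, running moments m and v, at step t (1-indexed)."""
    m = b1 * m + (1 - b1) * g          # first-moment (mean) estimate
    v = b2 * v + (1 - b2) * g * g      # second-moment (uncentered variance)
    m_hat = m / (1 - b1 ** t)          # bias correction
    v_hat = v / (1 - b2 ** t)
    w = w - lr * m_hat / (math.sqrt(v_hat) + eps)
    return w, m, v

# Example: minimize f(w) = w**2, whose gradient is 2*w.
w, m, v = 1.0, 0.0, 0.0
for t in range(1, 501):
    w, m, v = adam_step(w, 2 * w, m, v, t)
```

In practice one would use a library implementation (e.g. a deep-learning framework's Adam optimizer) over all model parameters at once; this scalar version only shows the update the citation refers to.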