Hierarchical Relational Inference
Authors: Aleksandar Stanić, Sjoerd van Steenkiste, Jürgen Schmidhuber
AAAI 2021, pp. 9730–9738
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | In this section we evaluate HRI on four different dynamics modelling tasks: state trajectories of objects connected via finite-length springs in a hierarchical structure (state-springs); corresponding rendered videos (visual-springs); rendered joints of human moving bodies (Human3.6M); and raw videos of moving humans (KTH). We compare HRI to NRI (Kipf et al. 2018), which performs relational inference but lacks a hierarchical inductive bias, and to an LSTM (Hochreiter and Schmidhuber 1997) that concatenates representations from all objects and predicts them jointly, but lacks a relational inference mechanism altogether. Appendix A contains all experimental details. Reported results are mean and standard deviations over 5 seeds. |
| Researcher Affiliation | Academia | Aleksandar Stanić, Sjoerd van Steenkiste, Jürgen Schmidhuber, Swiss AI Lab IDSIA, USI, SUPSI, Lugano, Switzerland {aleksandar, sjoerd, juergen}@idsia.ch |
| Pseudocode | No | The paper describes its method using mathematical equations and prose but does not include any explicit pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not provide any explicit statements or links indicating that the source code for the described methodology is publicly available. |
| Open Datasets | Yes | Additionally, we consider Human3.6M (Ionescu et al. 2013) (using rendered joints) and KTH (Schuldt, Laptev, and Caputo 2004) (using raw videos). |
| Dataset Splits | No | The paper states 'All are trained end-to-end in an unsupervised manner' and mentions training stages, but it does not provide specific percentages or counts for training, validation, and test dataset splits in the main text. |
| Hardware Specification | Yes | We are also grateful to NVIDIA Corporation for donating several DGX machines to our lab and to IBM for donating a Minsky machine. |
| Software Dependencies | No | The paper mentions types of neural networks used (CNNs, RNNs, GNNs, LSTM) and standard components, but it does not provide specific version numbers for software dependencies or libraries like Python, PyTorch, or TensorFlow in the main text. |
| Experiment Setup | No | The paper refers to 'Appendix A contains all experimental details', but the main text does not provide specific hyperparameter values (e.g., learning rate, batch size, number of epochs) or detailed system-level training configurations. |