End-to-End Differentiable Physics for Learning and Control
Authors: Filipe de Avila Belbute-Peres, Kevin Smith, Kelsey Allen, Josh Tenenbaum, J. Zico Kolter
NeurIPS 2018 | Conference PDF | Archive PDF | Plain Text | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Through experiments in diverse domains, we highlight the system's ability to learn physical parameters from data, efficiently match and simulate observed visual behavior, and readily enable control via gradient-based planning methods. Code for the engine and experiments is included with the paper. |
| Researcher Affiliation | Collaboration | Filipe de A. Belbute-Peres, School of Computer Science, Carnegie Mellon University, Pittsburgh, PA 15213, filiped@cs.cmu.edu; Kevin A. Smith, Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, MA 02139, k2smith@mit.edu; Kelsey R. Allen, Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, MA 02139, krallen@mit.edu; Joshua B. Tenenbaum, Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, MA 02139, jbt@mit.edu; J. Zico Kolter, School of Computer Science, Carnegie Mellon University and Bosch Center for Artificial Intelligence, Pittsburgh, PA 15213, zkolter@cs.cmu.edu |
| Pseudocode | No | The paper does not contain structured pseudocode or algorithm blocks in the main text. It refers to appendices for detailed descriptions ('A detailed description of the physics engine architecture is presented in Appendix A due to space constraints. The LCP solution and the gradients are presented in more detail in Appendix B.'), but these are not within the scope of the main text analysis for pseudocode. |
| Open Source Code | Yes | Code for the engine and experiments is available at https://github.com/locuslab/lcp-physics. |
| Open Datasets | No | The paper states 'To test our approach on a benchmark for visual physical prediction, we generated a dataset of videos of billiard ball-like scenarios using the code from [Fragkiadaki et al., 2015].' It does not provide concrete access information (link, DOI, repository, or specific citation for access) for this generated dataset. While it mentions using OpenAI Gym environments, these are environments, not the specific training dataset used for the experiments. |
| Dataset Splits | Yes | Simulations lasting 10 seconds were generated, totalling 8,000 trials for training, 1,000 for validation and 1,000 for testing. |
| Hardware Specification | No | The paper does not provide specific hardware details (e.g., CPU/GPU models, memory amounts, or detailed computer specifications) used for running its experiments. It only mentions implementation in PyTorch and using 'autograd automatic differentiation graph functionality' without hardware specifics. |
| Software Dependencies | No | The paper mentions that 'The physics engine is implemented in PyTorch [Paszke et al., 2017]', but it does not specify a version number for PyTorch or any other software dependencies, which is required for reproducibility. |
| Experiment Setup | Yes | Gradients are clipped to a maximum absolute value of 100 and then used to perform gradient descent on the value of the mass, with a learning rate of 0.01. ... The architecture used is a mirror of the VGG encoder network, with transposed convolutions in the place of convolutions and bilinear upsampling layers in the place of the maxpooling ones. ... The iLQR is set up with a time-horizon of 5 frames for both tasks. |
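The mass-learning setup quoted above (gradient descent on the mass with gradients clipped to a maximum absolute value of 100 and a learning rate of 0.01) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the toy `simulate_position` function and the `true_mass` value stand in for the differentiable physics engine, and the analytic gradient replaces PyTorch autograd.

```python
# Hedged sketch of the quoted optimization setup: gradient descent on an
# unknown mass with gradient clipping to |100| and learning rate 0.01.
# The simulator, loss, and true_mass below are illustrative assumptions.

def simulate_position(mass, force=10.0, t=1.0):
    # Toy stand-in for the differentiable engine: x = 0.5 * (F / m) * t^2
    return 0.5 * (force / mass) * t ** 2

def loss_and_grad(mass, observed, force=10.0, t=1.0):
    # Squared error between simulated and observed positions,
    # with its analytic gradient w.r.t. the mass.
    pred = simulate_position(mass, force, t)
    err = pred - observed
    grad = 2.0 * err * (-0.5 * force * t ** 2 / mass ** 2)
    return err ** 2, grad

def fit_mass(observed, mass=1.0, lr=0.01, clip=100.0, steps=2000):
    for _ in range(steps):
        _, g = loss_and_grad(mass, observed)
        g = max(-clip, min(clip, g))   # clip gradient to |100|
        mass -= lr * g                 # gradient descent, lr = 0.01
    return mass

true_mass = 2.5  # assumed ground truth for this toy example
observed = simulate_position(true_mass)
est = fit_mass(observed)
```

In the paper, `observed` would come from simulated or visual trajectory data and the gradient would be obtained by differentiating through the LCP-based physics engine rather than analytically.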