🏘️ ProcTHOR: Large-Scale Embodied AI Using Procedural Generation
Authors: Matt Deitke, Eli VanderBilt, Alvaro Herrasti, Luca Weihs, Kiana Ehsani, Jordi Salvador, Winson Han, Eric Kolve, Aniruddha Kembhavi, Roozbeh Mottaghi
NeurIPS 2022 | Conference PDF | Archive PDF | Plain Text | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We demonstrate the power and potential of PROCTHOR via a sample of 10,000 generated houses and a simple neural model. Models trained using only RGB images on PROCTHOR, with no explicit mapping and no human task supervision, produce state-of-the-art results across 6 embodied AI benchmarks for navigation, rearrangement, and arm manipulation... |
| Researcher Affiliation | Collaboration | PRIOR @ Allen Institute for AI, University of Washington, Seattle |
| Pseudocode | No | The paper describes the procedural generation process through text and diagrams (e.g., Figure 2), but does not include explicit pseudocode or algorithm blocks. |
| Open Source Code | Yes | PROCTHOR will be open-sourced and the code used in this work will be released. |
| Open Datasets | Yes | We demonstrate the power and potential of PROCTHOR using a sampled set of 10,000 fully interactive houses obtained by the procedural generation process described in Section 3 which we label PROCTHOR-10K. An additional set of 1,000 validation and 1,000 testing houses are available for evaluation. |
| Dataset Splits | Yes | An additional set of 1,000 validation and 1,000 testing houses are available for evaluation. Asset splits across train/val/test are detailed in the Appendix. |
| Hardware Specification | Yes | This set of 10K houses was generated in 1 hour on a local workstation with 4 NVIDIA RTX A5000 GPUs. Experiments were run on a server with 8 NVIDIA Quadro RTX 8000 GPUs. |
| Software Dependencies | No | The paper lists numerous open-source packages used (e.g., 'PyTorch [41]', 'NumPy [23]', 'TensorFlow [1]') but does not provide specific version numbers for these software dependencies, only citations to their original papers or general project pages. |
| Experiment Setup | Yes | All models are trained with the AllenAct [56] framework; see the Appendix for training details. |
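The dataset and split details above correspond to the publicly released PROCTHOR-10K houses. As a minimal sketch of how those splits can be loaded and instantiated in simulation, the snippet below uses the `prior` dataset loader and `ai2thor` Python packages from the Allen Institute release; the `load_dataset("procthor-10k")` call and the split names `train`/`val`/`test` are taken from that release rather than from the paper itself, which only states the split sizes (10,000 / 1,000 / 1,000).

```python
# Sketch: load the ProcTHOR-10K house splits and spawn one generated house
# in AI2-THOR. Package names and call signatures follow the public
# ProcTHOR-10K release and are assumptions, not quoted from the paper.
import prior                              # Allen Institute dataset loader
from ai2thor.controller import Controller

# 10,000 train houses plus the 1,000 validation and 1,000 test houses
# mentioned in the paper.
dataset = prior.load_dataset("procthor-10k")

# Each entry is a JSON house specification that AI2-THOR can load directly.
house = dataset["train"][0]
controller = Controller(scene=house)

# Step the agent once and inspect its pose to confirm the house loaded.
event = controller.step(action="RotateRight")
print(event.metadata["agent"]["position"])
```

Evaluating on the held-out houses then only requires swapping the split key (e.g., `dataset["val"]` or `dataset["test"]`), mirroring the train/val/test separation described in the table.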