Deep SE(3)-Equivariant Geometric Reasoning for Precise Placement Tasks
Authors: Ben Eisner, Yi Yang, Todor Davchev, Mel Vecerik, Jonathan Scholz, David Held
ICLR 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We demonstrate that our method can yield substantially more precise predictions in simulated placement tasks than previous methods trained with the same amount of data, and can accurately represent relative placement relationships in data collected from real-world demonstrations. |
| Researcher Affiliation | Collaboration | Ben Eisner (Carnegie Mellon University); Yi Yang, Todor Davchev, Mel Vecerik, Jon Scholz (Google DeepMind); David Held (Carnegie Mellon University) |
| Pseudocode | Yes | Algorithm 1 Multilateration (MUL); Algorithm 2 Least-squares solution to the Procrustes problem (PRO) |
| Open Source Code | Yes | Supplementary information and videos can be found at this URL. |
| Open Datasets | Yes | We select 5 manipulation tasks from the RLBench benchmark (James et al. (2020)...); we evaluate our method on the NDF relative placement tasks, as proposed in Simeonov et al. (2022). |
| Dataset Splits | No | The paper describes training and testing procedures but does not explicitly mention a separate validation split with specific percentages or counts. |
| Hardware Specification | No | No specific hardware details (e.g., GPU/CPU models, memory) used for running experiments are provided. Figure 9 shows a robot arm, but no specifications for the computing hardware are given. |
| Software Dependencies | No | The paper mentions 'Pytorch implementations' of network architectures but does not specify version numbers for PyTorch or any other software dependencies. |
| Experiment Setup | Yes | In our results, we report a model where K = 256... We use identical supervision as proposed in Pan et al. (2023), combining the Point Displacement Loss, Direct Correspondence Loss, and Correspondence Consistency Loss. See Pan et al. (2023) for details. |
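The two pseudocode routines noted above (multilateration and the least-squares Procrustes solution) are both classical algorithms. The sketch below illustrates textbook versions of each, not the paper's exact implementation: multilateration is linearized by subtracting one distance equation from the rest, and the Procrustes problem is solved via SVD of the cross-covariance (the Kabsch approach). Function names, array shapes, and the NumPy-based formulation are all assumptions for illustration.

```python
import numpy as np

def multilaterate(anchors, dists):
    """Recover a 3D point from distances to known anchor points.

    Linearize ||x - a_i||^2 = d_i^2 by subtracting the i = 0 equation,
    which cancels the quadratic term ||x||^2, leaving a linear system.
    anchors: (N, 3) array, dists: (N,) array, N >= 4.
    """
    a0, d0 = anchors[0], dists[0]
    A = 2.0 * (anchors[1:] - a0)                          # (N-1, 3)
    b = (d0**2 - dists[1:]**2
         + np.sum(anchors[1:]**2, axis=1) - np.sum(a0**2))
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x

def procrustes(P, Q):
    """Least-squares rigid alignment (Kabsch): find R in SO(3) and t
    minimizing sum_i ||R @ P[i] + t - Q[i]||^2 for paired point sets.
    """
    p_bar, q_bar = P.mean(axis=0), Q.mean(axis=0)
    H = (P - p_bar).T @ (Q - q_bar)                       # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # Sign correction keeps R a proper rotation (det = +1, no reflection).
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = q_bar - R @ p_bar
    return R, t
```

Both routines return closed-form least-squares solutions, which is why such geometric layers can be made differentiable and embedded in an end-to-end trained network.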