Towards Learning Geometric Eigen-Lengths Crucial for Fitting Tasks

Authors: Yijia Weng, Kaichun Mo, Ruoxi Shi, Yanchao Yang, Leonidas Guibas

ICML 2023 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | We explore potential solutions and demonstrate the feasibility of learning eigen-lengths from simply observing successful and failed fitting trials. We also attempt geometric grounding for more accurate eigen-length measurement and study the reusability of the learned eigen-lengths across multiple tasks.
Researcher Affiliation | Collaboration | 1 Stanford University, 2 NVIDIA Research, 3 Shanghai Jiao Tong University, 4 HKU.
Pseudocode | No | The paper describes network architectures and processes, but does not include any clearly labeled pseudocode or algorithm blocks.
Open Source Code | No | The paper mentions a 'Project page: https://yijiaweng.github.io/geoeigen-length', but a project page is typically a demonstration or information page, and the paper does not explicitly state that it hosts the source code for the described methodology.
Open Datasets | Yes | For objects to be fitted in tasks (a)-(e), we use 1200 common household object models from 8 training and 4 testing categories in ShapeNet (Chang et al., 2015), following Mo et al. (2021b).
Dataset Splits | No | The paper states 'we generated 75k training and 20k testing environment-object pairs' but does not explicitly mention a separate validation split or its size.
Hardware Specification | Yes | All experiments are run on a single NVIDIA TITAN X GPU.
Software Dependencies | No | The paper states 'All networks are implemented using PyTorch', but it does not provide a specific version number for PyTorch or any other software dependency.
Experiment Setup | Yes | All networks are implemented using PyTorch and optimized by the Adam optimizer, with a learning rate starting at 1e-3 and decayed by half every 10 epochs. Each batch contains 32 data points; each epoch contains around 1600 batches. We train models for 100 epochs on all tasks.
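The reported schedule (Adam, initial learning rate 1e-3, halved every 10 epochs over 100 epochs) can be sketched as a step-decay function. This is a minimal stdlib-only illustration of the stated hyperparameters, not the authors' code; in PyTorch the same schedule would typically be expressed with `torch.optim.Adam(params, lr=1e-3)` and `torch.optim.lr_scheduler.StepLR(optimizer, step_size=10, gamma=0.5)`.

```python
def lr_at_epoch(epoch, base_lr=1e-3, step_size=10, gamma=0.5):
    """Step-decay learning rate as described in the paper's setup:
    start at base_lr, multiply by gamma every step_size epochs."""
    return base_lr * gamma ** (epoch // step_size)

# Schedule over the reported 100-epoch training run:
# epoch 0-9   -> 1e-3
# epoch 10-19 -> 5e-4
# epoch 90-99 -> 1e-3 * 0.5**9
```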