GEOMetrics: Exploiting Geometric Structure for Graph-Encoded Objects
Authors: Edward Smith, Scott Fujimoto, Adriana Romero, David Meger
ICML 2019
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Our proposed method is evaluated on the task of 3D object reconstruction from images with the ShapeNet dataset, where we demonstrate state-of-the-art performance, both visually and numerically, while having far smaller space requirements by generating adaptive meshes. |
| Researcher Affiliation | Collaboration | 1 Department of Computer Science, McGill University, Montreal, Canada; 2 Mila Québec AI Institute; 3 Facebook AI Research. |
| Pseudocode | No | The paper states "An algorithmic description of the entire process is provided in the supplementary material." and "an algorithmic description can be found in the supplementary material.", indicating that pseudocode appears only in the supplementary material, not in the main paper. |
| Open Source Code | Yes | "Code for our system is publicly available on a GitHub repository, to ensure reproducible experimental comparison." Repository: https://github.com/EdwardSmith1884/GEOMetrics |
| Open Datasets | Yes | We evaluate on this task across 13 classes of the ShapeNet (Chang et al., 2015) dataset. |
| Dataset Splits | Yes | The data in each class was then split into a training, validation and test set with a ratio of 70:10:20, respectively. |
| Hardware Specification | No | The paper does not explicitly mention specific hardware details like GPU or CPU models used for running its experiments. |
| Software Dependencies | No | The paper mentions "Adam optimizer" and "deep CNN" but does not provide specific software library names with version numbers for reproducibility. |
| Experiment Setup | Yes | We train the full system on each class in our dataset with the Adam optimizer (Kingma & Ba, 2014), at a learning rate of 10^-4 for 300k iterations, and then again for 150k iterations at a learning rate of 10^-5, with a minibatch size of 5. ... The hyper-parameter settings used, as described in Eq. (10), are γ1 = 0.001, γ2 = 1, γ3 = 0.3, and γ4 = 1. |
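The training details in the table (two-phase learning-rate schedule and the four loss weights from Eq. (10)) can be sketched as plain Python. The function names `total_loss` and `lr_at` and the individual loss-term arguments are illustrative assumptions, not the authors' code; only the numeric values come from the paper.

```python
# Loss weights gamma_1..gamma_4 as reported for Eq. (10) of the paper.
GAMMAS = (0.001, 1.0, 0.3, 1.0)

def total_loss(l1, l2, l3, l4, gammas=GAMMAS):
    """Weighted sum of the four loss terms (hypothetical signature)."""
    g1, g2, g3, g4 = gammas
    return g1 * l1 + g2 * l2 + g3 * l3 + g4 * l4

def lr_at(iteration):
    """Two-phase schedule: 1e-4 for the first 300k iterations,
    then 1e-5 for the remaining 150k."""
    return 1e-4 if iteration < 300_000 else 1e-5

print(total_loss(1.0, 1.0, 1.0, 1.0))  # 2.301
print(lr_at(0), lr_at(300_000))        # 0.0001 1e-05
```

In a real run these values would be passed to an Adam optimizer with minibatch size 5, per the quoted setup.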
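The 70:10:20 train/validation/test split quoted above can be reproduced with a minimal helper. This is a sketch under the assumption of a simple shuffled split per class; the paper does not publish its splitting code, so the function and seed here are hypothetical.

```python
import random

def split_dataset(items, seed=0):
    """Shuffle and split samples into train/val/test at 70:10:20
    (hypothetical helper matching the ratio reported in the paper)."""
    rng = random.Random(seed)
    items = list(items)
    rng.shuffle(items)
    n = len(items)
    n_train = int(0.7 * n)
    n_val = int(0.1 * n)
    train = items[:n_train]
    val = items[n_train:n_train + n_val]
    test = items[n_train + n_val:]
    return train, val, test

train, val, test = split_dataset(range(1000))
print(len(train), len(val), len(test))  # 700 100 200
```

Applied per ShapeNet class, this yields the reported ratio exactly when the class size is divisible by 10.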