Learning Part Generation and Assembly for Structure-Aware Shape Synthesis
Authors: Jun Li, Chengjie Niu, Kai Xu (pp. 11362-11369)
AAAI 2020
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We demonstrate through both qualitative and quantitative evaluations that PAGENet generates 3D shapes with plausible, diverse and detailed structure |
| Researcher Affiliation | Academia | Jun Li, Chengjie Niu, Kai Xu*, National University of Defense Technology |
| Pseudocode | No | The paper describes the network architecture and training details in text and figures, but does not provide structured pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not explicitly state that source code for the methodology is available or provide a link to a code repository. |
| Open Datasets | Yes | We train and test our model on the ShapeNet part dataset (Yi et al. 2016), which is a subset of the ShapeNet dataset (Chang et al. 2015) and provides consistent alignment and semantic labeling for all shapes. |
| Dataset Splits | Yes | The dataset is divided into two parts, according to the official training/test split. |
| Hardware Specification | No | The paper mentions training times ('average training time is 12 hours for each part generator and 7 hours for part assembler') but does not specify any hardware details like GPU or CPU models. |
| Software Dependencies | No | The paper mentions using ADAM for optimization and WGAN-GP for adversarial training, but does not provide specific version numbers for any software, libraries, or frameworks used. |
| Experiment Setup | Yes | For all modules, we use ADAM (β = 0.5) for network optimization with an initial learning rate of 0.001. Batch size is set to 32. The parameters in the loss in Equation (1) are set as α1 = 2 and α2 = 1 × 10^-3 for all experiments. The λ in Equation (2) is set to 10 as in (Gulrajani et al. 2017). |
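
The optimization settings quoted in the Experiment Setup row map onto a standard training configuration. The following is a minimal sketch assuming PyTorch (the paper does not name a framework); the stand-in generator module, the `recon_term`/`reg_term` names, and the ADAM β2 value are illustrative assumptions, not details taken from the paper.

```python
import torch

# Hyperparameters as reported in the Experiment Setup row above.
LR = 1e-3          # initial learning rate
BETA1 = 0.5        # ADAM beta_1; beta_2 = 0.999 is the PyTorch default, assumed here
BATCH_SIZE = 32
ALPHA1 = 2.0       # alpha_1 weight in the loss of Equation (1)
ALPHA2 = 1e-3      # alpha_2 weight in the loss of Equation (1)

# Hypothetical stand-in for one per-part generator; the real architecture is
# described in the paper, not reproduced here.
part_generator = torch.nn.Linear(128, 32 ** 3)

optimizer = torch.optim.Adam(part_generator.parameters(), lr=LR, betas=(BETA1, 0.999))

def weighted_loss(recon_term: torch.Tensor, reg_term: torch.Tensor) -> torch.Tensor:
    # Weighted combination corresponding to Equation (1); the individual terms
    # (reconstruction and regularization) are defined in the paper.
    return ALPHA1 * recon_term + ALPHA2 * reg_term
```

In this reading, each per-part generator and the part assembler would be trained with its own ADAM optimizer under the same settings.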
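
The WGAN-GP adversarial training noted in the Software Dependencies and Experiment Setup rows uses the gradient penalty of Gulrajani et al. (2017) with λ = 10. Below is a minimal sketch of that penalty, again assuming PyTorch; `discriminator`, `real`, and `fake` are hypothetical names for the part discriminator and its real/generated voxel batches.

```python
import torch

LAMBDA_GP = 10.0  # the lambda of Equation (2), set to 10 following Gulrajani et al. (2017)

def gradient_penalty(discriminator, real, fake):
    """Standard WGAN-GP penalty on random interpolations between real and generated samples."""
    # One interpolation coefficient per sample, broadcast over the remaining dimensions.
    eps_shape = [real.size(0)] + [1] * (real.dim() - 1)
    eps = torch.rand(eps_shape, device=real.device)
    interpolates = (eps * real.detach() + (1.0 - eps) * fake.detach()).requires_grad_(True)
    d_out = discriminator(interpolates)
    grads = torch.autograd.grad(
        outputs=d_out,
        inputs=interpolates,
        grad_outputs=torch.ones_like(d_out),
        create_graph=True,
        retain_graph=True,
    )[0]
    grads = grads.reshape(grads.size(0), -1)
    return LAMBDA_GP * ((grads.norm(2, dim=1) - 1.0) ** 2).mean()
```

The penalty term is added to the discriminator loss at each update; the generator loss is unchanged, as in the standard WGAN-GP formulation.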