Contrast with Reconstruct: Contrastive 3D Representation Learning Guided by Generative Pretraining
Authors: Zekun Qi, Runpei Dong, Guofan Fan, Zheng Ge, Xiangyu Zhang, Kaisheng Ma, Li Yi
ICML 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | RECON achieves a new state-of-the-art in 3D representation learning, e.g., 91.26% accuracy on ScanObjectNN. |
| Researcher Affiliation | Collaboration | 1 Xi'an Jiaotong University, 2 IIISCT, 3 MEGVII Technology, 4 Tsinghua University, 5 Shanghai AI Laboratory, 6 Shanghai Qi Zhi Institute. |
| Pseudocode | No | No explicitly labeled 'Pseudocode' or 'Algorithm' block, figure, or section was found. The methodology is described in text and mathematical formulas. |
| Open Source Code | Yes | Codes have been released at https://github.com/qizekun/ReCon. |
| Open Datasets | Yes | ShapeNet (Chang et al., 2015) is used to pretrain RECON, which contains 51K unique 3D CAD models covering 55 object categories. |
| Dataset Splits | No | No explicit statement of training/validation/test dataset splits (e.g., percentages, sample counts) was found. The paper references standard benchmark datasets (ScanObjectNN, ModelNet40) and mentions using their 'test split' for evaluation, but does not detail the splits themselves. |
| Hardware Specification | Yes | GPU devices: PH402 SKU 200; 3× RTX 2080Ti. |
| Software Dependencies | No | No specific software dependencies with version numbers (e.g., 'PyTorch 1.9', 'CUDA 11.1') were explicitly stated. |
| Experiment Setup | Yes | Table 7 (training recipes for pretraining and downstream fine-tuning), columns ShapeNet / ScanObjectNN / ModelNet / ShapeNetPart: optimizer AdamW (all); learning rate 5e-4 / 2e-5 / 1e-5 / 2e-4; weight decay 5e-2 (all); cosine learning rate scheduler (all); training epochs 300 (all); warmup epochs 10 (all); batch size 128 / 32 / 32 / 16; drop path rate 0.1 / 0.2 / 0.2 / 0.1. |
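The Table 7 recipe (cosine schedule with warmup epochs) can be reproduced from the reported numbers alone. The sketch below is a minimal, framework-agnostic illustration using the ShapeNet pretraining column (base learning rate 5e-4, 300 epochs, 10 warmup epochs); the paper does not state its software stack, so the function shape and names here are assumptions, not the authors' implementation.

```python
import math

def lr_at_epoch(epoch, base_lr=5e-4, total_epochs=300, warmup_epochs=10, min_lr=0.0):
    """Cosine learning-rate schedule with linear warmup.

    Defaults follow the ShapeNet pretraining column of Table 7;
    min_lr=0.0 is an assumption (the paper does not report a floor).
    """
    if epoch < warmup_epochs:
        # Linear warmup: ramp from base_lr/warmup_epochs up to base_lr.
        return base_lr * (epoch + 1) / warmup_epochs
    # Cosine decay from base_lr down to min_lr over the remaining epochs.
    progress = (epoch - warmup_epochs) / (total_epochs - warmup_epochs)
    return min_lr + 0.5 * (base_lr - min_lr) * (1 + math.cos(math.pi * progress))
```

For a downstream fine-tuning run (e.g., ScanObjectNN), only the defaults change: `lr_at_epoch(epoch, base_lr=2e-5)` with the same 300/10 epoch schedule.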