$p$-Poisson surface reconstruction in curl-free flow from point clouds

Authors: Yesom Park, Taekyung Lee, Jooyoung Hahn, Myungjoo Kang

NeurIPS 2023 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Experiments on standard benchmark datasets show that the proposed INR provides a superior and robust reconstruction.
Researcher Affiliation | Academia | 1) Department of Mathematical Sciences, Seoul National University; 2) Interdisciplinary Program in Artificial Intelligence, Seoul National University; 3) Department of Mathematics and Descriptive Geometry, Slovak University of Technology in Bratislava
Pseudocode | No | The paper does not include a clearly labeled pseudocode or algorithm block.
Open Source Code | Yes | The code is available at https://github.com/Yebbi/PINC.
Open Datasets | Yes | We leverage two widely used benchmark datasets to evaluate the proposed model for surface reconstruction: Surface Reconstruction Benchmark (SRB) [7] and Thingi10K [65].
Dataset Splits | No | The paper describes random sampling of points during training but does not provide explicit train/validation/test splits, percentages, or citations of standard splits needed to reproduce the data partitioning (see the sampling sketch after the table).
Hardware Specification | Yes | We implement all numerical experiments on a single NVIDIA RTX 3090 GPU.
Software Dependencies | No | The paper mentions software such as an autodifferentiation library (autograd) and the IGR, SIREN, and DiGS codebases, but does not provide version numbers for these dependencies (a gradient-computation sketch follows the table).
Experiment Setup | Yes | We use an 8-layer network with 512 neurons and a skip connection to the middle layer, but only the output dimension of the last layer is increased by six due to the auxiliary variables. For (13), we empirically set the loss coefficients to λ1 = 0.1, λ2 = 0.0001, λ3 = 0.0005, and λ4 = 0.1, and use p = ∞ in (7) for numerical simplicity. In all experiments, we use the Adam optimizer [32] with learning rate 10⁻³, decayed by 0.99 every 2000 iterations. (An illustrative architecture and optimizer sketch follows the table.)
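
The Dataset Splits row above notes that points are sampled at random during training rather than drawn from fixed splits. A minimal PyTorch sketch of such per-iteration sampling, under the assumption that the raw scan is stored as an (N, 3) tensor; the function name and batch size are illustrative, not taken from the paper:

    import torch

    def sample_batch(point_cloud: torch.Tensor, batch_size: int = 16384) -> torch.Tensor:
        # Draw a random subset of surface points for one training iteration.
        # No train/validation/test split is used; the whole cloud is resampled
        # each iteration (assumption consistent with the row above).
        idx = torch.randint(0, point_cloud.shape[0], (batch_size,))
        return point_cloud[idx]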
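
The Software Dependencies row mentions an autodifferentiation library (autograd) without version numbers. For reference, a hedged PyTorch sketch of how the spatial gradient of an implicit network is typically obtained with reverse-mode autodifferentiation; u_net and x are placeholders, not the authors' code:

    import torch

    def spatial_gradient(u_net: torch.nn.Module, x: torch.Tensor) -> torch.Tensor:
        # Compute grad_x u(x) for an implicit network via torch.autograd.
        x = x.requires_grad_(True)
        u = u_net(x)  # (B, 1) implicit value
        grad = torch.autograd.grad(
            u, x,
            grad_outputs=torch.ones_like(u),
            create_graph=True,  # keep the graph so gradient-based losses remain trainable
        )[0]
        return grad  # (B, 3)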
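
The Experiment Setup row specifies an 8-layer, 512-neuron MLP with a skip connection to the middle layer, an output widened by six auxiliary dimensions, and Adam at learning rate 10⁻³ decayed by 0.99 every 2000 iterations. A minimal PyTorch sketch consistent with that description; the activation choice and the exact skip wiring are assumptions, not taken from the paper:

    import torch
    import torch.nn as nn

    class ImplicitMLP(nn.Module):
        # 8 linear layers with 512 hidden units; the input is re-injected at the middle layer.
        # Output is 1 implicit value + 6 auxiliary channels, per the setup above.
        # The Softplus activation is an assumption, not confirmed by the paper.
        def __init__(self, in_dim: int = 3, hidden: int = 512, out_dim: int = 1 + 6):
            super().__init__()
            self.first = nn.Linear(in_dim, hidden)
            self.pre_skip = nn.ModuleList([nn.Linear(hidden, hidden) for _ in range(3)])
            self.skip = nn.Linear(hidden + in_dim, hidden)  # skip connection to the middle layer
            self.post_skip = nn.ModuleList([nn.Linear(hidden, hidden) for _ in range(2)])
            self.last = nn.Linear(hidden, out_dim)
            self.act = nn.Softplus(beta=100)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            h = self.act(self.first(x))
            for layer in self.pre_skip:
                h = self.act(layer(h))
            h = self.act(self.skip(torch.cat([h, x], dim=-1)))
            for layer in self.post_skip:
                h = self.act(layer(h))
            return self.last(h)

    model = ImplicitMLP()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    # Decay the learning rate by 0.99 every 2000 iterations, as stated in the setup.
    scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=2000, gamma=0.99)

A training loop would call scheduler.step() once per iteration so that the 2000-iteration decay interval is respected.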