Flatten Anything: Unsupervised Neural Surface Parameterization

Authors: Qijian Zhang, Junhui Hou, Wenping Wang, Ying He

NeurIPS 2024 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Extensive experiments demonstrate the universality, superiority, and inspiring potential of our proposed neural surface parameterization paradigm. Comprehensive experiments demonstrate the advantages of our approach over traditional state-of-the-art approaches.
Researcher Affiliation | Academia | Qijian Zhang (1), Junhui Hou (1), Wenping Wang (2), Ying He (3). (1) Department of Computer Science, City University of Hong Kong, Hong Kong SAR, China; (2) Department of Computer Science and Engineering, Texas A&M University, Texas, USA; (3) School of Computer Science and Engineering, Nanyang Technological University, Singapore.
Pseudocode | No | The paper describes network architectures and mathematical formulations (Eqns. (1)-(7)) but does not contain a structured pseudocode or algorithm block.
Open Source Code | Yes | Our code is available at https://github.com/keeganhk/FlattenAnything.
Open Datasets | No | "We collected a series of 3D surface models with different types and complexities of geometric and/or topological structures for experimental evaluation." The paper does not provide concrete access information (link, DOI, or specific repository) for a public dataset, nor formal citations for the specific models used.
Dataset Splits | No | The paper does not explicitly provide dataset split information for training, validation, or testing. It only states: "For the training of our FAM, we uniformly sample 10,000 points from the original mesh vertices at each optimization iteration."
Hardware Specification | Yes | All our experiments are conducted on a single NVIDIA GeForce RTX 3090 GPU. The official code of SLIM runs on an Intel(R) Core(TM) i7-9700 CPU.
Software Dependencies | No | The paper describes network architectures and activation functions (e.g., Leaky ReLU) but does not provide specific software dependencies with version numbers (e.g., PyTorch 1.9, Python 3.8).
Experiment Setup | Yes | For cutting-seam extraction (Eqn. (8)), K_cut = 3 is chosen. With L(Q) denoting the side length of the square bounding box of the 2D UV coordinates Q at the current training iteration, the threshold T_cut is set to 2% of L(Q). For the unwrapping loss (Eqn. (9)), the threshold is chosen as ϵ = 0.2 L(Q)/N, with K_u = 8. In the overall training objective, the weights for ℓ_unwrap, ℓ_wrap, ℓ_cycle, and ℓ_conf are set to 0.01, 1.0, 0.01, and 0.01, respectively (a configuration sketch follows the table).
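
For reference, the reported hyperparameters and loss weighting can be collected into a short configuration sketch. This is a minimal, hypothetical illustration rather than the authors' implementation: the names (K_CUT, bbox_side_length, thresholds, total_loss, etc.) are assumptions, and the four loss terms are placeholders standing in for the quantities defined in Eqns. (1)-(9) of the paper.

```python
# Hypothetical sketch of the quoted "Experiment Setup" values; not the official code.
import torch

# Values quoted from the paper.
K_CUT = 3            # neighborhood size for cutting-seam extraction (Eqn. 8)
K_U = 8              # neighborhood size used in the unwrapping loss (Eqn. 9)
N_SAMPLES = 10_000   # points uniformly sampled from the mesh vertices per iteration

# Loss weights for l_unwrap, l_wrap, l_cycle, l_conf in the overall objective.
W_UNWRAP, W_WRAP, W_CYCLE, W_CONF = 0.01, 1.0, 0.01, 0.01


def bbox_side_length(uv: torch.Tensor) -> float:
    """L(Q): side length of the square bounding box of the 2D UV coordinates Q."""
    extent = uv.max(dim=0).values - uv.min(dim=0).values
    return float(extent.max())


def thresholds(uv: torch.Tensor, n_points: int = N_SAMPLES) -> tuple[float, float]:
    """T_cut = 2% of L(Q); eps = 0.2 * L(Q) / N, following the quoted setup text."""
    side = bbox_side_length(uv)
    return 0.02 * side, 0.2 * side / n_points


def total_loss(l_unwrap: torch.Tensor, l_wrap: torch.Tensor,
               l_cycle: torch.Tensor, l_conf: torch.Tensor) -> torch.Tensor:
    """Overall training objective as the reported weighted sum of the four terms."""
    return (W_UNWRAP * l_unwrap + W_WRAP * l_wrap
            + W_CYCLE * l_cycle + W_CONF * l_conf)
```

Note that eps is computed exactly as printed in the quoted setup text (0.2 L(Q)/N); consult the paper for the precise definitions of the individual loss terms and thresholds.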