Assembly Fuzzy Representation on Hypergraph for Open-Set 3D Object Retrieval

Authors: Yang Xu, Yifan Feng, Jun Zhang, Jun-Hai Yong, Yue Gao

NeurIPS 2024

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Extensive experiments and ablation studies on these three benchmarks show our method outperforms current state-of-the-art methods.
Researcher Affiliation | Collaboration | Yang Xu¹, Yifan Feng¹, Jun Zhang², Jun-Hai Yong¹, and Yue Gao¹ (¹BNRist, THUIBCS, KLISS, BLBCI, School of Software, Tsinghua University, China; ²Tencent AI Lab)
Pseudocode | Yes | Algorithm 1: Training the IAE module; Algorithm 2: Training the SFR module. (A hedged two-stage training skeleton follows this table.)
Open Source Code | No | We are in the preparation of the codes, models, and datasets, and they will be released at the earliest opportunity.
Open Datasets | Yes | We construct three datasets for assembly-based open-set 3D object retrieval (Open Part datasets), including OP-SHNP, OP-INTRA, and OP-COSEG, based on the public datasets ShapeNet Part [42], IntrA [41], and COSEG [34].
Dataset Splits | No | The paper specifies training and testing sets, but does not explicitly mention or provide details for a separate validation split.
Hardware Specification | Yes | Our experiments were conducted on a Tesla V100-32G GPU and an Intel(R) Xeon(R) Silver 4210 CPU @ 2.20GHz.
Software Dependencies | No | The paper mentions "PointNet [28]" for feature extraction, but does not provide specific version numbers for any software, libraries, or frameworks used in the implementation.
Experiment Setup | Yes | We set α = 0.5 in L_IAE and β = 0.9 in L_SFR. The IAE is trained for 40 epochs with learning rate lr = 0.1, and the SFR is trained for 30 epochs with lr = 0.001. The hyper-parameter k in the SFR module is set to 20, 6, and 40 for OP-SHNP, OP-INTRA, and OP-COSEG, respectively. Table 8 details the Optimizer (SGD), Learning Rate, Momentum, Weight Decay, LR Scheduler (Cosine Annealing), T_max, eta_min, and Max Epochs for both IAE and SFR. (A hedged configuration sketch follows this table.)
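
The reported training setup can be collected into a single configuration for re-implementation. The sketch below is a consolidation under stated assumptions: the dictionary layout and key names are ours, and the momentum, weight-decay, T_max, and eta_min entries are placeholders because their values are not quoted above; only α, β, the learning rates, epoch counts, optimizer, scheduler, and per-dataset k come from the reported setup.

```python
# Hypothetical consolidation of the hyper-parameters quoted above.
# Key names and structure are assumptions; None marks values listed in the
# paper's Table 8 but not quoted in this summary.
CONFIG = {
    "IAE": {
        "alpha": 0.5,                     # weight in L_IAE (reported)
        "optimizer": "SGD",
        "lr": 0.1,
        "lr_scheduler": "CosineAnnealing",
        "max_epochs": 40,
        "momentum": None,                 # in Table 8, value not quoted here
        "weight_decay": None,             # in Table 8, value not quoted here
        "T_max": None,                    # in Table 8, value not quoted here
        "eta_min": None,                  # in Table 8, value not quoted here
    },
    "SFR": {
        "beta": 0.9,                      # weight in L_SFR (reported)
        "optimizer": "SGD",
        "lr": 0.001,
        "lr_scheduler": "CosineAnnealing",
        "max_epochs": 30,
        "momentum": None,
        "weight_decay": None,
        "T_max": None,
        "eta_min": None,
    },
    # k used by the SFR module, per dataset (reported)
    "k": {"OP-SHNP": 20, "OP-INTRA": 6, "OP-COSEG": 40},
}
```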
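
For the two training algorithms named in the Pseudocode row (Algorithm 1 for IAE, Algorithm 2 for SFR), a minimal PyTorch skeleton of the reported optimization setup (SGD with a cosine-annealing schedule, 40 epochs at lr = 0.1 and 30 epochs at lr = 0.001) might look like the following. The models, losses, and data here are stand-ins (nn.Linear, MSELoss, random tensors), not the authors' IAE/SFR modules, and momentum, weight decay, and eta_min are left at PyTorch defaults since their values are not quoted above.

```python
# Minimal two-stage training skeleton under the assumptions stated above.
import torch
from torch import nn


def train_stage(model, loss_fn, loader, lr, max_epochs):
    # SGD + cosine-annealing LR, as reported in Table 8; momentum, weight
    # decay, and eta_min are left at PyTorch defaults (values not quoted here).
    optimizer = torch.optim.SGD(model.parameters(), lr=lr)
    scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=max_epochs)
    for _ in range(max_epochs):
        for inputs, targets in loader:
            optimizer.zero_grad()
            loss = loss_fn(model(inputs), targets)
            loss.backward()
            optimizer.step()
        scheduler.step()  # one scheduler step per epoch
    return model


if __name__ == "__main__":
    # Stand-in data and modules; the real IAE/SFR architectures and losses
    # are defined by the paper's Algorithms 1 and 2, not reproduced here.
    dummy_loader = [(torch.randn(4, 16), torch.randn(4, 8))]
    iae = train_stage(nn.Linear(16, 8), nn.MSELoss(), dummy_loader, lr=0.1, max_epochs=40)
    sfr = train_stage(nn.Linear(16, 8), nn.MSELoss(), dummy_loader, lr=0.001, max_epochs=30)
```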