Duality-Induced Regularizer for Tensor Factorization Based Knowledge Graph Completion

Authors: Zhanqiu Zhang, Jianyu Cai, Jie Wang

NeurIPS 2020

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "In this section, we introduce the experimental settings in Section 5.1 and show the effectiveness of DURA in Section 5.2. We compare DURA to other regularizers in Section 5.3 and visualize the entity embeddings in Section 5.4. Finally, we analyze the sparsity induced by DURA in Section 5.5."
Researcher Affiliation | Academia | "University of Science and Technology of China; {zzq96,jycai}@mail.ustc.edu.cn, jiewangx@ustc.edu.cn"
Pseudocode | No | The paper describes mathematical formulations and derivations but does not include any pseudocode or algorithm blocks.
Open Source Code | Yes | The code of DURA is available on GitHub at https://github.com/MIRALab-USTC/KGE-DURA.
Open Datasets | Yes | "We consider three public knowledge graph datasets WN18RR [27], FB15k-237 [6], and YAGO3-10 [17] for the knowledge graph completion task, which have been divided into training, validation, and testing set in previous works."
Dataset Splits | Yes | "We consider three public knowledge graph datasets WN18RR [27], FB15k-237 [6], and YAGO3-10 [17] for the knowledge graph completion task, which have been divided into training, validation, and testing set in previous works. The statistics of these datasets are shown in Table 1."
Hardware Specification | No | The paper does not provide specific details about the hardware used for running experiments (e.g., GPU/CPU models, memory specifications).
Software Dependencies | No | The paper does not explicitly state specific software dependencies or their version numbers.
Experiment Setup | Yes | "We search λ in {0.005, 0.01, 0.05, 0.1, 0.5} and λ1, λ2 in {0.5, 1.0, 1.5, 2.0}. We reimplement CP, DistMult, ComplEx, and RESCAL using the reciprocal setting [15, 14]."
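For illustration, the hyperparameter search space quoted above can be enumerated with a short sketch. This is our own reconstruction, not the authors' search script; the names `hyperparameter_grid` and `configs` are hypothetical, and the paper does not specify how the grid is traversed or which model-training interface is used.

```python
# Minimal sketch of the grid described in the quoted setup:
# λ ∈ {0.005, 0.01, 0.05, 0.1, 0.5} and λ1, λ2 ∈ {0.5, 1.0, 1.5, 2.0}.
from itertools import product

LAMBDA_VALUES = [0.005, 0.01, 0.05, 0.1, 0.5]
LAMBDA_12_VALUES = [0.5, 1.0, 1.5, 2.0]

def hyperparameter_grid():
    """Yield every (lambda, lambda_1, lambda_2) combination in the search space."""
    yield from product(LAMBDA_VALUES, LAMBDA_12_VALUES, LAMBDA_12_VALUES)

configs = list(hyperparameter_grid())
# 5 * 4 * 4 = 80 candidate configurations in total
```

Each tuple would then be passed to a training run for the reimplemented models (CP, DistMult, ComplEx, RESCAL), with the best configuration selected on the validation split.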