Analogical Inference Enhanced Knowledge Graph Embedding

Authors: Zhen Yao, Wen Zhang, Mingyang Chen, Yufeng Huang, Yi Yang, Huajun Chen

AAAI 2023

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Through extensive experiments on the FB15k-237 and WN18RR datasets, we show that AnKGE achieves competitive results on the link prediction task and performs analogical inference well.
Researcher Affiliation | Collaboration | Zhen Yao^1*, Wen Zhang^1*, Mingyang Chen^2, Yufeng Huang^1, Yi Yang^4, Huajun Chen^2,3,5; ^1 School of Software Technology, Zhejiang University; ^2 College of Computer Science and Technology, Zhejiang University; ^3 Donghai Laboratory, Zhoushan 316021, China; ^4 Huawei Technologies Co., Ltd; ^5 Alibaba-Zhejiang University Joint Institute of Frontier Technologies
Pseudocode | No | The paper describes procedures and functions but does not include explicitly labeled pseudocode or algorithm blocks.
Open Source Code | Yes | Our code is available at https://github.com/zjukg/AnKGE
Open Datasets | Yes | We conduct experiments on the link prediction task on two well-known benchmarks: WN18RR and FB15k-237. WN18RR and FB15k-237 are subsets of WN18 and FB15k, respectively.
Dataset Splits | Yes | Following the filtered setting protocol, we exclude the other true triples appearing in the train, valid, and test datasets.
Hardware Specification | No | The paper does not specify the hardware used for experiments, such as particular CPU or GPU models.
Software Dependencies | No | The paper mentions 'Python' in Appendix D but does not provide specific version numbers for Python or any libraries, frameworks, or solvers used.
Experiment Setup | Yes | We search the number of analogy objects of the three levels N_e, N_r, and N_t in {1, 3, 5, 10, 20}, the basic weights of the three levels α_E, α_R, and α_T in {0.01, 0.05, 0.1, 0.2, 0.3}, and the learning rate in {1e-3, 1e-4, 1e-5}. The loss function weight γ in Equation (10) is set to 10, and the transformation matrix weight λ in Equation (5) is set to 1 for FB15k-237 and 0 for WN18RR.
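The search space quoted in the Experiment Setup row can be enumerated directly. The sketch below is a minimal illustration of such a grid search loop, not the authors' released code; the train_and_evaluate function and the configuration keys are hypothetical, and a single value is shared across the three levels to keep the example short.

```python
from itertools import product

# Search space quoted from the Experiment Setup row (AnKGE, AAAI 2023).
analogy_objects = [1, 3, 5, 10, 20]            # N_e, N_r, N_t (entity/relation/triple levels)
basic_weights = [0.01, 0.05, 0.1, 0.2, 0.3]    # alpha_E, alpha_R, alpha_T
learning_rates = [1e-3, 1e-4, 1e-5]

# Fixed settings reported in the paper: gamma = 10 (loss weight, Eq. 10);
# lambda = 1 on FB15k-237 and 0 on WN18RR (transformation matrix weight, Eq. 5).
fixed = {"gamma": 10, "lambda": {"FB15k-237": 1, "WN18RR": 0}}

def train_and_evaluate(config):
    """Hypothetical stand-in for a full AnKGE training + link-prediction run.

    Replace with an actual training loop; here it only returns a placeholder MRR
    so the sketch runs end to end.
    """
    return 0.0

best = None
for n, alpha, lr in product(analogy_objects, basic_weights, learning_rates):
    config = {
        "num_analogy_objects": n,   # shared across N_e, N_r, N_t in this sketch
        "basic_weight": alpha,      # shared across alpha_E, alpha_R, alpha_T
        "learning_rate": lr,
        **fixed,
    }
    mrr = train_and_evaluate(config)
    if best is None or mrr > best[0]:
        best = (mrr, config)

print(best)
```

In the paper the per-level counts and weights are tuned independently, so a faithful search would iterate over each of N_e, N_r, N_t and α_E, α_R, α_T separately, enlarging the grid accordingly.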