Knowledge Graph Completion by Intermediate Variables Regularization

Authors: Changyi Xiao, Yixin Cao

NeurIPS 2024

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "Finally, we conduct experiments to verify the effectiveness of our regularization technique as well as the reliability of our theoretical analysis."
Researcher Affiliation | Academia | Changyi Xiao, Yixin Cao; School of Computer Science, Fudan University; changyi_xiao@fudan.edu.cn, caoyixin2011@gmail.com
Pseudocode | Yes | "Algorithm 1: A pseudocode for IVR" (see the regularizer sketch below)
Open Source Code | Yes | "The code is available at https://github.com/changyi7231/IVR."
Open Datasets | Yes | "We evaluate the models on three KGC datasets: WN18RR [Dettmers et al., 2018], FB15k-237 [Toutanova et al., 2015] and YAGO3-10 [Dettmers et al., 2018]."
Dataset Splits | Yes | "We use the filtered MRR and Hits@N (H@N) [Bordes et al., 2013] as evaluation metrics and choose the hyper-parameters with the best filtered MRR on the validation set." (see the metrics sketch below)
Hardware Specification | Yes | "The time is the AMD Ryzen 7 4800U CPU running time on the test set."
Software Dependencies | No | "We use Adagrad [Duchi et al., 2011] with learning rate 0.1 as the optimizer." Although the optimizer is named, no version numbers are given for it or for any other software library (e.g., Python, PyTorch/TensorFlow).
Experiment Setup | Yes | "We use Adagrad [Duchi et al., 2011] with learning rate 0.1 as the optimizer. We set the batch size to 100 for the WN18RR and FB15k-237 datasets and 1000 for the YAGO3-10 dataset. We train the models for 200 epochs. The settings for total embedding dimension D and number of parts P are shown in Table 5. The settings for power α and regularization coefficients λi are shown in Table 6." (see the training-loop sketch below)
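The Pseudocode row points to Algorithm 1, and the Experiment Setup row mentions a power α and coefficients λi for the regularizer. Purely as a hedged illustration, the sketch below attaches an intermediate-variables-style penalty to a CP scorer in PyTorch; the helper name cp_score_with_ivr and the particular choice of intermediate variables (the factor embeddings plus the Hadamard products h∘r and r∘t) are assumptions for illustration, not the paper's Algorithm 1.

```python
import torch

def cp_score_with_ivr(h, r, t, alpha, lambdas):
    """Hedged sketch: CP trilinear score plus an IVR-style penalty.

    h, r, t: (batch, dim) head/relation/tail embeddings.
    alpha:   norm power (Table 6's alpha).
    lambdas: one weight per regularized variable (Table 6's lambda_i).
    The set of intermediate variables below is assumed, not the paper's.
    """
    score = (h * r * t).sum(dim=-1)          # CP score <h, r, t>
    intermediates = (h, r, t, h * r, r * t)  # assumed intermediate variables
    reg = sum(lam * v.abs().pow(alpha).sum()
              for lam, v in zip(lambdas, intermediates))
    return score, reg
```

A training loss would then combine the two outputs, e.g. a cross-entropy term on the scores plus the returned penalty.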
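The Dataset Splits row quotes the filtered MRR and Hits@N protocol of Bordes et al. [2013]. The following is a minimal sketch of how these filtered metrics are conventionally computed for a single query; filtered_metrics is a hypothetical helper, not code from the IVR repository.

```python
import torch

def filtered_metrics(scores, target_idx, filter_mask, ns=(1, 3, 10)):
    """Filtered MRR and Hits@N for one (head, relation, ?) query.

    scores:      (num_entities,) scores for every candidate tail.
    target_idx:  index of the ground-truth tail.
    filter_mask: boolean mask over entities marking the *other* known
                 true tails, which the filtered setting excludes.
    """
    scores = scores.clone()
    scores[filter_mask] = float("-inf")   # drop other true triples
    # Optimistic rank: 1 + number of candidates scoring strictly higher.
    rank = (scores > scores[target_idx]).sum().item() + 1
    return {"MRR": 1.0 / rank,
            **{f"H@{n}": float(rank <= n) for n in ns}}
```

Averaging these per-query values over the test set gives the reported MRR and H@N figures.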
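The Experiment Setup row pins down the optimizer, batch sizes, and epoch count. Below is a self-contained sketch of that training loop, assuming a PyTorch implementation; the toy data, the bare embedding model, and the plain cross-entropy objective are placeholders rather than the paper's model and loss.

```python
import torch

# Reported settings: Adagrad [Duchi et al., 2011], lr = 0.1, 200 epochs,
# batch size 100 (WN18RR, FB15k-237) or 1000 (YAGO3-10).
num_entities, dim, batch_size = 1000, 64, 100     # toy sizes, not Table 5
emb = torch.nn.Embedding(num_entities, dim)
optimizer = torch.optim.Adagrad(emb.parameters(), lr=0.1)

triples = torch.randint(num_entities, (5000, 2))  # toy (head, tail) pairs
for epoch in range(200):
    for i in range(0, len(triples), batch_size):
        head, tail = triples[i:i + batch_size].unbind(dim=1)
        scores = emb(head) @ emb.weight.t()       # score all candidate tails
        loss = torch.nn.functional.cross_entropy(scores, tail)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```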