Multi-view Knowledge Graph Embedding for Entity Alignment
Authors: Qingheng Zhang, Zequn Sun, Wei Hu, Muhao Chen, Lingbing Guo, Yuzhong Qu
IJCAI 2019
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Our experiments on real-world datasets show that the proposed framework significantly outperforms the state-of-the-art embedding-based entity alignment methods. |
| Researcher Affiliation | Academia | State Key Laboratory for Novel Software Technology, Nanjing University, China; Department of Computer Science, University of California, Los Angeles, USA |
| Pseudocode | Yes | Algorithm 1: Combined training process of MultiKE |
| Open Source Code | Yes | The source code is accessible online: https://github.com/nju-websoft/MultiKE |
| Open Datasets | Yes | In our experiments, we reused two datasets, namely DBP-WD and DBP-YG, recently proposed in [Sun et al., 2018]. |
| Dataset Splits | No | The paper states '30% reference entity alignment as seed and leaves the remaining for evaluating entity alignment performance' but does not specify a distinct validation set (see the split sketch below the table). |
| Hardware Specification | No | The paper does not provide any specific details about the hardware used for the experiments. |
| Software Dependencies | No | The paper does not provide specific version numbers for software dependencies or libraries used for the experiments. |
| Experiment Setup | Yes | The following hyper-parameters were used in the experiments. Each training took Q = 200 epochs with learning rate 0.001. For the relation view embedding, 10 negative facts were sampled for each real relation fact. For the attribute view embedding, the number of filters was 2 and the convolution kernel size was 2 × 4 (i.e., c = 4). The activation function for the autoencoder and CNN was tanh(·). For the relation and attribute identity inference, we set α1 = 0.6, α2 = 0.4 and η = 0.9. The embedding dimension d was set to 75 for all the comparative methods. These values are collected in the configuration sketch below. |
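
The paper reports using 30% of the reference entity alignment as seed and the remaining 70% for evaluation, with no separate validation set. The following is a minimal sketch of how such a split could be reproduced; the function name, file format, and random seed are illustrative assumptions, not details from the paper or the MultiKE codebase.

```python
import random

def split_reference_alignment(links, seed_ratio=0.3, seed=2019):
    """Split reference entity alignment pairs into seed (training) and test links.

    `links` is a list of (entity1, entity2) pairs. The paper uses 30% of the
    reference alignment as seed and the remaining 70% for evaluation; no
    distinct validation set is mentioned.
    """
    rng = random.Random(seed)          # fixed seed only for reproducibility of this sketch
    shuffled = links[:]                # copy so the caller's list is untouched
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * seed_ratio)
    return shuffled[:cut], shuffled[cut:]   # (seed_links, test_links)


# Toy usage; the real datasets are DBP-WD and DBP-YG from [Sun et al., 2018].
if __name__ == "__main__":
    toy_links = [(f"dbp:e{i}", f"wd:Q{i}") for i in range(10)]
    seed_links, test_links = split_reference_alignment(toy_links)
    print(len(seed_links), "seed /", len(test_links), "test")
```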
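
The hyper-parameters quoted in the Experiment Setup row can be gathered into a single configuration object. The sketch below only records the values reported in the paper; the class and field names are chosen for illustration and are not the identifiers used in the MultiKE source code.

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class MultiKEConfig:
    # Values quoted from the paper's experiment setup.
    epochs: int = 200                 # Q = 200 training epochs
    learning_rate: float = 0.001
    neg_samples: int = 10             # negative facts per real relation fact (relation view)
    cnn_filters: int = 2              # attribute view CNN
    cnn_kernel_size: tuple = (2, 4)   # 2 x 4 convolution kernel, i.e. c = 4
    activation: str = "tanh"          # activation for the autoencoder and CNN
    alpha1: float = 0.6               # relation identity inference weight
    alpha2: float = 0.4               # attribute identity inference weight
    eta: float = 0.9                  # threshold for identity inference
    embedding_dim: int = 75           # d, shared by all comparative methods

config = MultiKEConfig()
print(config)
```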