Differentiable learning of numerical rules in knowledge graphs
Authors: Po-Wei Wang, Daria Stepanova, Csaba Domokos, J. Zico Kolter
ICLR 2020
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | In this section we report the results of our experimental evaluation, which focuses on the effectiveness of our method against state-of-the-art rule learning systems with respect to the predictive quality of the learned rules. Specifically, we conduct experiments on a canonical knowledge graph completion task as described in Yang et al. (2017). |
| Researcher Affiliation | Collaboration | Po-Wei Wang (Machine Learning Department, Carnegie Mellon University and Bosch Center for AI, poweiw@cs.cmu.edu); Daria Stepanova (Bosch Center for AI, daria.stepanova@de.bosch.com); Csaba Domokos (Bosch Center for AI, csaba.domokos@de.bosch.com); Zico Kolter (Department of Computer Science, Carnegie Mellon University and Bosch Center for AI, zkolter@cs.cmu.edu) |
| Pseudocode | No | The paper does not contain any structured pseudocode or algorithm blocks. |
| Open Source Code | No | The paper mentions that 'We have implemented our approach for learning numerical rules from knowledge graphs in Python using the PyTorch library' but does not provide any link or explicit statement about making their own code open-source or publicly available. It only provides a link to a baseline's code. |
| Open Datasets | Yes | FB15K-237-num is a variant of the Freebase knowledge graph with numerical values, where the reverse relations have been removed (García-Durán & Niepert, 2018). DBPedia15K is a fragment of the DBPedia knowledge graph (Lehmann et al., 2015) restricted to numerical facts (García-Durán & Niepert, 2018). |
| Dataset Splits | Yes | We use 80% of the KG as the training set, 10% for the test set, and the same for validation. (A hedged split sketch appears after this table.) |
| Hardware Specification | Yes | We have implemented our approach for learning numerical rules from knowledge graphs in Python using the PyTorch library, and conducted all experiments on a machine with a GTX 1080 Ti GPU with 11 GB RAM. |
| Software Dependencies | No | The paper states, 'We have implemented our approach for learning numerical rules from knowledge graphs in Python using the PyTorch library,' but does not provide specific version numbers for Python or PyTorch. |
| Experiment Setup | Yes | The only difference between the parameters of Neural-LP and our system is that we set the learning rate to 10⁻², while in Yang et al. (2017) it is set to 10⁻³; both systems are run to convergence, and this learning rate does not materially affect the final performance beyond making convergence faster. In all cases, we extracted rules with a maximum length of 5. (A hedged configuration sketch appears after this table.) |
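
The dataset-splits row reports an 80% / 10% / 10% partition of the KG into training, test, and validation sets. The paper does not describe the splitting procedure itself, so the following is only a minimal sketch under assumptions: the shuffling step, the fixed seed, and the triple format are choices made here for illustration, not the authors' code.

```python
# Hedged sketch of an 80/10/10 train/validation/test split over KG triples.
# The shuffling strategy and fixed seed are assumptions, not the authors' code.
import random

def split_kg(triples, seed=0):
    """Split (head, relation, tail) triples into 80% train, 10% validation, 10% test."""
    triples = list(triples)
    random.Random(seed).shuffle(triples)
    n = len(triples)
    n_train = int(0.8 * n)
    n_valid = int(0.1 * n)
    return (triples[:n_train],
            triples[n_train:n_train + n_valid],
            triples[n_train + n_valid:])

# Example usage; real triples would come from FB15K-237-num or DBPedia15K.
train, valid, test = split_kg([("e1", "r", "e2")] * 100)
```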
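
The experiment-setup row reports only a learning rate of 10⁻² and a maximum rule length of 5. Below is a hedged PyTorch sketch of such a configuration: the placeholder model (`nn.Linear`), the choice of the Adam optimizer, the dummy data, and the fixed epoch count are all assumptions, since the paper only states that both systems are run to convergence.

```python
# Hedged sketch of the reported hyperparameters (lr = 1e-2, max rule length 5).
# The placeholder model, Adam optimizer, and epoch count are assumptions.
import torch
import torch.nn as nn

MAX_RULE_LENGTH = 5   # maximum length of extracted rules, as stated in the paper
LEARNING_RATE = 1e-2  # paper uses 1e-2; Yang et al. (2017) use 1e-3

model = nn.Linear(10, 1)  # placeholder standing in for the differentiable rule learner
optimizer = torch.optim.Adam(model.parameters(), lr=LEARNING_RATE)

x, y = torch.randn(32, 10), torch.randn(32, 1)  # dummy batch, just to make the loop runnable
for epoch in range(100):                        # in the paper, training runs to convergence
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(model(x), y)
    loss.backward()
    optimizer.step()
```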