Reducing the Rank in Relational Factorization Models by Including Observable Patterns
Authors: Maximilian Nickel, Xueyan Jiang, Volker Tresp
NeurIPS 2014
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We show experimentally both that the proposed additive model does improve the predictive performance over pure latent variable methods and that it also reduces the required rank and therefore runtime and memory complexity significantly. |
| Researcher Affiliation | Collaboration | ¹LCSL, Poggio Lab, Massachusetts Institute of Technology, Cambridge, MA, USA; ²Istituto Italiano di Tecnologia, Genova, Italy; ³Ludwig Maximilian University, Munich, Germany; ⁴Siemens AG, Corporate Technology, Munich, Germany |
| Pseudocode | No | The paper describes the computational steps and updates for the model using mathematical equations and prose, but it does not include a distinct pseudocode block or algorithm box. |
| Open Source Code | No | The paper does not provide any statement or link regarding the public availability of its source code. |
| Open Datasets | Yes | Links and references for the datasets used in the evaluation are provided in the supplementary material A.5. |
| Dataset Splits | Yes | Following [10, 11, 28, 21] we used k-fold cross-validation for the evaluation, partitioning the entries of the adjacency tensor into training, validation, and test sets. |
| Hardware Specification | Yes | We used the variant of Cora... on a moderate PC with Intel(R) Core i5 @3.1GHz, 4G RAM. |
| Software Dependencies | No | The paper does not provide specific software dependencies with version numbers. |
| Experiment Setup | Yes | The optimal rank 220 for RESCAL was determined out of the range [10, 300] via parameter selection. For ARE we used a significantly smaller rank 20. |
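
The dataset-split and experiment-setup rows above describe the paper's evaluation protocol: entries of the adjacency tensor are partitioned into training, validation, and test sets, and the factorization rank is selected via parameter search on the validation split. The following is a minimal sketch of that protocol under stated assumptions, not the authors' code: it uses a plain SVD of the unfolded tensor as a stand-in scoring model rather than the paper's RESCAL/ARE factorizations, and the tensor size, fold sizes, and rank grid are illustrative only.

```python
# Sketch of the quoted protocol: hide held-out adjacency-tensor entries during
# fitting, score them with a low-rank reconstruction, and pick the rank that
# maximizes validation AUC. The SVD-based scorer is a stand-in, not RESCAL/ARE.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n, m = 30, 4                                       # entities, relation slices (illustrative)
X = (rng.random((m, n, n)) < 0.1).astype(float)    # toy binary adjacency tensor

# Partition all (relation, i, j) entries into train / validation / test.
idx = np.array(list(np.ndindex(m, n, n)))
rng.shuffle(idx)
n_val = n_test = len(idx) // 10
val_idx, test_idx = idx[:n_val], idx[n_val:n_val + n_test]

def fit_and_score(rank, hidden_idx):
    """Fit a rank-`rank` model with the held-out entries zeroed, then
    return the AUC of the reconstructed scores on those entries."""
    X_train = X.copy()
    X_train[tuple(hidden_idx.T)] = 0.0             # hide held-out entries
    unfolded = X_train.reshape(m, n * n)           # mode-1 unfolding (stand-in model)
    U, s, Vt = np.linalg.svd(unfolded, full_matrices=False)
    k = min(rank, len(s))
    X_hat = (U[:, :k] * s[:k]) @ Vt[:k]            # rank-k reconstruction
    scores = X_hat.reshape(m, n, n)[tuple(hidden_idx.T)]
    labels = X[tuple(hidden_idx.T)]
    return roc_auc_score(labels, scores)

# Rank selection on the validation fold (grid is illustrative; the paper
# searches [10, 300] for RESCAL and fixes a much smaller rank for ARE).
ranks = [2, 3, 4]
best_rank = max(ranks, key=lambda r: fit_and_score(r, val_idx))
print("selected rank:", best_rank)
print("test AUC at selected rank:", fit_and_score(best_rank, test_idx))
```

In a faithful reproduction the `fit_and_score` stand-in would be replaced by the paper's factorization models, and the split above would be repeated over k folds as in the quoted cross-validation setup.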