Higher-Order Factorization Machines
Authors: Mathieu Blondel, Akinori Fujino, Naonori Ueda, Masakazu Ishihata
NeurIPS 2016
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | "We demonstrate the proposed approaches on four different link prediction tasks." (Section 6, Experimental results) |
| Researcher Affiliation | Collaboration | Mathieu Blondel, Akinori Fujino, Naonori Ueda NTT Communication Science Laboratories Japan Masakazu Ishihata Hokkaido University Japan |
| Pseudocode | Yes | Algorithm 1: Evaluating A^m(p, x) in O(dm); Algorithm 2: Computing A^m(p, x) in O(dm) |
| Open Source Code | No | The paper does not provide an explicit statement or a link to the open-source code for the methodology described. |
| Open Datasets | Yes | Table 2: Datasets used in our experiments. For a detailed description, c.f. Appendix A. NIPS [17] Enzyme [21] GD [10] Movielens 100K [6] |
| Dataset Splits | Yes | We split the n+ positive samples into 50% for training and 50% for testing. We sample the same number of negative samples as positive samples for training and use the rest for testing. |
| Hardware Specification | No | The paper mentions running experiments on the NIPS dataset and comparing solvers but does not provide specific details about the hardware (e.g., CPU, GPU models, memory) used for these experiments. |
| Software Dependencies | No | The paper mentions `libfm` and `TensorFlow` but does not specify version numbers for any software dependencies used in their experiments or implementation. |
| Experiment Setup | Yes | We chose β from {10^-6, 10^-5, ..., 10^6} by cross-validation and, following [9], we empirically set k = 30. Throughout our experiments, we initialized the elements of P randomly by N(0, 0.01). We set ℓ to be the squared loss. |
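The pseudocode row above refers to the paper's O(dm) dynamic-programming evaluation of the degree-m ANOVA kernel, A^m(p, x) = Σ_{j1 < … < jm} Π_t p_{jt} x_{jt}. Below is a minimal sketch of that standard recurrence; the function and variable names are illustrative, not the authors' implementation:

```python
import numpy as np

def anova_kernel(p, x, m):
    """Evaluate the degree-m ANOVA kernel A^m(p, x) in O(dm) time.

    Uses the dynamic-programming recurrence
        a_t <- a_t + p_j * x_j * a_{t-1},
    where a_t accumulates A^t over the features processed so far.
    """
    d = len(p)
    a = np.zeros(m + 1)
    a[0] = 1.0  # empty product: A^0 = 1
    for j in range(d):
        # Sweep degrees from high to low so feature j is used at most once.
        for t in range(min(j + 1, m), 0, -1):
            a[t] += p[j] * x[j] * a[t - 1]
    return a[m]
```

For example, with p = (1, 2, 3), x = (1, 1, 1), and m = 2, the kernel sums the pairwise products 1·2 + 1·3 + 2·3 = 11, which the recurrence reproduces without enumerating the pairs.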
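The experiment-setup row can be mirrored in a short configuration sketch. The feature count `d` and the random seed are illustrative, and N(0, 0.01) is interpreted here as a standard deviation of 0.01 (the paper's notation leaves variance vs. standard deviation ambiguous):

```python
import numpy as np

d, k = 100, 30  # k = 30 set empirically, per the paper; d is illustrative

# Regularization grid 10^-6, 10^-5, ..., 10^6, selected by cross-validation.
betas = [10.0 ** e for e in range(-6, 7)]

# Random initialization of the factor matrix P (std 0.01 is an assumption).
rng = np.random.default_rng(0)
P = rng.normal(loc=0.0, scale=0.01, size=(d, k))
```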