Scalable Model Selection for Large-Scale Factorial Relational Models

Authors: Chunchen Liu, Lu Feng, Ryohei Fujimaki, Yusuke Muraoka

ICML 2015

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "Empirical results show the superiority of sFAB/BMF in both accuracy and scalability over state-of-the-art inference methods for overlapping relational models." and Section 5, Experiments.
Researcher Affiliation | Industry | Chunchen Liu (LIU CHUNCHEN@NEC.CN) and Lu Feng (FENG LU@NEC.CN), NEC Laboratories China; Ryohei Fujimaki (RFUJIMAKI@NEC-LABS.COM) and Yusuke Muraoka (YMURAOKA@NEC-LABS.COM), NEC Knowledge Discovery Research Laboratories.
Pseudocode | Yes | "Algorithm 1: stochastic FAB for BMFs"
Open Source Code | No | Explanation: The paper states that the authors implemented their own methods in C++ and MATLAB but does not provide a statement of release or a link to their source code. It only links to external baseline implementations.
Open Datasets | Yes | "Further, eight real network datasets (namely, Zachary's Karate Club (Zachary, 1977), summer school survey network, U.S. Political Books, NIPS coauthorship network (Globerson et al., 2007), Facebook, autonomous systems, the collaboration network of Arxiv Astro Physics (Astro-Ph for short), and Arxiv High Energy Physics (Hep-Ph for short)) with different scales (node numbers ranged from 34 to 10K) were used to evaluate sFAB/BMF and FAB/BMF." and footnote 7: http://snap.stanford.edu/data
Dataset Splits | Yes | "In the following experiments, we utilized 10-fold cross validation, each time holding out a different 10% of the data (links and non-links)."
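The quoted protocol holds out dyads (links and non-links) rather than nodes. A minimal sketch of such a split is below; the function name and the symmetric-network assumption are illustrative, not taken from the authors' code.

```python
# Hypothetical sketch of 10-fold cross-validation over the dyads
# (links and non-links) of an undirected network, each fold holding
# out a different 10% of the node pairs.
import numpy as np

def ten_fold_dyad_splits(n_nodes, seed=0):
    """Yield (train_mask, test_mask) boolean matrices for 10 folds."""
    rng = np.random.default_rng(seed)
    iu = np.triu_indices(n_nodes, k=1)       # each dyad listed once
    order = rng.permutation(len(iu[0]))
    for fold in np.array_split(order, 10):
        test_mask = np.zeros((n_nodes, n_nodes), dtype=bool)
        test_mask[iu[0][fold], iu[1][fold]] = True
        test_mask |= test_mask.T             # mirror: network is symmetric
        train_mask = ~test_mask
        np.fill_diagonal(train_mask, False)  # ignore self-loops
        yield train_mask, test_mask
```

Each dyad appears in exactly one test fold, so accuracy can be averaged over the ten held-out 10% portions as in the paper's setup.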
Hardware Specification | No | Explanation: The paper does not provide any specific details about the hardware used for running the experiments, such as CPU/GPU models, memory, or cloud computing specifications.
Software Dependencies | No | Explanation: The paper mentions programming languages (C++, MATLAB) and specific software packages (SVINET, ILA) but does not provide version numbers for these software dependencies.
Experiment Setup | Yes | "In all the simulation experiments below, we set the initial K as 40 for sFAB/BMF and FAB/BMF, and performed inference with K = 2, ..., 40 for VB/BMF and SVINET." and "The learning rate for sFAB/BMF was set to 0.5 for small datasets (N < 1000) and 0.2 for large datasets (N ≥ 1000), in consideration of the balance between accuracy and efficiency." and "SVINET set the mini-batch to the entire set of links and used a learning rate of 1." and "We followed the experimental setting of the original paper of ILA (Palla et al., 2012) and ran 500 MCMC iterations for ILA and 1000 iterations for MCMC/BMF. sFAB/BMF, FAB/BMF, and SVINET all employed δ = 1 × 10⁻⁵ as the optimization tolerance."
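The reported sFAB/BMF settings can be summarized as a small configuration rule. This sketch only encodes the numbers quoted above; the function name and dictionary keys are assumptions, not the authors' actual configuration interface.

```python
# Illustrative encoding of the quoted sFAB/BMF settings: initial K = 40,
# a size-dependent learning rate, and optimization tolerance delta = 1e-5.
def sfab_settings(n_nodes):
    """Return the reported hyperparameters for a network of n_nodes nodes."""
    return {
        "K_init": 40,                                   # initial number of clusters
        "learning_rate": 0.5 if n_nodes < 1000 else 0.2,  # small vs. large datasets
        "tolerance": 1e-5,                              # optimization tolerance delta
    }
```

For example, the 34-node Karate Club network would use learning rate 0.5, while the ~10K-node Arxiv networks would use 0.2.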