Learning Graph Representations with Embedding Propagation

Authors: Alberto García-Durán, Mathias Niepert

NeurIPS 2017 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility variables, results, and supporting excerpts (LLM responses):
Research Type: Experimental. Evidence: "We evaluate EP with the following six commonly used benchmark data sets. ... Using the node classification data sets, we compare the performance of EP-B to the state of the art approaches..."
Researcher Affiliation: Industry. Evidence: "Alberto García-Durán, NEC Labs Europe, Heidelberg, Germany, alberto.duran@neclab.eu; Mathias Niepert, NEC Labs Europe, Heidelberg, Germany, mathias.niepert@neclab.eu"
Pseudocode: No. No pseudocode or algorithm blocks were found in the paper.
Open Source Code: No. The paper neither states that the authors' implementation is open source nor provides a link to it.
Open Datasets: Yes. Evidence: "We evaluate EP with the following six commonly used benchmark data sets. BlogCatalog [46]... PPI [6]... POS [28]... Cora, Citeseer and Pubmed [40]..."
Dataset Splits: Yes. Evidence: "For the graphs with attributes (Cora, Citeseer, Pubmed) we follow the same experimental procedure as in previous work [45]. We sample 20 nodes uniformly at random for each class as training data, 1000 nodes as test data, and a different 1000 nodes as validation data."
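The quoted protocol is the standard Planetoid-style split [45]. Below is a minimal sketch of such a split; the function name and arguments are our own, assuming a dense integer label array, not the authors' code:

```python
import numpy as np

def planetoid_style_split(labels, n_per_class=20, n_test=1000, n_val=1000, seed=0):
    """Sample the split described in the paper: 20 nodes per class for
    training, 1000 test nodes, and a disjoint 1000 validation nodes,
    all drawn uniformly at random."""
    rng = np.random.default_rng(seed)
    # 20 training nodes per class, sampled without replacement.
    train_idx = np.concatenate([
        rng.choice(np.flatnonzero(labels == c), size=n_per_class, replace=False)
        for c in np.unique(labels)
    ])
    # Test and validation nodes are drawn from the remaining nodes,
    # kept disjoint from each other ("a different 1000 nodes").
    remaining = rng.permutation(np.setdiff1d(np.arange(len(labels)), train_idx))
    return train_idx, remaining[:n_test], remaining[n_test:n_test + n_val]
```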
Hardware Specification: Yes. Evidence: "All experiments were run on commodity hardware with 128GB RAM, a single 2.8 GHz CPU, and a Titan X GPU."
Software Dependencies: No. Evidence: "EP was implemented with the Theano [4] wrapper Keras [9]. We used the logistic regression classifier from LibLinear [10]." Specific version numbers for Theano, Keras, or LibLinear are not provided.
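The paper evaluates embeddings with a LibLinear logistic regression classifier. A hedged sketch of that evaluation step follows; it uses scikit-learn's "liblinear" solver, which wraps the same LIBLINEAR library but is not what the authors used, and the random arrays are placeholders for the learned 128-dimensional node embeddings:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Placeholder data: e.g. 20 training nodes per class for 7 classes (as in Cora),
# 1000 validation nodes, 128-dimensional embeddings as reported in the paper.
rng = np.random.default_rng(0)
X_train, y_train = rng.normal(size=(140, 128)), rng.integers(0, 7, size=140)
X_val, y_val = rng.normal(size=(1000, 128)), rng.integers(0, 7, size=1000)

# Logistic regression backed by LIBLINEAR (one-vs-rest for multiclass).
clf = LogisticRegression(solver="liblinear")
clf.fit(X_train, y_train)
print("validation accuracy:", clf.score(X_val, y_val))
```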
Experiment Setup: Yes. Evidence: "The dimension of the embeddings is always fixed to 128. For EP-B, we chose the margin γ in (3) from the set of values [1, 5, 10, 20] on validation data. For EP-B we used ADAM [17] to learn the parameters in a mini-batch setting with a learning rate of 0.001. A single learning epoch iterates through all nodes of the input graph and we fixed the number of epochs to 200 and the mini-batch size to 64."
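Collecting the quoted hyperparameters, the sketch below shows the reported configuration and the validation-based margin selection. All names are illustrative, and the margin-based form of the loss in Eq. (3) is an assumption based on the role of γ, not a quotation from the paper:

```python
import numpy as np

# Hyperparameters as reported in the experiment setup.
CONFIG = {
    "embedding_dim": 128,   # fixed in all experiments
    "learning_rate": 1e-3,  # ADAM
    "epochs": 200,          # one epoch = one pass over all nodes
    "batch_size": 64,
}
MARGIN_GRID = [1, 5, 10, 20]  # candidate values for gamma in Eq. (3)

def margin_ranking_loss(d_pos, d_neg, gamma):
    """Margin-based ranking loss max(0, gamma + d_pos - d_neg), averaged
    over a mini-batch; the exact form of Eq. (3) is assumed here."""
    return np.maximum(0.0, gamma + d_pos - d_neg).mean()

def select_margin(train_fn, val_score_fn, margins=MARGIN_GRID):
    """Pick gamma by validation score, as described in the setup.
    train_fn(gamma) -> model; val_score_fn(model) -> validation score."""
    best_gamma, best_score = None, -np.inf
    for gamma in margins:
        score = val_score_fn(train_fn(gamma))
        if score > best_score:
            best_gamma, best_score = gamma, score
    return best_gamma
```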