Node Embedding over Temporal Graphs
Authors: Uriel Singer, Ido Guy, Kira Radinsky
IJCAI 2019
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We evaluate the effectiveness of our approach over a variety of temporal graphs for the two fundamental tasks of temporal link prediction and multi-label node classification, comparing to competitive baselines and algorithmic alternatives. Our algorithm shows performance improvements across many of the datasets and baselines and is found particularly effective for graphs that are less cohesive, with a lower clustering coefficient. |
| Researcher Affiliation | Collaboration | Uriel Singer¹, Ido Guy², and Kira Radinsky¹ — ¹Technion, Israel Institute of Technology; ²eBay Research |
| Pseudocode | No | The paper provides architectural diagrams (e.g., Figure 1) but does not include any clearly labeled pseudocode or algorithm blocks with structured steps. |
| Open Source Code | Yes | We publicly publish our code and data: https://github.com/urielsinger/tNodeEmbed |
| Open Datasets | Yes | We publicly release this new temporal graph. |
| Dataset Splits | No | For link prediction, the first six datasets in Table 1 were used; for node classification, the remaining two datasets, Cora and DBLP, which include node labels, were used. For temporal link prediction, the data were divided into train and test by selecting a pivot time such that 80% of the edges in the graph (or the closest possible portion) have a timestamp earlier than or equal to the pivot. For multi-label node classification, the dataset was randomly split so that 80% of the nodes are used for training. No explicit mention of a validation set split is provided. |
| Hardware Specification | No | The paper does not provide specific details about the hardware used for running the experiments, such as GPU models, CPU types, or memory specifications. |
| Software Dependencies | No | The paper mentions software like 'node2vec' and utilizes 'LSTM' within its architecture but does not provide specific version numbers for any software dependencies or libraries used in the experiments. |
| Experiment Setup | No | The paper describes the overall framework and model architecture but does not provide specific experimental setup details such as hyperparameter values (e.g., learning rate, batch size, number of epochs) or specific optimizer settings. |
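The pivot-time split described in the Dataset Splits row can be sketched as follows. This is a hypothetical illustration, not code from the authors' repository; the function name `temporal_pivot_split` and the `(u, v, timestamp)` edge representation are assumptions.

```python
def temporal_pivot_split(edges, train_ratio=0.8):
    """Split a temporal edge list at a pivot time.

    Chooses the pivot so that approximately `train_ratio` of the edges
    have a timestamp earlier than or equal to it; ties at the pivot
    timestamp fall on the train side, which is why the split may only
    approximate the requested ratio.
    """
    # edges: iterable of (u, v, timestamp) tuples
    edges_sorted = sorted(edges, key=lambda e: e[2])
    cutoff = int(len(edges_sorted) * train_ratio)
    pivot_time = edges_sorted[cutoff - 1][2]
    train = [e for e in edges_sorted if e[2] <= pivot_time]
    test = [e for e in edges_sorted if e[2] > pivot_time]
    return train, test, pivot_time
```

With distinct timestamps this yields an exact 80/20 edge split; with many edges sharing the pivot timestamp, the train portion grows to absorb the ties, matching the paper's "closest possible portion" caveat.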