Dynamic Topic Models for Temporal Document Networks
Authors: Delvin Ce Zhang, Hady Lauw
ICML 2022
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Experiments on four dynamic document networks demonstrate the advantage of our models in jointly modeling document dynamics and network adjacency. The goal of the experiments is to evaluate the topics learned by our models against baselines on evaluation tasks such as document classification, link prediction, and topic analysis. |
| Researcher Affiliation | Academia | Delvin Ce Zhang and Hady W. Lauw, School of Computing and Information Systems, Singapore Management University, 80 Stamford Road, Singapore. |
| Pseudocode | Yes | Algorithm 1 Time-Aware Sinkhorn Iteration |
| Open Source Code | No | The paper does not provide concrete access to source code (specific repository link, explicit code release statement, or code in supplementary materials) for the methodology described in this paper. |
| Open Datasets | Yes | Datasets. Cora (McCallum et al., 2000)... Machine Learning (ML)... Programming Language (PL)... HEP-TH (Leskovec et al., 2005)... Web (Leskovec et al., 2009). |
| Dataset Splits | Yes | We split documents up to timestamp T (inclusive) for training (10% of these are held out for validation). |
| Hardware Specification | Yes | Experiments were done on a Tesla K80 GPU with 11441 MiB of memory. |
| Software Dependencies | No | The paper mentions using '300D GloVe embeddings' and 'Google Web 1T 5-gram Version 1' but does not specify any software dependencies (e.g., libraries, frameworks) with version numbers required to replicate the experiments. |
| Experiment Setup | Yes | For our models, η_OT = η_HP = η_p = 1 after searching in [0.5, 1, 2, 4, 10]. Dropout rate is 0.75, γ = 20, M = 5. |
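
The "Pseudocode" row refers to the paper's Algorithm 1, a Time-Aware Sinkhorn Iteration. The paper's time-aware variant is not reproduced here; the sketch below shows only a standard entropy-regularized Sinkhorn iteration in NumPy, with the cost matrix, marginals, regularization weight, and iteration count as illustrative placeholders.

```python
import numpy as np

def sinkhorn(cost, a, b, reg=0.1, n_iters=100):
    """Standard entropy-regularized Sinkhorn iteration (a generic sketch,
    not the paper's time-aware Algorithm 1).

    cost: (n, m) cost matrix; a: (n,) source marginal; b: (m,) target marginal.
    Returns the (n, m) transport plan.
    """
    K = np.exp(-cost / reg)               # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iters):
        v = b / (K.T @ u)                 # rescale columns to match b
        u = a / (K @ v)                   # rescale rows to match a
    return u[:, None] * K * v[None, :]    # diag(u) K diag(v)

# Illustrative usage with uniform marginals and a random cost matrix.
a = np.full(4, 1 / 4)
b = np.full(3, 1 / 3)
plan = sinkhorn(np.random.rand(4, 3), a, b)
```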
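The "Dataset Splits" row quotes a temporal split: documents up to timestamp T (inclusive) go to training, with 10% of those held out for validation, and later documents to testing. A minimal sketch of such a split, assuming documents paired with numeric timestamps and a user-chosen cutoff T (the function name and shuffling details are assumptions, not the authors' code):

```python
import random

def temporal_split(docs, timestamps, T, val_frac=0.10, seed=0):
    """Split documents at timestamp T (inclusive for training),
    then hold out val_frac of the training documents for validation."""
    train = [d for d, t in zip(docs, timestamps) if t <= T]
    test = [d for d, t in zip(docs, timestamps) if t > T]
    rng = random.Random(seed)
    rng.shuffle(train)
    n_val = int(len(train) * val_frac)
    return train[n_val:], train[:n_val], test
```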
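The "Experiment Setup" row lists the reported hyperparameters; collecting them into a single configuration for reference (key names are paraphrased, since the authors' code is not released):

```python
# Reported hyperparameters from the paper, gathered for reference.
# Key names are illustrative, not taken from an official implementation.
CONFIG = {
    "eta_OT": 1,       # selected from the grid [0.5, 1, 2, 4, 10]
    "eta_HP": 1,
    "eta_p": 1,
    "dropout": 0.75,
    "gamma": 20,
    "M": 5,
}
```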