Domain Adaptive Classification on Heterogeneous Information Networks
Authors: Shuwen Yang, Guojie Song, Yilun Jin, Lun Du
IJCAI 2020
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Extensive experiments on pairwise datasets endorse not only our model's performance on domain adaptive classification on HINs but also the contributions of individual components. We evaluate MuSDAC quantitatively on three pairs of networks, where MuSDAC outperforms various baselines on transferable classification. We also carry out model analysis and visualization to verify the contributions of individual model components. |
| Researcher Affiliation | Collaboration | Shuwen Yang (1), Guojie Song (1), Yilun Jin (2), and Lun Du (3); (1) Key Laboratory of Machine Perception (Ministry of Education), Peking University, China; (2) The Hong Kong University of Science and Technology, Hong Kong SAR, China; (3) Microsoft Research, China. {swyang, gjsong}@pku.edu.cn, yilun.jin@connect.ust.hk, lun.du@microsoft.com |
| Pseudocode | Yes | Algorithm 1 Heuristic Combination Sampling Algorithm |
| Open Source Code | Yes | Code available on https://github.com/PKUterran/MuSDAC |
| Open Datasets | Yes | Datasets We sample pairs of structurally different graphs respectively from ACM [Kong et al., 2012], AMiner and DBLP [Wang et al., 2019]. The explicit description of the datasets is at https://github.com/PKUterran/MuSDAC/blob/master/data/DATA.md. |
| Dataset Splits | No | The paper mentions using 'ACM', 'AMiner', and 'DBLP' datasets but does not provide specific details about train/validation/test splits (e.g., percentages, sample counts, or a standard split reference within the paper's main text). |
| Hardware Specification | No | The paper does not provide any specific hardware details such as CPU/GPU models, memory, or other compute infrastructure used for the experiments. |
| Software Dependencies | No | The paper does not list any specific software dependencies with version numbers (e.g., Python 3.x, PyTorch 1.x, CUDA x.x). |
| Experiment Setup | Yes | In MuSDAC and its variants, the dimensionality of the first and second hidden layers of the multi-channel GCN is 64 and 32 respectively, before being aggregated to 16 in the aggregated channel. The number of sampled combinations \|Z\| = M = 2N − 1. In DAC, we use 5 Gaussian kernels for MMD and γ = 10. In weighted voting, we take η = 25 and α = 0.95. (A hedged sketch of this configuration follows the table.) |
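
For orientation, the reported setup maps to roughly the configuration below; the multi-kernel MMD function is a minimal, hypothetical PyTorch sketch. Variable names and the geometric kernel bandwidth schedule are assumptions, not details taken from the paper or its repository.

```python
import torch

# Hyperparameters quoted in the Experiment Setup row; the dictionary keys are my own.
MUSDAC_CONFIG = {
    "gcn_hidden_dims": (64, 32),  # first / second hidden layers of the multi-channel GCN
    "aggregated_dim": 16,         # dimensionality after channel aggregation
    "num_mmd_kernels": 5,         # Gaussian kernels used for MMD in DAC
    "mmd_gamma": 10.0,
    "voting_eta": 25,
    "voting_alpha": 0.95,
}

def multi_kernel_gaussian_mmd(x_src, x_tgt, num_kernels=5, gamma=10.0):
    """Squared MMD between source and target embeddings using a sum of Gaussian
    kernels. The bandwidth schedule (gamma * 2**k) is an assumption; the paper
    only states that 5 kernels and gamma = 10 are used."""
    x = torch.cat([x_src, x_tgt], dim=0)
    sq_dists = torch.cdist(x, x) ** 2
    # Sum of Gaussian kernels with geometrically spaced bandwidths.
    kernel = sum(torch.exp(-sq_dists / (gamma * 2 ** k)) for k in range(num_kernels))
    n = x_src.size(0)
    # MMD^2 = E[k(s, s)] + E[k(t, t)] - 2 E[k(s, t)]
    return kernel[:n, :n].mean() + kernel[n:, n:].mean() - 2 * kernel[:n, n:].mean()
```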