DANE: Domain Adaptive Network Embedding

Authors: Yizhou Zhang, Guojie Song, Lun Du, Shuwen Yang, Yilun Jin

IJCAI 2019 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Extensive experiments reflect that the proposed framework outperforms other well-recognized network embedding baselines in cross-network domain adaptation tasks.
Researcher Affiliation | Academia | ¹School of Electronic Engineering and Computer Science, Peking University; ²Key Laboratory of Machine Perception (Ministry of Education), Peking University. {zhangyizhou2015, dulun, gjsong, swyang, yljin}@pku.edu.cn
Pseudocode | No | The paper describes the methodology using prose and mathematical equations but does not include any clearly labeled pseudocode or algorithm blocks.
Open Source Code | No | The paper does not contain any explicit statements about releasing source code or provide links to a code repository.
Open Datasets | Yes | Paper Citation Networks¹ consist of two different networks A and B, where each node is a paper... (¹ collected from Aminer database [Tang, 2016])
Dataset Splits | No | The paper describes using a source network for training and a target network for testing in a domain adaptation context. However, it does not explicitly describe a separate validation split within these networks or how hyperparameters were tuned.
Hardware Specification | No | The paper does not provide specific details about the hardware used for running the experiments, such as GPU or CPU models.
Software Dependencies | No | The paper mentions general software components such as 'L2-regularized logistic regression via SGD algorithm' and 't-SNE package' but does not specify version numbers for these or any other software dependencies (a hedged sketch of such an evaluation pipeline follows the table).
Experiment Setup | Yes | To be fair, for all methods we set the embedding dimension to 128 on Paper Citation Networks, and 32 on Co-author Networks. For methods applying negative sampling, we set negative sampling number as 5. For methods employing GCN, we use same activation function and 2-layer architecture. (A configuration sketch reflecting these settings follows the table.)
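
The Software Dependencies row notes that embeddings are evaluated with an L2-regularized logistic regression trained via SGD and visualized with t-SNE, without naming libraries or versions. Below is a minimal sketch of how such an evaluation pipeline could look; the use of scikit-learn, every parameter value, and the source-train/target-test protocol details are assumptions for illustration, not specifics confirmed by the paper.

# Hypothetical evaluation pipeline for pre-computed node embeddings.
# The paper only names "L2-regularized logistic regression via SGD algorithm"
# and a "t-SNE package"; the use of scikit-learn and every argument below is
# an assumption for illustration.
from sklearn.linear_model import SGDClassifier
from sklearn.manifold import TSNE
from sklearn.metrics import f1_score


def evaluate_embeddings(source_emb, source_labels, target_emb, target_labels, seed=0):
    """Train an L2-regularized logistic regression (via SGD) on source-network
    embeddings and report F1 scores on the target network."""
    # loss="log_loss" requires scikit-learn >= 1.1 (older releases use loss="log").
    clf = SGDClassifier(loss="log_loss", penalty="l2", alpha=1e-4,
                        max_iter=1000, random_state=seed)
    clf.fit(source_emb, source_labels)
    pred = clf.predict(target_emb)
    return {"micro_f1": f1_score(target_labels, pred, average="micro"),
            "macro_f1": f1_score(target_labels, pred, average="macro")}


def project_for_visualization(embeddings, seed=0):
    """Reduce embeddings to 2-D with t-SNE for a qualitative comparison plot."""
    return TSNE(n_components=2, random_state=seed).fit_transform(embeddings)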
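
The Experiment Setup row fixes the embedding dimension (128 for Paper Citation Networks, 32 for Co-author Networks), a negative-sampling count of 5, and a shared 2-layer GCN architecture. The sketch below shows one way those settings could be expressed as a 2-layer GCN encoder; PyTorch, the hidden widths, the input feature dimension, and the choice of ReLU are assumptions, since the paper states only the output dimensions, the negative-sampling count, and the 2-layer design.

# Hypothetical 2-layer GCN encoder reflecting the reported setup.
# PyTorch, the hidden widths, FEATURE_DIM, and ReLU are assumptions; the paper
# only states "same activation function and 2-layer architecture".
import torch
import torch.nn as nn


class TwoLayerGCN(nn.Module):
    """Minimal 2-layer GCN encoder producing node embeddings."""

    def __init__(self, in_dim: int, hidden_dim: int, embed_dim: int):
        super().__init__()
        self.w1 = nn.Linear(in_dim, hidden_dim, bias=False)
        self.w2 = nn.Linear(hidden_dim, embed_dim, bias=False)
        self.act = nn.ReLU()

    def forward(self, features: torch.Tensor, norm_adj: torch.Tensor) -> torch.Tensor:
        # norm_adj: symmetrically normalized adjacency with self-loops,
        # D^{-1/2} (A + I) D^{-1/2}, as in the standard GCN propagation rule.
        h = self.act(norm_adj @ self.w1(features))
        return norm_adj @ self.w2(h)


NEGATIVE_SAMPLES = 5   # negative-sampling number reported in the paper
FEATURE_DIM = 1000     # assumed input feature dimension (not reported)
encoder_citation = TwoLayerGCN(FEATURE_DIM, hidden_dim=256, embed_dim=128)  # Paper Citation Networks
encoder_coauthor = TwoLayerGCN(FEATURE_DIM, hidden_dim=64, embed_dim=32)    # Co-author Networks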