Disentangled Multiplex Graph Representation Learning

Authors: Yujie Mo, Yajie Lei, Jialie Shen, Xiaoshuang Shi, Heng Tao Shen, Xiaofeng Zhu

ICML 2023

Reproducibility assessment (each item lists the variable, its result, and the supporting LLM response):

Research Type: Experimental
LLM Response: "Extensive experiments verify the superiority of the proposed method in terms of different downstream tasks. ... In this section, we conduct experiments on six public datasets to evaluate the proposed method in terms of different tasks."

Researcher Affiliation: Academia
LLM Response: "1 School of Computer Science and Engineering, University of Electronic Science and Technology of China, Chengdu, China; 2 Department of Computer Science, City, University of London, London, United Kingdom; 3 Shenzhen Institute for Advanced Study, University of Electronic Science and Technology of China, Shenzhen, China."

Pseudocode: Yes
LLM Response: "Appendix A. Algorithm. This section provides the pseudo-code of the proposed method. Algorithm 1: The pseudo-code of the proposed DMG."

Open Source Code: Yes
LLM Response: "The code is released at https://github.com/YujieMo/DMG."

Open Datasets: Yes
LLM Response: "The datasets include four multiplex graph datasets and two single-view graph datasets. The multiplex graph datasets comprise two citation datasets, ACM (Wang et al., 2019) and DBLP (Wang et al., 2019), and two movie datasets, IMDB (Wang et al., 2019) and Freebase (Wang et al., 2021). The single-view graph datasets are two Amazon sales datasets, Photo and Computers (Shchur et al., 2018)."

Dataset Splits: No
LLM Response: "The paper mentions 'training process' and 'validation' in general terms and refers to the evaluation protocols of previous works, but it does not specify explicit training/validation/test splits (e.g., percentages, sample counts, or a detailed splitting methodology) for its experiments."

Hardware Specification: No
LLM Response: "The paper does not provide hardware details such as GPU/CPU models, processor types, or memory amounts used to run its experiments."

Software Dependencies: No
LLM Response: "The paper mentions the Adam optimizer and the ReLU function, as well as GCN and MLP components, but it does not provide version numbers for any software libraries, frameworks (e.g., TensorFlow, PyTorch), or programming languages used to implement or run the experiments."

Experiment Setup: Yes
LLM Response: "Table 7. Settings for the proposed DMG."

Settings               ACM    IMDB   DBLP   Freebase  Photo  Computers
D                      8      8      8      8         16     40
d                      2      2      2      2         2      4
ω                      3      3      3      3         1      1
Hidden units of g(r)   256    512    256    256       256    512
Hidden units of ϕ(r)   256    256    256    256       256    256
Hidden units of ψ(r)   256    256    256    256       256    256
Layers of p(r)         3      2      2      2         2      2
Learning rate          1e-3   1e-3   1e-3   1e-3      1e-3   1e-3
Weight decay           1e-4   1e-4   1e-4   1e-4      1e-4   1e-4
Dropout                0.1    0.1    0.1    0.1       0.1    0.1
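For anyone reproducing these experiments, the Table 7 settings above can be collected into a plain configuration dict. This is a minimal sketch: the key names (`embed`/`g_hidden`/`p_layers`, etc.) and the `config_for` helper are illustrative choices of ours, not names from the paper or the released code; only the numeric values come from Table 7.

```python
# Per-dataset hyperparameters from Table 7 of the DMG paper.
# Tuple order: D, d, omega, hidden units of g(r), phi(r), psi(r), layers of p(r).
TABLE7 = {
    "ACM":       (8,  2, 3, 256, 256, 256, 3),
    "IMDB":      (8,  2, 3, 512, 256, 256, 2),
    "DBLP":      (8,  2, 3, 256, 256, 256, 2),
    "Freebase":  (8,  2, 3, 256, 256, 256, 2),
    "Photo":     (16, 2, 1, 256, 256, 256, 2),
    "Computers": (40, 4, 1, 512, 256, 256, 2),
}

# Optimizer settings shared by all six datasets (last three rows of Table 7).
COMMON = {"lr": 1e-3, "weight_decay": 1e-4, "dropout": 0.1}


def config_for(dataset: str) -> dict:
    """Return a flat config dict for one dataset, merging shared settings."""
    D, d, omega, g_hidden, phi_hidden, psi_hidden, p_layers = TABLE7[dataset]
    return {
        "D": D,                      # dimension of the whole representation
        "d": d,                      # dimension of the common part
        "omega": omega,              # the omega setting from Table 7
        "g_hidden": g_hidden,        # hidden units of g(r)
        "phi_hidden": phi_hidden,    # hidden units of phi(r)
        "psi_hidden": psi_hidden,    # hidden units of psi(r)
        "p_layers": p_layers,        # number of layers of p(r)
        **COMMON,
    }
```

A call such as `config_for("Computers")` then yields the full setting for that dataset (e.g., D = 40, d = 4) in one place, which makes sweeping over the six datasets a simple loop.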