Factorizable Graph Convolutional Networks

Authors: Yiding Yang, Zunlei Feng, Mingli Song, Xinchao Wang

NeurIPS 2020

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "We evaluate the proposed FactorGCN both qualitatively and quantitatively on the synthetic and real-world datasets, and demonstrate that it yields truly encouraging results in terms of both disentangling and feature aggregation. ... In this section, we show the effectiveness of the proposed FactorGCN, and provide discussions on its various components as well as the sensitivity with respect to the key hyper-parameters."
Researcher Affiliation | Academia | Yiding Yang (Stevens Institute of Technology, yyang99@stevens.edu); Zunlei Feng (Zhejiang University, zunleifeng@zju.edu.cn); Mingli Song (Zhejiang University, brooksong@zju.edu.cn); Xinchao Wang (Stevens Institute of Technology, xinchao.wang@stevens.edu)
Pseudocode | No | The paper describes the architecture and its steps (Disentangling, Aggregation, Merging) in text and with an illustrative diagram (Figure 1), but does not provide structured pseudocode or an algorithm block.
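Since the paper gives only a textual description of the three steps, they can be sketched as plain-Python pseudocode. Everything below is an illustrative assumption: the function name, the data layout, and the toy per-factor edge score (a stand-in for the learned attention the paper actually uses) are not the authors' implementation.

```python
def factor_gcn_layer(features, edges, num_factors):
    """Toy sketch of one FactorGCN layer.

    features: dict mapping node id -> feature vector (list of floats).
    edges: list of undirected (u, v) node pairs.
    num_factors: number of latent factor graphs to disentangle into.
    """
    merged = {u: [] for u in features}
    for k in range(num_factors):
        # Disentangling: assign each edge a factor-specific weight.
        # (Illustrative fixed score; the paper learns these via attention.)
        scores = {(u, v): 1.0 / (k + 1) for (u, v) in edges}
        # Aggregation: weighted neighbour sum within factor graph k,
        # starting from each node's own features (self term).
        agg = {u: list(feat) for u, feat in features.items()}
        for (u, v), s in scores.items():
            agg[u] = [a + s * b for a, b in zip(agg[u], features[v])]
            agg[v] = [a + s * b for a, b in zip(agg[v], features[u])]
        # Merging: concatenate the per-factor node representations.
        for u in features:
            merged[u].extend(agg[u])
    return merged
```

With two factors, each node ends up with a concatenation of two aggregated views of its neighbourhood, one per factor graph.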
Open Source Code | Yes | Code is publicly available at https://github.com/ihollywhy/FactorGCN.PyTorch.
Open Datasets | Yes | "The second one is the ZINC dataset [31], built from molecular graphs. The third one is the Pattern dataset [31], a large-scale dataset for the node classification task. The other three are widely used graph classification datasets, including social networks (COLLAB, IMDB-B) and bioinformatics graphs (MUTAG) [32]."
Dataset Splits | Yes | "The test results are obtained using the model with the best performance on the validation set." For the other three datasets, a three-layer FactorGCN is used, and the same 10-fold evaluation protocol as [21] is adopted.
Hardware Specification | No | No specific hardware details (e.g., GPU/CPU models, memory amounts, or detailed machine specifications) used for running the experiments are provided in the paper.
Software Dependencies | No | The paper mentions PyTorch (via the GitHub repository name) but does not specify a version number. No other software dependencies with specific version numbers are provided.
Experiment Setup | Yes | "For the synthetic dataset, the Adam optimizer is used with a learning rate of 0.005, the number of training epochs is set to 80, and the weight decay is set to 5e-5. ... The weight for the loss of the discriminator in FactorGCN is set to 0.5. For the molecular dataset, the dimension of the hidden feature is set to 144 for all methods and the number of layers is set to four. The Adam optimizer is used with a learning rate of 0.002. No weight decay is used. λ of FactorGCN is set to 0.2. All the methods are trained for 500 epochs."
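The quoted hyper-parameters can be collected into per-dataset configuration dicts, which is roughly how one would wire them into a training script. The dict keys below are illustrative names chosen here, not identifiers from the released code.

```python
# Hyper-parameters quoted from the paper, gathered into config dicts.
# Key names are illustrative assumptions, not the authors' variable names.
synthetic_cfg = {
    "optimizer": "Adam",
    "lr": 0.005,
    "epochs": 80,
    "weight_decay": 5e-5,
    "discriminator_loss_weight": 0.5,  # weight on the discriminator loss
}

molecular_cfg = {
    "optimizer": "Adam",
    "lr": 0.002,
    "epochs": 500,
    "weight_decay": 0.0,   # "No weight decay is used."
    "hidden_dim": 144,     # hidden feature dimension, shared by all methods
    "num_layers": 4,
    "lambda": 0.2,         # the paper's λ for FactorGCN
}
```

Keeping the settings in one place like this makes it easy to verify a reimplementation against the paper's reported setup.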