FAITH: Few-Shot Graph Classification with Hierarchical Task Graphs
Authors: Song Wang, Yushun Dong, Xiao Huang, Chen Chen, Jundong Li
IJCAI 2022
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Extensive experiments on four prevalent few-shot graph classification datasets demonstrate the superiority of FAITH over other state-of-the-art baselines. |
| Researcher Affiliation | Academia | ¹University of Virginia, ²Hong Kong Polytechnic University; {sw3wv, yd6eb, zrh6du, jundong}@virginia.edu, xiaohuang@comp.polyu.edu.hk |
| Pseudocode | No | The paper does not contain explicit pseudocode or algorithm blocks. |
| Open Source Code | Yes | Codes and data are available at https://github.com/SongW-SW/FAITH. |
| Open Datasets | Yes | We follow the work of [Chauhan et al., 2020] to evaluate our framework on four processed graph classification datasets, Letter-high, ENZYMES, TRIANGLES and Reddit-12K. ... Codes and data are available at https://github.com/SongW-SW/FAITH. |
| Dataset Splits | Yes | We follow the setting of [Chauhan et al., 2020] to split the classes in each dataset into training classes Yt and test classes Yf. We specify K ∈ {5, 10} and Q = 10, where K is the number of labeled graph samples for each class, and Q is the number of unlabeled graph samples in each task. |
| Hardware Specification | No | The paper does not provide specific hardware details (e.g., GPU/CPU models) used for running experiments. |
| Software Dependencies | No | The paper mentions software like PyTorch, Adam, GCN, and GIN, but does not provide specific version numbers for these dependencies. |
| Experiment Setup | Yes | The dimension of GCN [Kipf and Welling, 2017] used in the hierarchical task graph is set as Ds = Dp = Dt = 300. We utilize a 5-layer GIN [Xu et al., 2019] with the hidden dimension D = 128 as the embedding model GNNe. For the model optimization, we adopt Adam [Kingma and Ba, 2015] with a learning rate of 0.001, a dropout rate of 0.5, and the loss weight α = 1. |
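The reported experiment setup can be collected into a single configuration object, which is useful when attempting a reproduction. This is a minimal sketch: the field names (`d_sample`, `gin_hidden`, etc.) are hypothetical and chosen here for readability; only the numeric values come from the paper's quoted setup.

```python
from dataclasses import dataclass


@dataclass
class FAITHConfig:
    """Hyperparameters quoted from the FAITH paper (IJCAI 2022)."""
    # GCN dimensions in the hierarchical task graph: Ds = Dp = Dt = 300
    d_sample: int = 300
    d_prototype: int = 300
    d_task: int = 300
    # Embedding model GNNe: a 5-layer GIN with hidden dimension D = 128
    gin_layers: int = 5
    gin_hidden: int = 128
    # Optimization: Adam with lr 0.001, dropout 0.5, loss weight alpha = 1
    lr: float = 1e-3
    dropout: float = 0.5
    loss_weight_alpha: float = 1.0
    # Few-shot task setting: K labeled and Q unlabeled graphs per task
    k_shot: int = 5   # K ∈ {5, 10} in the paper
    q_query: int = 10


cfg = FAITHConfig()
assert cfg.d_sample == cfg.d_prototype == cfg.d_task == 300
```

A reproduction could vary `k_shot` between the two reported values (5 and 10) while holding the remaining settings fixed, matching the paper's evaluation protocol.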