Isolation Graph Kernel
Authors: Bi-Cun Xu, Kai Ming Ting, Yuan Jiang (pp. 10487–10495)
AAAI 2021
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | The experiments have two aims: (i) Examine the role of Isolation Kernel in IGK; and (ii) Compare the relative performance of IGK and existing graph kernels in terms of classification accuracy and runtime. To achieve the first aim, the following variants of IGK (K(1,2,3)) are used in an ablation study: ... To achieve the second aim, three existing graph kernels (which performed the best in Togninalli et al. (2019)) are used in the comparison: ... The accuracy results of the comparison are shown in Table 1. |
| Researcher Affiliation | Academia | Bi-Cun Xu, Kai Ming Ting, Yuan Jiang National Key Laboratory for Novel Software Technology, Nanjing University, Nanjing 210023, China {xubc,tingkm,jiangy}@lamda.nju.edu.cn |
| Pseudocode | Yes | Algorithm 1: Generate IGK mapped dataset. Input: Dataset {G_i = (V_i, E_i), i = 1, …, n}, IK parameter ψ, WL parameter h. Output: IGK mapped dataset {Φ̂(G_i), i = 1, …, n}. |
| Open Source Code | Yes | The IGK code is available at github.com/IsolationKernel. |
| Open Datasets | Yes | We use 13 benchmark datasets commonly used to evaluate graph kernels (Kersting et al. 2016). |
| Dataset Splits | Yes | Accuracy (%) of an SVM classifier under 10-fold cross-validation. |
| Hardware Specification | Yes | The machine used in the experiments has 755G memory and Intel E5 CPU. |
| Software Dependencies | No | The paper mentions 'SVM classifiers in scikit-learn', 'Liblinear', and 'Libsvm' but does not specify their version numbers. |
| Experiment Setup | Yes | The parameter search ranges of all these methods in the experiments are given in Table 3. For IGK: h = {0, …, 7}; ψ = {2^4, …, 2^11}; t = 100. |
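Algorithm 1 above maps each graph to a finite-dimensional vector via the Isolation Kernel feature map. The paper's exact construction is not reproduced here; the following is a minimal sketch of the general idea, assuming a simplified Voronoi-partition Isolation Kernel (iNNE-style) over node vectors pooled from all graphs: each of `t` partitions is built from `psi` randomly sampled points, each node maps to a one-hot indicator of its nearest sample, and a graph's embedding Φ̂(G) is the mean of its nodes' indicators. All function names and the omission of the WL step (parameter h) are simplifications for illustration.

```python
import numpy as np

def ik_feature_map(points, pool, psi, t, rng):
    """Simplified Isolation Kernel feature map: for each of t random
    Voronoi partitions (psi centers sampled from the pooled node set),
    map each point to a one-hot vector over its nearest center."""
    n = len(points)
    phi = np.zeros((n, t * psi))
    for i in range(t):
        centers = pool[rng.choice(len(pool), size=psi, replace=False)]
        dists = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        nearest = dists.argmin(axis=1)
        phi[np.arange(n), i * psi + nearest] = 1.0
    return phi

def igk_embed(graph_node_vectors, psi=8, t=100, seed=0):
    """Map each graph (given as an array of node vectors) to the mean
    Isolation Kernel feature vector of its nodes, i.e. an approximate
    kernel mean embedding Phi_hat(G)."""
    rng = np.random.default_rng(seed)
    pool = np.vstack(graph_node_vectors)
    return np.array([ik_feature_map(X, pool, psi, t, rng).mean(axis=0)
                     for X in graph_node_vectors])
```

Because each node activates exactly one cell per partition, every node's feature vector sums to t, and so does each graph embedding; the dot product of two embeddings then estimates a similarity between the graphs' node distributions.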
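The evaluation protocol (10-fold CV with an SVM in scikit-learn) can be sketched as follows. This is not the paper's script: the embeddings and labels here are synthetic stand-ins, and only the SVM's regularization constant C is searched; in the paper the grid from Table 3 (h, ψ) would wrap this loop as well.

```python
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

# Synthetic stand-ins for precomputed IGK graph embeddings and labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 16))
y = np.array([0, 1] * 30)  # balanced binary labels for 10-fold stratified CV

# Pick the C with the best mean 10-fold CV accuracy (illustrative grid).
best_C, best_acc = max(
    ((C, cross_val_score(LinearSVC(C=C, max_iter=5000), X, y, cv=10).mean())
     for C in [0.01, 0.1, 1, 10]),
    key=lambda pair: pair[1],
)
print(f"best C={best_C}, 10-fold CV accuracy={best_acc:.3f}")
```

With real IGK embeddings, the outer loop would also iterate over h and ψ, and the reported accuracy would be the mean over the 10 folds at the best setting.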