Quantum Positional Encodings for Graph Neural Networks

Authors: Slimane Thabet, Mehdi Djellabi, Igor Olegovich Sokolov, Sachin Kasture, Louis-Paul Henry, Loic Henriet

ICML 2024

Reproducibility assessment (each entry gives the variable, the result, and the LLM response):
Research Type: Experimental
LLM response: "Empirically, we show that the performance of state-of-the-art models can be improved on standard benchmarks and large-scale datasets by computing tractable versions of quantum features. Our findings highlight the potential of leveraging quantum computing capabilities to enhance the performance of transformers in handling graph data." "Sec. 5 presents the outcomes of our numerical experiments and includes discussions of the results."
Researcher Affiliation: Collaboration
LLM response: "1 Pasqal, Massy, France; 2 Sorbonne University, Paris, France."
Pseudocode: Yes
LLM response: "A.2. Algorithms: In this appendix, we detail the algorithms for the approach described in this paper. Firstly, we give a generic algorithm that allows the simulation of quantum features on a classical computer, according to an arbitrary choice of Hamiltonian and quantum observables." The appendix presents this as "Algorithm 1: Positional encoding with a generic classical simulation of a Hamiltonian evolution"; a hedged sketch of such a procedure is given below this list.
Open Source Code: Yes
LLM response: "The code to run all the experiments is available at https://github.com/pasqal-io/quantum-encodings-gnn."
Open Datasets: Yes
LLM response: "We benchmark our method on 7 datasets from (Dwivedi et al., 2020)... We also benchmark our methods on large-scale datasets, ZINC-full (a bigger version of ZINC (Irwin et al., 2012)) and PCQM4MV2 (Hu et al., 2021)." Table 5 of the paper gives an overview of the graph learning datasets involved in this work (Dwivedi et al., 2020; Irwin et al., 2012; Hu et al., 2021); a dataset-loading sketch is given below this list.
Dataset Splits: Yes
LLM response: "In the same way as (Ma et al., 2023), we perform the experiments on the standard train/val/test splits."
Hardware Specification: No
LLM response: The paper does not provide specific hardware details, such as the GPU or CPU models used for the experiments.
Software Dependencies: No
LLM response: The paper mentions building on the codebase of (Ma et al., 2023), which is itself built on (Rampášek et al., 2022), but does not provide specific software dependencies with version numbers.
Experiment Setup: Yes
LLM response: "All hyperparameters for this section are reported in the appendix in Tables 2 and 3."
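
The paper's Algorithm 1 computes positional encodings by classically simulating a Hamiltonian evolution and measuring quantum observables, for an arbitrary choice of both. The sketch below is a minimal illustration of that idea, not the authors' exact procedure: it assumes the graph adjacency matrix as the single-particle Hamiltonian, a continuous-time quantum walk e^{-iHt}, and node-wise return probabilities |<j| e^{-iHt} |j>|^2 at a few evolution times as the observables. All three choices are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import expm

def quantum_positional_encoding(adj: np.ndarray, times=(0.5, 1.0, 2.0, 4.0)) -> np.ndarray:
    """Toy quantum-walk positional encoding for one graph.

    adj:   (n, n) symmetric adjacency matrix.
    times: evolution times at which observables are recorded.

    Returns an (n, len(times)) feature matrix whose j-th row holds the
    return probabilities |<j| e^{-iHt} |j>|^2 of a continuous-time quantum
    walk started at node j. Taking H = adjacency matrix and measuring
    return probabilities are illustrative assumptions; the paper's generic
    algorithm allows any Hamiltonian and set of observables.
    """
    hamiltonian = adj.astype(complex)
    features = np.empty((adj.shape[0], len(times)))
    for k, t in enumerate(times):
        unitary = expm(-1j * t * hamiltonian)   # e^{-iHt}
        features[:, k] = np.abs(np.diag(unitary)) ** 2
    return features

# Example: 4-node cycle graph.
adj = np.array([[0, 1, 0, 1],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [1, 0, 1, 0]], dtype=float)
print(quantum_positional_encoding(adj))
```

On the 4-node cycle every node is structurally equivalent, so all rows coincide; on less symmetric graphs the rows differ across nodes and can serve as structure-aware positional features.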
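
The benchmark datasets are available through standard libraries with their standard splits: ZINC (including the full ~250k-molecule version) ships with PyTorch Geometric, and PCQM4Mv2 with the OGB package. The snippet below is one way to obtain them, offered as an assumption for illustration, since the paper handles data through the codebase of (Ma et al., 2023) rather than documenting loaders.

```python
# Hypothetical loading of the benchmark datasets with their standard splits.
from torch_geometric.datasets import ZINC
from ogb.lsc import PygPCQM4Mv2Dataset

# subset=False yields ZINC-full (~250k molecules); subset=True the 12k subset.
zinc_train = ZINC(root="data/ZINC", subset=False, split="train")
zinc_val = ZINC(root="data/ZINC", subset=False, split="val")

# PCQM4Mv2 exposes its standard split through get_idx_split().
pcqm = PygPCQM4Mv2Dataset(root="data/PCQM4Mv2")
split_idx = pcqm.get_idx_split()  # keys: "train", "valid", "test-dev", "test-challenge"
train_set = pcqm[split_idx["train"]]
```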