Deep Hierarchical Graph Convolution for Election Prediction from Geospatial Census Data
Authors: Mike Li, Elija Perrier, Chang Xu
AAAI 2019
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | In this paper, we demonstrate the utility of GCNNs for GIS analysis via a multi-graph hierarchical spatial-filter GCNN model, applied in the context of GIS systems to predict election outcomes using socio-economic features drawn from the 2016 Australian Census. We report a marked improvement in performance accuracy of Hierarchical GCNNs over benchmark generalised linear models and standard GCNNs, especially in semi-supervised tasks. Our experiments were designed to test the performance of the Hierarchical GCNN's predictions of Labor 2PP by SA1 using Census features, by comparison with a standard GCNN, a multi-layer perceptron network (MLP), and the GLMs. |
| Researcher Affiliation | Academia | 1 Centre for Complex Systems, The University of Sydney, Sydney, Australia 2 Centre for Quantum Software and Information, The University of Technology, Sydney, Australia 3 UBTECH Sydney AI Centre, School of Computer Science, FEIT, University of Sydney, Australia |
| Pseudocode | No | The paper does not contain any pseudocode or algorithm blocks. It describes the mathematical formulations and architecture variants in text and equations. |
| Open Source Code | Yes | Code for the models is provided at https://github.com/mili7522/Hierarchical-GCNN. |
| Open Datasets | No | The paper mentions using 'socio-economic features drawn from the 2016 Australian Census' and 'Australian election results (from the Australian Electoral Commission (AEC))' but does not provide concrete access information like a link, DOI, or a formal citation with authors and year for these specific datasets to be considered publicly available for reproduction outside of their own systems. While they are government data sources, direct access details for the *exact* data used are not provided in the paper itself. |
| Dataset Splits | Yes | Each experiment included both a standard supervised training problem with five-fold cross validation (80% training, 20% test) and two semi-supervised tasks with (i) 20% training, 80% test and (ii) 10% training, 90% test. |
| Hardware Specification | No | The paper does not provide specific hardware details (e.g., CPU, GPU models, or memory) used for running the experiments. It only mentions general settings like 'neurons per layer'. |
| Software Dependencies | No | The paper mentions 'Sci-Kit Learn' and 'open source QGIS software' but does not provide specific version numbers for these or any other software dependencies. |
| Experiment Setup | Yes | A standard setting of two graph convolutional layers with 128 neurons each (except the last layer) was used throughout the tests, although Fig 4 explores the performance under different settings for these hyperparameters. The ReLU activation function and the ADAM optimiser were used. |
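The reported setup (two graph convolutional layers, 128 hidden neurons, ReLU) can be sketched as a minimal forward pass. This is an illustrative assumption, not the paper's hierarchical architecture: it uses a Kipf-and-Welling-style propagation rule H' = ReLU(Â H W) with Â the symmetrically normalised adjacency with self-loops, and a linear last layer as would suit a 2PP regression target.

```python
import numpy as np

def normalise_adjacency(A):
    """Compute A_hat = D^{-1/2} (A + I) D^{-1/2} (assumed propagation rule)."""
    A_self = A + np.eye(A.shape[0])
    d = A_self.sum(axis=1)                      # degrees incl. self-loops, always > 0
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_self @ D_inv_sqrt

def gcn_forward(A, X, W1, W2):
    """Two-layer GCN: hidden layer with ReLU, linear output layer."""
    A_norm = normalise_adjacency(A)
    H = np.maximum(A_norm @ X @ W1, 0.0)        # ReLU activation
    return A_norm @ H @ W2                      # one scalar prediction per node

# Toy example: 6 nodes, 4 Census-style features, 128 hidden units
rng = np.random.default_rng(0)
n_nodes, n_features, hidden = 6, 4, 128
A = (rng.random((n_nodes, n_nodes)) < 0.3).astype(float)
A = np.maximum(A, A.T)                          # symmetric (undirected) adjacency
X = rng.standard_normal((n_nodes, n_features))
W1 = rng.standard_normal((n_features, hidden)) * 0.1
W2 = rng.standard_normal((hidden, 1)) * 0.1

pred = gcn_forward(A, X, W1, W2)
print(pred.shape)  # (6, 1): one prediction per spatial unit
```

Training with the ADAM optimiser and five-fold cross-validation, as described in the table, would sit on top of this forward pass; node names and layer sizes here are illustrative only.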