Learning Graph Cellular Automata

Authors: Daniele Grattarola, Lorenzo Livi, Cesare Alippi

NeurIPS 2021

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | We now consider different experiments aimed at showcasing the capabilities of our GNCA architecture. We take inspiration from the literature on learning lattice-based CA to design three experimental settings with different goals.
Researcher Affiliation | Academia | Daniele Grattarola, Università della Svizzera italiana (grattd@usi.ch); Lorenzo Livi, University of Manitoba; Cesare Alippi, Università della Svizzera italiana and Politecnico di Milano.
Pseudocode | Yes | Algorithm 1: Pseudo-code for Boids [7].
Open Source Code | Yes | Code is available online (see supplementary material).
Open Datasets | Yes | We consider several geometric graphs available in the PyGSP library [30] (BSD 3-Clause license).
Dataset Splits | Yes | We generate 300 trajectories for training, 30 for validation and early stopping, and 30 for testing the final performance of the GNCA.
Hardware Specification | No | The paper states 'See supplementary material' for compute and resource details, but these specifications are not present in the main paper.
Software Dependencies | No | The paper mentions the PyGSP library [30] but does not provide specific version numbers for this or any other software dependencies.
Experiment Setup | Yes | We generate training examples for the model by sampling mini-batches of 32 random binary states [S(1), ..., S(32)], S(k) ∈ S^n, and we train the GNCA by minimising the negative log-likelihood between the true successor states τ(S(k)) and the predicted next states τ_θ(S(k)).
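The Pseudocode row points to Algorithm 1, the paper's pseudo-code for Boids [7]. That algorithm is not reproduced in this card; as a rough point of reference, below is a minimal NumPy sketch of a standard Boids update (alignment, cohesion, and separation within a fixed neighbourhood radius). All radii, weights, and the agent count are illustrative assumptions, not the paper's values.

    import numpy as np

    def boids_step(pos, vel, radius=0.1, w_align=0.1, w_cohere=0.05,
                   w_separate=0.2, dt=0.1, max_speed=0.05):
        # One standard Boids update; all constants here are illustrative.
        new_vel = vel.copy()
        for i in range(len(pos)):
            d = np.linalg.norm(pos - pos[i], axis=1)
            mask = (d < radius) & (d > 0)                    # neighbours within the radius
            if mask.any():
                align = vel[mask].mean(axis=0) - vel[i]      # match neighbours' velocity
                cohere = pos[mask].mean(axis=0) - pos[i]     # steer towards their centre
                separate = (pos[i] - pos[mask]).sum(axis=0)  # steer away from crowding
                new_vel[i] += w_align * align + w_cohere * cohere + w_separate * separate
            speed = np.linalg.norm(new_vel[i])
            if speed > max_speed:                            # cap the speed, as Boids implementations usually do
                new_vel[i] *= max_speed / speed
        return pos + dt * new_vel, new_vel

    # Example: 100 agents with random positions and small random velocities.
    rng = np.random.default_rng(0)
    pos, vel = boids_step(rng.random((100, 2)), (rng.random((100, 2)) - 0.5) * 0.05)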
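The geometric graphs quoted in the Open Datasets row come from the PyGSP library [30]. A minimal loading sketch follows; the particular graph classes are an illustrative selection from the library, not necessarily the exact set used in the paper.

    from pygsp import graphs

    # A few geometric graphs shipped with PyGSP (illustrative selection).
    candidates = {
        "grid": graphs.Grid2d(N1=16, N2=16),
        "bunny": graphs.Bunny(),
        "minnesota": graphs.Minnesota(),
    }

    for name, G in candidates.items():
        # G.W is the sparse weighted adjacency matrix, G.N the number of nodes,
        # and G.coords the node coordinates that give the graph its geometry.
        print(name, G.N, G.W.shape, G.coords.shape)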
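The Dataset Splits row reports 300 trajectories for training, 30 for validation and early stopping, and 30 for testing. A minimal sketch of such a split, assuming trajectories are obtained by rolling out some transition rule from random initial states (generate_trajectory, init_sampler, and the step count are hypothetical helpers, not the authors' code):

    import numpy as np

    def generate_trajectory(transition, init_state, steps=20):
        # Roll out a transition rule from an initial state for an assumed number of steps.
        states = [init_state]
        for _ in range(steps):
            states.append(transition(states[-1]))
        return np.stack(states)

    def make_splits(transition, init_sampler, n_train=300, n_val=30, n_test=30):
        # 300 trajectories for training, 30 for validation/early stopping, 30 for testing.
        make = lambda n: [generate_trajectory(transition, init_sampler()) for _ in range(n)]
        return make(n_train), make(n_val), make(n_test)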
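The Experiment Setup row quotes the training objective: mini-batches of 32 random binary states, trained by minimising the negative log-likelihood between the true successors τ(S(k)) and the predictions τ_θ(S(k)). Below is a hedged sketch of one such step, assuming a PyTorch-style GNCA model gnca(states, edge_index) that returns per-node logits; the model interface, tensor shapes, and optimiser are assumptions, and for binary states the negative log-likelihood is written as binary cross-entropy on the logits.

    import torch
    import torch.nn.functional as F

    def train_step(gnca, transition, edge_index, optimizer, n_nodes, batch_size=32):
        # Mini-batch of 32 random binary states, one scalar state per node (assumed shape).
        S = torch.randint(0, 2, (batch_size, n_nodes, 1), dtype=torch.float32)
        with torch.no_grad():
            target = transition(S)           # true successor states tau(S)
        logits = gnca(S, edge_index)         # predicted next-state logits tau_theta(S)
        # Negative log-likelihood of binary states = binary cross-entropy on the logits.
        loss = F.binary_cross_entropy_with_logits(logits, target)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        return loss.item()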