Learning Graphical State Transitions

Authors: Daniel D. Johnson

ICLR 2017

For each reproducibility variable below, the assessed result is followed by the supporting LLM response.
Research Type: Experimental
LLM Response: "5 EXPERIMENTS: I evaluated the GGT-NN model on the bAbI tasks, a set of simple natural-language tasks, where each task is structured as a sequence of sentences followed by a query (Weston et al., 2016). ... Results are shown in Tables 1 and 2."
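For context on the task format quoted above: each bAbI example is a short story followed by a question. The lines below follow the layout of the released data (numbered statements, then a tab-separated question, answer, and supporting-fact ID); the particular sentences are illustrative, in the style of task 1.

    1 Mary moved to the bathroom.
    2 John went to the hallway.
    3 Where is John? 	hallway	2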
Researcher Affiliation: Academia
LLM Response: "Daniel D. Johnson, Department of Computer Science, Harvey Mudd College, 301 Platt Boulevard, ddjohnson@hmc.edu"
Pseudocode: Yes
LLM Response: "Algorithm 1: Graph Transformation Pseudocode"
Open Source Code: Yes
LLM Response: "The code for each transformation, and for the GGT-NN model itself, is available at https://github.com/hexahedria/gated-graph-transformer-network."
Open Datasets: Yes
LLM Response: "I evaluated the GGT-NN model on the bAbI tasks, a set of simple natural-language tasks, where each task is structured as a sequence of sentences followed by a query (Weston et al., 2016). ... The first task used was a 1-dimensional cellular automaton, specifically the binary cellular automaton known as Rule 30 (Wolfram, 2002)."
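As a pointer to what the Rule 30 data looks like, here is a minimal Python sketch (not taken from the paper's code; the width, seeding, number of steps, and zero-padded boundaries are assumptions) that generates successive rows of the Rule 30 automaton:

    # Rule 30 update: new cell = left XOR (center OR right).
    def rule30_step(row):
        padded = [0] + row + [0]          # assumed zero-padded boundaries
        return [padded[i - 1] ^ (padded[i] | padded[i + 1])
                for i in range(1, len(padded) - 1)]

    width = 21                            # illustrative width
    row = [0] * width
    row[width // 2] = 1                   # single live cell in the center
    history = [row]
    for _ in range(10):                   # 10 illustrative timesteps
        row = rule30_step(row)
        history.append(row)

    for r in history:
        print("".join(".#"[c] for c in r))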
Dataset Splits: No
LLM Response: No explicit training/validation/test split details (e.g., percentages, absolute validation-set counts, or stratification methods) are provided. The paper states: "I trained two versions of the GGT-NN model for each task: one with and one without direct reference. ... The GGT-NN model was trained on 1000 examples of the Rule 30 automaton ... and 20,000 examples of Turing machines ..."
Hardware Specification: No
LLM Response: No specific hardware details (e.g., CPU/GPU models, memory, or core counts) used for the experiments are mentioned. The acknowledgments state: "I would like to thank Harvey Mudd College for computing resources. I would also like to thank the developers of the Theano library, which I used to run my experiments. This work used the Extreme Science and Engineering Discovery Environment (XSEDE), which is supported by National Science Foundation grant number ACI-1053575."
Software Dependencies: No
LLM Response: No version numbers are given for any software dependencies. The acknowledgments mention: "I would like to thank the developers of the Theano library, which I used to run my experiments."
Experiment Setup: Yes
LLM Response: "Depending on the configuration of the transformations, a GGT-NN can take textual or graph-structured input, and produce textual or graph-structured output. ... Depending on the task, direct reference updates and per-sentence propagation can be enabled or disabled. The output function f_output will depend on the specific type of answer desired. If the answer is a single word, f_output can be a multilayer perceptron followed by a softmax operation. If the answer is a sequence of words, f_output can use a recurrent network (such as a GRU) to produce a sequence of outputs."
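The two f_output options quoted above can be sketched concretely. The snippet below is a hedged illustration in PyTorch (the paper itself used Theano; the layer sizes, class names, and sequence-decoding scheme are assumptions, not the author's implementation):

    import torch
    import torch.nn as nn

    class SingleWordOutput(nn.Module):
        """f_output for single-word answers: MLP followed by a softmax."""
        def __init__(self, hidden_size, vocab_size):
            super().__init__()
            self.mlp = nn.Sequential(
                nn.Linear(hidden_size, hidden_size), nn.Tanh(),
                nn.Linear(hidden_size, vocab_size))

        def forward(self, h):                  # h: (batch, hidden_size)
            return torch.softmax(self.mlp(h), dim=-1)

    class SequenceOutput(nn.Module):
        """f_output for multi-word answers: a GRU emits one word per step."""
        def __init__(self, hidden_size, vocab_size, max_len=8):
            super().__init__()
            self.gru = nn.GRUCell(hidden_size, hidden_size)
            self.proj = nn.Linear(hidden_size, vocab_size)
            self.max_len = max_len             # assumed fixed answer length

        def forward(self, h):                  # h: (batch, hidden_size)
            state, outputs = h, []
            inp = torch.zeros_like(h)          # assumed fixed start input
            for _ in range(self.max_len):
                state = self.gru(inp, state)
                outputs.append(torch.softmax(self.proj(state), dim=-1))
                inp = state                    # feed state forward (one simple choice)
            return torch.stack(outputs, dim=1) # (batch, max_len, vocab_size)

In practice a sequence decoder would often feed back an embedding of the previously predicted token rather than the hidden state; the quoted description leaves that choice open, so the sketch picks the simplest option.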