Human spatiotemporal pattern learning as probabilistic program synthesis

Authors: Tracey Mills, Josh Tenenbaum, Samuel Cheyette

NeurIPS 2023

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Here, we experimentally test human learning in the domain of structured 2-dimensional patterns, using a task in which participants repeatedly predicted where a dot would move based on its previous trajectory. We evaluate human performance against standard parametric and non-parametric time-series models, as well as two Bayesian program synthesis models...
Researcher Affiliation | Academia | Tracey E. Mills (MIT, temills@mit.edu); Joshua B. Tenenbaum (MIT, jbt@mit.edu); Samuel J. Cheyette (MIT, cheyette@mit.edu)
Pseudocode | No | Table 1 shows the full set of primitives in the grammar.
  Operations (o): Controls: Repeat(o, v), Continue(o), Concat(o, o), Subprogram(o); Actions: Move(), Stay(), Turn(v), Accelerate(v), Change X(v), Change Y(v), Set X(v), Set Y(v).
  Values (v): State variables: θ (current angle), s (current speed), x (current x-position), y (current y-position), t (current time), n (function calls); Numbers: N (naturals), R (reals); Expressions: Plus(v,v), Minus(v,v), Times(v,v), Divide(v,v), Mod(v,v), Sin(v).
  This table lists primitives but is not formatted as pseudocode or an algorithm block.
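To make the operation/value split of the grammar concrete, the primitives above can be sketched as a recursive probabilistic sampler. This is an illustrative Python toy, not the authors' Gen implementation: the choice probabilities, depth bound, and identifier spellings (e.g. `ChangeX` for "Change X") are assumptions.

```python
import random

# Illustrative encoding of the Table 1 primitives as a recursive grammar.
# The operation/value split mirrors the table; the uniform choice
# probabilities and depth cap are assumptions, not the paper's prior.

OPERATIONS = ["Repeat", "Continue", "Concat", "Subprogram",
              "Move", "Stay", "Turn", "Accelerate",
              "ChangeX", "ChangeY", "SetX", "SetY"]
STATE_VARS = ["theta", "s", "x", "y", "t", "n"]
EXPRESSIONS = ["Plus", "Minus", "Times", "Divide", "Mod", "Sin"]

def sample_value(depth=0, max_depth=3):
    """Sample a value (v): a state variable, a number, or an expression."""
    if depth >= max_depth or random.random() < 0.5:
        # Terminals: a state variable or a random number.
        if random.random() < 0.5:
            return random.choice(STATE_VARS)
        return round(random.uniform(0, 10), 2)
    op = random.choice(EXPRESSIONS)
    if op == "Sin":  # the only unary expression in the table
        return (op, sample_value(depth + 1, max_depth))
    return (op, sample_value(depth + 1, max_depth),
            sample_value(depth + 1, max_depth))

def sample_operation(depth=0, max_depth=3):
    """Sample an operation (o): a control combinator or a primitive action."""
    controls = ["Repeat", "Continue", "Concat", "Subprogram"]
    actions = ["Move", "Stay", "Turn", "Accelerate",
               "ChangeX", "ChangeY", "SetX", "SetY"]
    if depth >= max_depth or random.random() < 0.6:
        a = random.choice(actions)
        if a in ("Move", "Stay"):  # nullary actions
            return (a,)
        return (a, sample_value(depth + 1, max_depth))
    c = random.choice(controls)
    if c == "Repeat":
        return (c, sample_operation(depth + 1, max_depth),
                sample_value(depth + 1, max_depth))
    if c == "Concat":
        return (c, sample_operation(depth + 1, max_depth),
                sample_operation(depth + 1, max_depth))
    return (c, sample_operation(depth + 1, max_depth))

random.seed(0)
program = sample_operation()
print(program)
```

A sampler like this generates candidate motion programs; in the paper's setup, inference over such programs is what the SMC machinery searches over.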
Open Source Code | No | No explicit statement or link for open-source code release by the authors was found in the paper.
Open Datasets | Yes | The set of all sequences is shown in full in Figure S3 in the Supplementary Materials.
Dataset Splits | No | No explicit description of traditional training, validation, and test dataset splits (e.g., percentages, sample counts, or specific predefined splits) for model development was found. The models learn sequentially from observed points in the sequence.
Hardware Specification | No | No specific hardware details (e.g., GPU/CPU models, memory specifications, or cloud instance types) used for running experiments were mentioned in the paper.
Software Dependencies | No | Each model was implemented in Gen [21] using Sequential Monte Carlo (SMC) with Markov Chain Monte Carlo (MCMC) rejuvenation steps. ... We used R's built-in optim function for fitting [24]. While specific tools are named, their version numbers are not provided.
Experiment Setup | Yes | Specifically, all models run SMC with 20 particles which, after an additional observation becomes available, are resampled if their effective sample size falls below 10. There are then 100,000 rejuvenation steps on the inferred hypothesis (including noise parameters) for each particle.
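The SMC loop described in that setup (20 particles, resampling when the effective sample size drops below 10) can be sketched generically. This Python skeleton is an assumption-laden illustration, not the authors' Gen code: the hypothesis proposal, likelihood, and 100,000-step MCMC rejuvenation are stubbed out as caller-supplied functions, and the toy Gaussian demo at the end is invented for demonstration.

```python
import numpy as np

# Generic SMC skeleton matching the reported setup: 20 particles,
# resample when effective sample size (ESS) < 10. Model-specific pieces
# (hypothesis prior, likelihood, MCMC rejuvenation) are passed in.

N_PARTICLES = 20
ESS_THRESHOLD = 10.0

def effective_sample_size(log_weights):
    """ESS = 1 / sum(w_i^2) for normalized weights w."""
    w = np.exp(log_weights - log_weights.max())
    w /= w.sum()
    return 1.0 / np.sum(w ** 2)

def smc(observations, init_particle, log_likelihood, rejuvenate):
    particles = [init_particle() for _ in range(N_PARTICLES)]
    log_w = np.zeros(N_PARTICLES)
    for obs in observations:
        # Reweight each particle by the likelihood of the new observation.
        log_w += np.array([log_likelihood(p, obs) for p in particles])
        if effective_sample_size(log_w) < ESS_THRESHOLD:
            # Multinomial resampling; weights reset to uniform afterwards.
            w = np.exp(log_w - log_w.max())
            idx = np.random.choice(N_PARTICLES, size=N_PARTICLES,
                                   p=w / w.sum())
            particles = [particles[i] for i in idx]
            log_w = np.zeros(N_PARTICLES)
        # MCMC rejuvenation of each particle's hypothesis (stubbed here;
        # the paper reports 100,000 steps per particle).
        particles = [rejuvenate(p, obs) for p in particles]
    return particles, log_w

# Toy demonstration: particles are candidate means of a Gaussian.
rng = np.random.default_rng(0)
data = rng.normal(3.0, 1.0, size=30)
parts, lw = smc(data,
                init_particle=lambda: rng.normal(0.0, 5.0),
                log_likelihood=lambda m, y: -0.5 * (y - m) ** 2,
                rejuvenate=lambda m, y: m)  # no-op stand-in for MCMC
print(np.mean(parts))
```

With a no-op rejuvenation, only resampling concentrates the particle set; the real models instead move each particle through MCMC over program hypotheses and noise parameters after every observation.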