Online Variational Sequential Monte Carlo

Authors: Alessandro Mastrototaro, Jimmy Olsson

ICML 2024

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "Finally, we illustrate OVSMC numerically on a number of classical SSM and more complex generative models, for which the method exhibits fast parameter learning and efficient adaptation of the particle proposal kernel. In the same numerical study, we also show that OVSMC is a strong challenger of VSMC on batch problems."
Researcher Affiliation | Academia | "Alessandro Mastrototaro¹, Jimmy Olsson¹ — ¹Department of Mathematics, KTH Royal Institute of Technology, Stockholm, Sweden."
Pseudocode | Yes | "A pseudocode for our algorithm, which we refer to as online variational SMC (OVSMC), is displayed in Algorithm 2."
Open Source Code | Yes | "The Python code may be found at https://bitbucket.org/amastrot/ovsmc."
Open Datasets | No | The paper describes generating its own data for experiments (e.g., "generated data under A = 0.8, B = 1 and Su = 0.5" and "produced a long and partially observable video sequence"), but does not provide concrete access information (link, DOI, formal citation) to make these datasets publicly available or identify them as open datasets.
Dataset Splits | No | The paper describes processing data in an online fashion but does not specify exact train/validation/test splits, percentages, or sample counts for reproducibility.
Hardware Specification | Yes | "All the experiments are run on an Apple MacBook Pro M1 2020, memory 8 GB."
Software Dependencies | No | The paper names "the ADAM optimizer (Kingma & Ba, 2015) in TensorFlow 2" but does not provide a complete, version-pinned list of software dependencies.
Experiment Setup | Yes | "We consider two cases for Sv ∈ {0.2, 1.2}, corresponding to informative and more non-informative observations, respectively. ... We let rλ(· | xt, yt+1) be N(µλ(xt, yt+1), σ²λ(xt, yt+1)), where µλ and σ²λ are two distinct neural networks with one dense hidden layer having three and two nodes, respectively, and relu activation functions."
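The proposal kernel described in the setup row can be sketched in a few lines. This is a minimal NumPy illustration, not the authors' TensorFlow implementation: the network names (`mu_net`, `logvar_net`), the random weight initialization, and the choice to parameterize the log-variance (so σ² stays positive) are assumptions made here for the sketch; only the architecture (two distinct one-hidden-layer MLPs with three and two ReLU units, producing the mean and variance of a Gaussian proposal over x_{t+1} given x_t and y_{t+1}) follows the paper's description.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

class MLP:
    """One dense hidden layer with ReLU activation, scalar output."""
    def __init__(self, n_in, n_hidden, rng):
        self.W1 = rng.normal(scale=0.5, size=(n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(scale=0.5, size=(n_hidden, 1))
        self.b2 = np.zeros(1)

    def __call__(self, z):
        return (relu(z @ self.W1 + self.b1) @ self.W2 + self.b2)[..., 0]

# Two distinct networks map (x_t, y_{t+1}) to the proposal mean and
# log-variance; hidden widths 3 and 2 follow the paper's description.
mu_net = MLP(2, 3, rng)
logvar_net = MLP(2, 2, rng)  # log-variance parameterization is an assumption

def propose(x_t, y_next, rng):
    """Sample x_{t+1} ~ N(mu(x_t, y_{t+1}), sigma^2(x_t, y_{t+1}))."""
    z = np.stack([x_t, y_next], axis=-1)       # network input (x_t, y_{t+1})
    mu = mu_net(z)
    sigma = np.exp(0.5 * logvar_net(z))        # sigma = exp(log_var / 2) > 0
    return mu + sigma * rng.normal(size=mu.shape)

# Propagate N = 100 particles one step given a new observation y_{t+1} = 0.3.
particles = rng.normal(size=100)
new_particles = propose(particles, np.full(100, 0.3), rng)
print(new_particles.shape)  # (100,)
```

In the actual method the weights λ of both networks would be updated online via stochastic gradients of the variational objective, rather than held fixed as in this sketch.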