Particle Flow Bayes’ Rule
Authors: Xinshi Chen, Hanjun Dai, Le Song
ICML 2019
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | 4. Experiments We conduct experiments on multivariate Gaussian model, hidden Markov model and Bayesian logistic regression to demonstrate the generalization ability of PFBR and also its accuracy for posterior estimation. |
| Researcher Affiliation | Collaboration | 1School of Mathematics, 2School of Computational Science and Engineering, Georgia Institute of Technology, Atlanta, Georgia, USA. 3Ant Financial, Hangzhou, China. |
| Pseudocode | Yes | Algorithm 1 Overall Learning Algorithm |
| Open Source Code | No | The paper does not contain any explicit statement about releasing source code for the described methodology or a link to a code repository. |
| Open Datasets | Yes | Bayesian Logistic Regression (BLR). We consider logistic regression for digits classification on the MNIST8M 8 vs. 6 dataset which contains about 1.6M training samples and 1932 testing samples. |
| Dataset Splits | No | The paper mentions 'Perform a validation step on D_val' in Algorithm 1, but does not provide specific details on the size or methodology of the validation split for any of its experiments. |
| Hardware Specification | No | The paper mentions 'gpu' in Table 1 caption, but does not provide specific details such as exact GPU/CPU models, processor types, or memory amounts used for running experiments. |
| Software Dependencies | No | The paper does not provide specific software dependencies or library versions (e.g., Python 3.8, PyTorch 1.9) required to replicate the experiments. |
| Experiment Setup | Yes | Since we use a batch size of 128 and consider 10 stages, the first gradient step of our method starts after around 10^3 samples are visited. [...] In our experiment, we use µx = 0, Σx = I and Σo = 3I. [...] We use 256 obtained particles as samples from p(x|Om) and compare it with true posteriors. |
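The multivariate Gaussian setup quoted in the Experiment Setup row (prior µx = 0, Σx = I, observation noise Σo = 3I) is one where the true posterior is available in closed form, which is what makes the comparison against "true posteriors" possible. The sketch below, an illustration assuming the standard conjugate model x ~ N(µx, Σx), o|x ~ N(x, Σo) (the paper's code is not released), shows how such a reference posterior can be computed for checking particle samples:

```python
import numpy as np

def gaussian_posterior(obs, mu_x, sigma_x, sigma_o):
    """Closed-form posterior p(x | o_1..o_m) for the conjugate Gaussian model.

    obs: (m, d) array of observations; sigma_x, sigma_o: (d, d) covariances.
    Returns the posterior mean (d,) and covariance (d, d).
    """
    m = obs.shape[0]
    # Precision of the posterior: prior precision plus m observation precisions.
    prec = np.linalg.inv(sigma_x) + m * np.linalg.inv(sigma_o)
    cov = np.linalg.inv(prec)
    mean = cov @ (np.linalg.inv(sigma_x) @ mu_x
                  + np.linalg.inv(sigma_o) @ obs.sum(axis=0))
    return mean, cov

# Hypothetical usage with the paper's stated parameters (d chosen arbitrarily).
d = 2
rng = np.random.default_rng(0)
mu_x, sigma_x, sigma_o = np.zeros(d), np.eye(d), 3.0 * np.eye(d)
obs = rng.normal(size=(5, d))  # m = 5 synthetic observations
mean, cov = gaussian_posterior(obs, mu_x, sigma_x, sigma_o)
# With Sigma_x = I and Sigma_o = 3I, cov reduces to I / (1 + m/3).
```

A set of 256 particles produced by a filtering method could then be compared against `mean` and `cov` (e.g. via moment matching or KL divergence between Gaussians) to quantify posterior-estimation accuracy.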