Particle-based Variational Inference with Preconditioned Functional Gradient Flow
Authors: Hanze Dong, Xi Wang, LIN Yong, Tong Zhang
ICLR 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | To validate the effectiveness of our algorithm, we have conducted experiments on both synthetic and real datasets. |
| Researcher Affiliation | Academia | Department of Mathematics, HKUST; Department of Computer Science and Engineering, HKUST; College of Information and Computer Science, UMass Amherst |
| Pseudocode | Yes | Algorithm 1 PFG: Preconditioned Functional Gradient Flow |
| Open Source Code | No | The paper does not provide any explicit statement about releasing source code or a link to a code repository. |
| Open Datasets | Yes | For logistic regression, we conduct Bayesian logistic regression for binary classification task on Sonar and Australian datasets (Dua & Graff, 2017). ... We use two-layer networks with 50 hidden units (100 for Year dataset, 128 for MNIST) and ReLU activation function; |
| Dataset Splits | Yes | The parameters are chosen by validation. More detailed settings are provided in the Appendix. |
| Hardware Specification | Yes | All experiments are conducted on Python 3.7 with NVIDIA 2080 Ti. |
| Software Dependencies | Yes | All experiments are conducted on Python 3.7 with NVIDIA 2080 Ti. Particularly, we use PyTorch 1.9 to build models. Besides, Numpy, Scipy, Sklearn, Matplotlib, Pillow are used in the models. |
| Experiment Setup | Yes | Without special declarations, we use parametric two-layer neural networks with Sigmoid activation as our function class. To approximate H in real datasets, we use the approximated diagonal Hessian matrix Ĥ, and choose H = Ĥ^α, where α ∈ {0, 0.1, 0.2, 0.5, 1}; the inner loop T of PFG is chosen from {1, 2, 5, 10}, the hidden layer size is chosen from {32, 64, 128, 256, 512}. |
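
The Pseudocode and Experiment Setup rows reference Algorithm 1 (PFG) with a neural-network velocity field, an inner fitting loop T, and a diagonal preconditioner H = Ĥ^α. As a reading aid, here is a minimal PyTorch sketch of one PFG-style outer iteration, reconstructed only from the excerpts above: the exact loss form, the Hutchinson divergence estimator, and all names (`pfg_step`, `f_net`, `H_diag`, ...) are assumptions for illustration, not the authors' reference implementation.

```python
import torch

def pfg_step(particles, log_prob, f_net, f_opt, H_diag, step_size=1e-2, inner_T=5):
    """One sketched PFG outer iteration: fit the velocity field f, then move particles.

    particles : (n, d) tensor of current samples.
    log_prob  : callable giving log pi(x) up to a constant, one value per particle.
    f_net     : small network mapping R^d -> R^d (e.g. two-layer with Sigmoid, per the table).
    f_opt     : optimizer over f_net's parameters.
    H_diag    : (d,) diagonal preconditioner, e.g. hat_H ** alpha (assumption).
    """
    # Score of the target at the current particles: grad of log pi(x).
    x = particles.detach().requires_grad_(True)
    score = torch.autograd.grad(log_prob(x).sum(), x)[0].detach()

    for _ in range(inner_T):  # inner loop T, chosen from {1, 2, 5, 10} per the table
        f_opt.zero_grad()
        xi = particles.detach().requires_grad_(True)
        f = f_net(xi)
        # Hutchinson estimate of the divergence div f(x) (an assumption; the
        # paper may compute this term differently).
        v = torch.randn_like(xi)
        div = (torch.autograd.grad((f * v).sum(), xi, create_graph=True)[0] * v).sum(-1)
        # Maximize E[score^T f + div f] - 0.5 E[f^T H f], i.e. minimize its negative.
        loss = (-(score * f).sum(-1).mean()
                - div.mean()
                + 0.5 * (f.pow(2) * H_diag).sum(-1).mean())
        loss.backward()
        f_opt.step()

    # Move particles along the fitted field: x <- x + eta * f(x).
    with torch.no_grad():
        particles = particles + step_size * f_net(particles)
    return particles
```

Under the Experiment Setup row's choices, `H_diag` would be the elementwise power `hat_H ** alpha` of an approximated diagonal Hessian, with α ∈ {0, 0.1, 0.2, 0.5, 1} (α = 0 giving an identity preconditioner) and the hidden width of `f_net` drawn from {32, 64, 128, 256, 512}.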