Variance Reduction in Stochastic Particle-Optimization Sampling
Authors: Jianyi Zhang, Yang Zhao, Changyou Chen
ICML 2020
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Our theoretical results are verified by a number of experiments on both synthetic and real datasets. |
| Researcher Affiliation | Academia | Duke University; University at Buffalo, SUNY. |
| Pseudocode | Yes | Algorithm 1 SAGA-POS; Algorithm 2 SVRG-POS; Algorithm 3 SVRG-POS+ |
| Open Source Code | No | The paper does not provide a statement about releasing open-source code or a link to a code repository. |
| Open Datasets | Yes | We test the proposed algorithms for Bayesian-logistic-regression (BLR) on four publicly available datasets from the UCI machine learning repository: Australian (690-14), Pima (768-8), Diabetic (1151-20) and Susy (100000-18), where (N-d) means a dataset of N data points with dimensionality d. |
| Dataset Splits | No | The datasets are split into 80% training data and 20% testing data. There is no explicit mention of a separate validation set split. |
| Hardware Specification | No | No specific hardware details such as GPU models, CPU types, or memory specifications were mentioned for running experiments. |
| Software Dependencies | No | The paper does not provide specific software dependencies or version numbers (e.g., Python, PyTorch, TensorFlow versions, or library versions). |
| Experiment Setup | Yes | Optimized constant stepsizes are applied for each algorithm via grid search. ... The minibatch size is set to 15 for all experiments. ... averaging over 10 runs with 50 particles. |
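
The Pseudocode row above lists Algorithms 1–3 (SAGA-POS, SVRG-POS, SVRG-POS+), which plug variance-reduced gradient estimators into stochastic particle-optimization sampling. The sketch below is only an illustration of that idea, not the authors' code: it combines an SVRG-style gradient estimate with an SVGD-style kernelized drift plus injected Gaussian noise, assuming an RBF kernel; the snapshot schedule, SAGA gradient tables, and other details of Algorithms 1–3 are omitted.

```python
import numpy as np

def rbf_kernel(theta, h=1.0):
    """RBF kernel over M particles of shape (M, d).
    Returns K with K[i, j] = k(theta_i, theta_j) and grad_K with
    grad_K[j, i] = grad_{theta_j} k(theta_j, theta_i)."""
    diff = theta[:, None, :] - theta[None, :, :]      # diff[j, i] = theta_j - theta_i
    sq_dist = np.sum(diff ** 2, axis=-1)
    K = np.exp(-sq_dist / (2 * h ** 2))
    grad_K = -diff / (h ** 2) * K[:, :, None]
    return K, grad_K

def svrg_grad(theta, snapshot, full_grad_snapshot, grad_f, batch_idx):
    """SVRG-style variance-reduced estimate of the full-data gradient of the
    negative log-posterior at one particle, given a snapshot particle and the
    full gradient stored at that snapshot."""
    return grad_f(theta, batch_idx) - grad_f(snapshot, batch_idx) + full_grad_snapshot

def particle_update(theta, grads, step, beta=1.0):
    """One particle-optimization step: SVGD-style kernelized drift computed
    from the (variance-reduced) gradients, plus injected Gaussian noise."""
    M = theta.shape[0]
    K, grad_K = rbf_kernel(theta)
    drift = (K @ (-grads) + grad_K.sum(axis=0)) / M   # grads = grad of negative log-posterior
    noise = np.sqrt(2.0 * step / beta) * np.random.randn(*theta.shape)
    return theta + step * drift + noise
```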
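
The Open Datasets, Dataset Splits, and Experiment Setup rows quote the concrete settings (80/20 train/test split, minibatch size 15, 50 particles, 10 runs, grid-searched constant stepsizes). Below is a minimal configuration sketch for the Bayesian-logistic-regression experiment; the synthetic stand-in data, the stepsize grid, and the gradient helper are illustrative assumptions, not values or code from the paper.

```python
import numpy as np
from sklearn.model_selection import train_test_split

# Settings quoted in the table; the stepsize grid is a hypothetical example.
N_PARTICLES = 50                      # "50 particles"
MINIBATCH_SIZE = 15                   # "minibatch size is set to 15"
N_RUNS = 10                           # "averaging over 10 runs"
STEPSIZE_GRID = [1e-4, 1e-3, 1e-2]    # assumed grid for the constant-stepsize search

# Synthetic stand-in for the Australian dataset (690 points, 14 features);
# the real data would come from the UCI machine learning repository.
rng = np.random.default_rng(0)
X = rng.standard_normal((690, 14))
y = rng.integers(0, 2, size=690).astype(float)

# 80% training / 20% testing, as stated in the Dataset Splits row.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

def neg_log_posterior_grad(theta, batch_idx):
    """Minibatch gradient of the negative log-posterior for Bayesian logistic
    regression with a standard-normal prior, rescaled to the full training set."""
    Xb, yb = X_tr[batch_idx], y_tr[batch_idx]
    probs = 1.0 / (1.0 + np.exp(-Xb @ theta))
    grad_lik = Xb.T @ (probs - yb)
    return (len(X_tr) / len(batch_idx)) * grad_lik + theta
```

A full run along these lines would sweep STEPSIZE_GRID, draw minibatches of size MINIBATCH_SIZE, evolve N_PARTICLES particles with a variance-reduced update such as the sketch above, and average test performance over N_RUNS runs.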