Pointwise Bounds for Distribution Estimation under Communication Constraints
Authors: Wei-Ning Chen, Peter Kairouz, Ayfer Özgür
NeurIPS 2021
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | This dimension-independent convergence is also empirically verified by our experiments (see Section 3 for more details). In Figure 1, we empirically compare our scheme with [4] (which is globally minimax optimal). |
| Researcher Affiliation | Collaboration | Wei-Ning Chen (Department of Electrical Engineering, Stanford University, wnchen@stanford.edu); Peter Kairouz (Google Research, kairouz@google.com); Ayfer Özgür (Department of Electrical Engineering, Stanford University, aozgur@stanford.edu) |
| Pseudocode | Yes | Algorithm 1: uniform grouping [4] (at client i); Algorithm 2: localize-and-refine (at client i). A hedged code sketch in the spirit of Algorithm 1 follows the table. |
| Open Source Code | No | The paper does not provide an explicit statement or link for open-source code related to the methodology described. |
| Open Datasets | No | The paper's experiments draw synthetic samples from s-sparse, truncated geometric, and truncated Zipf distributions; these are parametric models for generating data, not named, publicly accessible datasets with concrete access information. A sampling sketch follows the table. |
| Dataset Splits | No | The paper focuses on theoretical bounds and simulated data from specific distributions, and therefore does not discuss training, validation, or test dataset splits in the context of empirical experiments. |
| Hardware Specification | No | The paper does not provide any specific details about the hardware used for running its experiments or simulations. |
| Software Dependencies | No | The paper does not provide specific software dependencies or version numbers needed to replicate the experimental results. |
| Experiment Setup | No | The paper describes algorithms but does not provide specific experimental setup details such as hyperparameters, learning rates, or batch sizes for training. |
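
Since the paper releases no code (see the Open Source Code row) and gives Algorithms 1 and 2 only as pseudocode, the following is a minimal Python sketch of a grouping-style ("simulate-and-infer") estimator under a b-bit-per-client constraint, in the spirit of Algorithm 1. The partition into blocks of size 2^b − 1, the message format, and the function name `uniform_grouping_estimate` are illustrative assumptions, not the authors' reference implementation.

```python
import numpy as np

def uniform_grouping_estimate(samples, d, b):
    """Sketch of a grouping-style estimator under a b-bit-per-client constraint.

    The support {0, ..., d-1} is split into blocks of 2**b - 1 symbols
    (one of the 2**b messages is reserved to mean "my sample lies outside
    my block"). Client i is assigned block i mod num_blocks and reports
    the within-block index of its sample, or the reserved symbol.
    Illustrative reconstruction only -- not the paper's reference code.
    """
    block_size = 2**b - 1
    num_blocks = -(-d // block_size)            # ceil(d / block_size)
    counts = np.zeros(d)
    clients_per_block = np.zeros(num_blocks)

    for i, x in enumerate(samples):
        g = i % num_blocks                      # client i's assigned block
        lo, hi = g * block_size, min((g + 1) * block_size, d)
        clients_per_block[g] += 1
        if lo <= x < hi:                        # b-bit message: index of x in block
            counts[x] += 1                      # server tallies the reported symbol

    # Per-coordinate estimate: empirical frequency among the clients
    # that were assigned the block containing that coordinate.
    p_hat = np.zeros(d)
    for g in range(num_blocks):
        lo, hi = g * block_size, min((g + 1) * block_size, d)
        if clients_per_block[g] > 0:
            p_hat[lo:hi] = counts[lo:hi] / clients_per_block[g]
    return p_hat
```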
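
The Open Datasets row notes that the experiments use synthetic draws rather than public datasets. The sketch below shows one plausible way to generate such simulated data; the truncation convention (renormalizing over the first d symbols) and the parameter names `lam` and `alpha` are assumptions, since the paper's exact sampling code is not provided.

```python
import numpy as np

def truncated_geometric(d, lam):
    """pmf on {0, ..., d-1} with p_j proportional to lam**j, 0 < lam < 1."""
    p = lam ** np.arange(d)
    return p / p.sum()

def truncated_zipf(d, alpha):
    """pmf on {0, ..., d-1} with p_j proportional to (j + 1)**(-alpha)."""
    p = np.arange(1, d + 1, dtype=float) ** (-alpha)
    return p / p.sum()

rng = np.random.default_rng(0)
d, n = 1000, 100_000                     # illustrative values, not the paper's
p = truncated_zipf(d, alpha=1.0)
samples = rng.choice(d, size=n, p=p)     # n i.i.d. client observations
```

Fed into the grouping sketch above, `p_hat = uniform_grouping_estimate(samples, d, b=4)` yields a communication-constrained estimate of `p`; all parameter values here (d, n, alpha, b) are illustrative, not the paper's settings.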