Common Ground in Cooperative Communication
Authors: Xiaoran Hao, Yash Jhaveri, Patrick Shafto
NeurIPS 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Finally, we carry out a series of empirical simulations to support and elaborate on our theoretical results. |
| Researcher Affiliation | Academia | (1) Department of Math and Computer Science, Rutgers University Newark; (2) School of Mathematics, Institute for Advanced Study, Princeton |
| Pseudocode | No | The paper does not contain any pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not provide any concrete access to source code for the methodology. |
| Open Datasets | No | The paper describes generating data for its experiments (e.g., 'we assume that m = \|D\| and n = \|H\|, and we fix our common ground pair as follows: P_g = {g(d \| h)g(h)} with g(d \| h) = Cat(d \| θ(h)) and g(h) = Cat(h \| λ)' and 'g is a mixture of ℓ Gaussians'), but it does not specify the use of a publicly available or open dataset with access information. A sketch of this construction appears after the table. |
| Dataset Splits | No | The paper does not specify explicit train/validation/test splits or reference predefined splits for a dataset. It describes 'sampling initializations for a gradient descent based optimization scheme' and mentions '50 initializations' or '100 initializations' but not specific data splits for validation. |
| Hardware Specification | No | The paper does not provide specific hardware details (e.g., GPU/CPU models, memory) used for running its experiments. |
| Software Dependencies | No | The paper mentions 'PyTorch via Adam (Kingma and Ba, 2015)' but does not provide version numbers for PyTorch or any other software dependency. |
| Experiment Setup | Yes | We sample initializations for a gradient descent based optimization scheme of L_{γ,δ}, in PyTorch via Adam (Kingma and Ba, 2015), over (P_g, f) from a pair of probability distributions on the parameters θ and λ. ... we consider a multilayer perceptron-based form of common ground... We conduct experiments under various priors f and g and compare different initializations. We also analyze our model through variations on the coefficients γ and δ. A hedged sketch of this optimization loop appears after the table. |
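
As a concrete reading of the common ground pair quoted in the Open Datasets row, the following is a minimal sketch, not the authors' code. It assumes |D| = m and |H| = n with Categorical parameterizations g(d | h) = Cat(d | θ(h)) and g(h) = Cat(h | λ); the sizes used below are illustrative and not taken from the paper.

```python
import torch

# Illustrative sizes only; the paper fixes m = |D| and n = |H|, but the
# specific values used in its simulations are not reproduced here.
m, n = 10, 8

# theta: one Categorical over D per hypothesis h, i.e. g(d | h) = Cat(d | theta(h)).
theta = torch.softmax(torch.randn(n, m), dim=-1)  # rows sum to 1, shape (n, m)
# lam: the prior over hypotheses, g(h) = Cat(h | lam).
lam = torch.softmax(torch.randn(n), dim=-1)       # shape (n,)

# Joint common ground P_g(d, h) = g(d | h) g(h), one distribution over D x H.
P_g = theta * lam.unsqueeze(-1)                   # shape (n, m)
assert torch.isclose(P_g.sum(), torch.tensor(1.0))
```

Parameterizing θ and λ through a softmax keeps both on the probability simplex, which is convenient for the gradient-based optimization described in the Experiment Setup row.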
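
The Experiment Setup row describes restarting an Adam-based gradient descent from many random initializations (the paper mentions 50 or 100) with a multilayer perceptron-based form of common ground. The loop below is a hedged sketch of that procedure only: `loss_fn` is a placeholder, not the paper's loss L_{γ,δ} over (P_g, f), and the MLP widths, learning rate, and step count are assumptions.

```python
import torch
import torch.nn as nn

def make_mlp(n, m):
    # MLP mapping a one-hot hypothesis to Categorical logits over D;
    # the hidden width (64) is an assumption, not the paper's architecture.
    return nn.Sequential(nn.Linear(n, 64), nn.ReLU(), nn.Linear(64, m))

def loss_fn(P_g):
    # Placeholder objective (negative entropy of P_g); the paper's actual
    # loss L_{gamma,delta} over (P_g, f) is not reproduced here.
    return -(P_g * torch.log(P_g + 1e-12)).sum()

n, m, n_inits = 8, 10, 50  # 50 initializations, as quoted above
best_loss, best_state = float("inf"), None
for _ in range(n_inits):
    mlp = make_mlp(n, m)
    lam_logits = torch.randn(n, requires_grad=True)
    opt = torch.optim.Adam(list(mlp.parameters()) + [lam_logits], lr=1e-2)
    for _ in range(200):  # step count is an assumption
        opt.zero_grad()
        theta = torch.softmax(mlp(torch.eye(n)), dim=-1)  # g(d | h), shape (n, m)
        lam = torch.softmax(lam_logits, dim=-1)           # g(h), shape (n,)
        loss = loss_fn(theta * lam.unsqueeze(-1))         # P_g(d, h)
        loss.backward()
        opt.step()
    if loss.item() < best_loss:  # keep the best of the restarts
        best_loss, best_state = loss.item(), mlp.state_dict()
```

Keeping only the best run across restarts is one natural way to compare different initializations, as the quoted setup describes; the paper's actual selection criterion is not specified in the table above.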