Compositional languages emerge in a neural iterated learning model
Authors: Yi Ren, Shangmin Guo, Matthieu Labeau, Shay B. Cohen, Simon Kirby
ICLR 2020
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Our experiments confirm our analysis, and also demonstrate that the emerged languages largely improve the generalizing power of the neural agent communication. |
| Researcher Affiliation | Academia | 1 University of Edinburgh, United Kingdom; 2 University of Cambridge, United Kingdom; 3 LTCI, Télécom Paris, Institut Polytechnique de Paris, France |
| Pseudocode | Yes | Algorithm 1: The NIL algorithm (a toy sketch of its generational loop appears after the table). |
| Open Source Code | Yes | The code is available at https://github.com/Joshua-Ren/Neural_Iterated_Learning. |
| Open Datasets | No | The paper describes generating its own object space from Na attributes, each taking one of Nv values, rather than using a publicly available dataset (see the object-space sketch after the table). |
| Dataset Splits | Yes | "We measure this ability by looking at their validation game performance: we restrict the training examples to a limited number of objects (i.e., the training set), and look at how good the agents are at playing the game on the others (i.e., the validation set)." and "Valid set size: 0, 8, 16, 32" |
| Hardware Specification | No | The paper does not provide specific hardware details such as GPU/CPU models, processor types, or memory amounts used for experiments. |
| Software Dependencies | No | The paper does not provide specific software dependencies with version numbers (e.g., Python, PyTorch, CUDA versions). |
| Experiment Setup | Yes | Unless specifically stated, the experiments mentioned in this paper use the hyper-parameters given in Table 3. |
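For the "Open Datasets" and "Dataset Splits" rows: with Na attributes, each taking one of Nv values, the object space is the Cartesian product of the value ranges, i.e. Nv^Na objects in total. Below is a minimal sketch of that construction and of holding out a validation split; the function names and example sizes (2 attributes, 8 values, a held-out set of 16) are illustrative assumptions, not values taken from the paper's Table 3.

```python
from itertools import product
import random

def make_object_space(n_attributes, n_values):
    """Enumerate every object as a tuple of attribute values.
    The space contains n_values ** n_attributes objects."""
    return list(product(range(n_values), repeat=n_attributes))

def split_objects(objects, valid_size, seed=0):
    """Shuffle the object space and hold out `valid_size` objects
    for validation; the agents train on the remainder."""
    rng = random.Random(seed)
    shuffled = objects[:]
    rng.shuffle(shuffled)
    return shuffled[valid_size:], shuffled[:valid_size]

# Illustrative sizes only:
objects = make_object_space(n_attributes=2, n_values=8)  # 8**2 = 64 objects
train, valid = split_objects(objects, valid_size=16)     # 48 train / 16 valid
```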
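The "Pseudocode" row refers to the paper's Algorithm 1 (Neural Iterated Learning). As orientation only, the toy below mirrors its three-phase generational loop: a fresh speaker imitates the data transmitted by the previous generation (learning), the speaker and listener would then play the referential game and be updated with REINFORCE (interacting, stubbed here), and a sampled subset of the speaker's language seeds the next generation (transmitting). The dict-based "speaker" and every helper name are hypothetical stand-ins, not the authors' neural implementation; see their repository for the real code.

```python
import random

def nil_toy(objects, n_generations=5, data_size=8, msg_len=2, vocab=8, seed=0):
    """Toy stand-in for NIL's learning / interacting / transmitting loop.
    The real algorithm trains neural speaker and listener agents; here the
    'speaker' is a plain dict so the generational structure is runnable."""
    rng = random.Random(seed)

    def random_msg():
        return tuple(rng.randrange(vocab) for _ in range(msg_len))

    # Seed language for generation 0.
    data = {obj: random_msg() for obj in rng.sample(objects, data_size)}
    for _ in range(n_generations):
        # Learning phase: a fresh speaker imitates the transmitted data.
        speaker = dict(data)
        # Interacting phase (stubbed): the neural agents would play the
        # referential game here and be updated with REINFORCE; this toy
        # just gives the speaker a message for every remaining object.
        for obj in objects:
            speaker.setdefault(obj, random_msg())
        # Transmitting phase: sample part of the speaker's language
        # to become the next generation's training data.
        data = {obj: speaker[obj] for obj in rng.sample(objects, data_size)}
    return speaker

# Usage on a small two-attribute object space (illustrative sizes only):
language = nil_toy([(a, v) for a in range(8) for v in range(8)])
```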