Optimal Teaching for Limited-Capacity Human Learners
Authors: Kaustubh R Patil, Xiaojin Zhu, Łukasz Kopeć, Bradley C. Love
NeurIPS 2014
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Using the machine teacher, we derive a variety of optimal training sets for low- and high-capacity GCM learners. We then evaluate how humans perform when trained on these recommended items (i.e., training sets). The main predictions are that the machine teacher will idealize training sets and that humans will perform better on optimal training sets calculated using the low-capacity GCM variant. In what follows, we first specify parameter values for the GCM variants, present the optimal teaching sets we calculate, and then discuss human experiments. |
| Researcher Affiliation | Academia | Kaustubh Raosaheb Patil (Affective Brain Lab, UCL & MIT Sloan Neuroeconomics Lab, kaustubh.patil@gmail.com); Xiaojin Zhu (Department of Computer Sciences, University of Wisconsin-Madison, jerryzhu@cs.wisc.edu); Łukasz Kopeć (Experimental Psychology, University College London, l.kopec.12@ucl.ac.uk); Bradley C. Love (Experimental Psychology, University College London, b.love@ucl.ac.uk) |
| Pseudocode | No | The paper provides mathematical formulations and descriptions of the methods, but it does not include any clearly labeled "Pseudocode" or "Algorithm" blocks. |
| Open Source Code | No | The paper does not contain any statement about releasing source code for the described methodology, nor does it provide a link to a code repository. |
| Open Datasets | Yes | The conditional distribution p(y = 1 | x = z_j) for j = 1, …, 60 was adapted from a related study [3]. |
| Dataset Splits | No | The paper mentions 'training sets' and 'test sets' but does not specify any training/validation/test dataset splits, exact percentages, or sample counts for reproduction. |
| Hardware Specification | No | The paper does not explicitly state the specific hardware (e.g., GPU models, CPU types) used to run the computational experiments or analyses. |
| Software Dependencies | No | The paper does not list any specific software dependencies (e.g., libraries, frameworks, or solvers) with their version numbers. |
| Experiment Setup | Yes | Parameters were set for the low-capacity GCM model by fitting the behavioral data from Experiment 2 of Giguère and Love [3]. ... {b̂, ĉ, γ̂} = {5.066, 2.964, 4.798}. We define a high-capacity GCM by only changing the γ̂ parameter, which is set an order of magnitude higher at γ̂ = 47.98. All training sets had size n = 20... (An illustrative GCM sketch using these values follows below this table.) |
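
To make the reported parameter values concrete, below is a minimal, hypothetical Python sketch of a GCM (generalized context model) response rule evaluated with the fitted values quoted above. The exact functional form used in the paper (in particular where the bias b enters, and how the low-capacity memory limit is implemented) is not reproduced here; the parameterization below is an assumption for illustration only, and the stimulus axis, category boundary, and helper names are invented.

```python
import numpy as np

# Fitted GCM parameters reported in the paper (low-capacity variant).
# The high-capacity variant differs only in gamma, set an order of magnitude higher.
B_HAT, C_HAT, GAMMA_LOW = 5.066, 2.964, 4.798
GAMMA_HIGH = 47.98


def gcm_prob_y1(x, train_x, train_y, b=B_HAT, c=C_HAT, gamma=GAMMA_LOW):
    """Probability that an exemplar-similarity (GCM-style) learner labels x as category 1.

    Assumed form (not necessarily the paper's exact parameterization):
      s(x, x_i) = exp(-c * |x - x_i|)                 similarity on a 1-D stimulus axis
      p(y=1|x)  = S1^gamma / (S1^gamma + (b*S0)^gamma) with S_k = sum of s over category-k exemplars
    """
    train_x = np.asarray(train_x, dtype=float)
    train_y = np.asarray(train_y, dtype=int)
    sim = np.exp(-c * np.abs(train_x - x))   # similarity of x to each stored training exemplar
    s1 = sim[train_y == 1].sum()             # summed evidence for category 1
    s0 = sim[train_y == 0].sum()             # summed evidence for category 0
    num = s1 ** gamma
    den = num + (b * s0) ** gamma
    return num / den if den > 0 else 0.5     # guess when there is no evidence either way


# Toy usage with a training set of size n = 20, matching the size reported in the paper.
rng = np.random.default_rng(0)
xs = rng.uniform(0.0, 1.0, size=20)
ys = (xs > 0.5).astype(int)                  # hypothetical category boundary at 0.5
print(gcm_prob_y1(0.7, xs, ys, gamma=GAMMA_LOW))   # low-capacity variant: softer response
print(gcm_prob_y1(0.7, xs, ys, gamma=GAMMA_HIGH))  # high-capacity variant: more deterministic
```

The only difference between the two variants in this sketch is the response-determinism parameter γ, mirroring the paper's statement that the high-capacity GCM changes γ̂ alone; the paper's low-capacity model additionally limits which exemplars the learner retains, which this sketch does not attempt to reproduce.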