Collaborative Learning for Deep Neural Networks
Authors: Guocong Song, Wei Chai
NeurIPS 2018
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | The empirical results on CIFAR and ImageNet datasets demonstrate that deep neural networks learned as a group in a collaborative way significantly reduce the generalization error and increase the robustness to label noise. |
| Researcher Affiliation | Industry | Guocong Song, Playground Global, Palo Alto, CA 94306, songgc@gmail.com; Wei Chai, Google, Mountain View, CA 94043, chaiwei@google.com |
| Pseudocode | No | The paper describes procedures and optimizations in paragraph form but does not include any structured pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not contain any statements about releasing code for the described methodology or provide a link to a code repository. |
| Open Datasets | Yes | The two CIFAR datasets, CIFAR-10 and CIFAR-100, consist of colored natural images with 32x32 pixels [13]... The ILSVRC 2012 classification dataset consists of 1.2 million images for training, and 50,000 for validation [6]. |
| Dataset Splits | Yes | The ILSVRC 2012 classification dataset consists of 1.2 million images for training, and 50,000 for validation [6]. ... Table 4: Validation errors of ResNet-50 on ImageNet. |
| Hardware Specification | No | The paper mentions 'graphics processing unit (GPU)' generally and discusses 'GPU memory consumption' but does not specify a particular GPU model (e.g., an NVIDIA A100) or other hardware details such as CPU type or memory size. |
| Software Dependencies | No | The paper states 'All experiments are conducted with TensorFlow [1].' but does not give a version number for TensorFlow or any other software dependency. |
| Experiment Setup | Yes | We use T = 2 and β = 0.5 for all experiments. ... Refer to Section 2 in Supplementary material for the detailed training setup. |
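The experiment setup row cites two hyperparameters of the paper's collaborative objective: a softmax temperature T = 2 and a balancing weight β = 0.5. The sketch below illustrates one plausible form of that objective for a two-head group, mixing hard-label cross-entropy with a temperature-softened KL term toward the other head's predictions. The function name, the pairwise (rather than group-consensus) soft target, and the (1−β)/β mixing are illustrative assumptions, not the paper's exact formulation.

```python
# Minimal sketch of a two-head collaborative loss (TensorFlow 2).
# Assumptions: pairwise soft targets and (1 - beta)/beta mixing; the
# paper's group-consensus term may be weighted and aggregated differently.
import tensorflow as tf

def collaborative_loss(logits_a, logits_b, labels, T=2.0, beta=0.5):
    """Per-example loss for head A when trained alongside head B."""
    # Standard cross-entropy against the ground-truth labels.
    hard = tf.nn.sparse_softmax_cross_entropy_with_logits(
        labels=labels, logits=logits_a)

    # Temperature-softened teacher distribution from the other head;
    # stop_gradient makes head B a fixed target for this step.
    teacher = tf.stop_gradient(tf.nn.softmax(logits_b / T))
    log_student = tf.nn.log_softmax(logits_a / T)

    # KL(teacher || student), scaled by T^2 as in standard distillation.
    kl = tf.reduce_sum(
        teacher * (tf.math.log(teacher + 1e-8) - log_student), axis=-1)
    soft = (T ** 2) * kl

    return (1.0 - beta) * hard + beta * soft

# Usage on a toy batch of 4 examples over 10 classes.
logits_a = tf.random.normal([4, 10])
logits_b = tf.random.normal([4, 10])
labels = tf.constant([3, 1, 7, 0], dtype=tf.int32)
loss = tf.reduce_mean(collaborative_loss(logits_a, logits_b, labels))
```

In the paper, a group can contain more than two heads and the soft target is derived from the group's aggregated predictions; symmetrizing the term above, by computing it for each head against the others, approximates that setup in the two-head case.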