Safe Deep Semi-Supervised Learning for Unseen-Class Unlabeled Data
Authors: Lan-Zhe Guo, Zhen-Yu Zhang, Yuan Jiang, Yu-Feng Li, Zhi-Hua Zhou
ICML 2020
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | To validate the effectiveness of the proposed method, we conduct experiments on two standard MNIST and CIFAR benchmarks for semi-supervised image classification using deep convolutional neural networks (CNNs). |
| Researcher Affiliation | Academia | National Key Laboratory for Novel Software Technology, Nanjing University, Nanjing 210023, China. |
| Pseudocode | Yes | Algorithm 1 The DS3L Learning Framework (a hedged sketch of this bi-level framework follows the table). |
| Open Source Code | Yes | The code for the work is readily available and can be freely downloaded at https://www.lamda.nju.edu.cn/code_DS3L.ashx. |
| Open Datasets | Yes | To validate the effectiveness of the proposed method, we conduct experiments on two standard MNIST and CIFAR benchmarks for semi-supervised image classification using deep convolutional neural networks (CNNs). |
| Dataset Splits | No | The paper describes the construction of labeled and unlabeled training data, and refers to a 'test' set, but does not explicitly provide information about a separate 'validation' dataset split or percentage. |
| Hardware Specification | No | The paper does not provide specific hardware details such as GPU models, CPU types, or cloud instance specifications used for running experiments. |
| Software Dependencies | No | The paper mentions popular deep learning frameworks such as PyTorch and TensorFlow but does not specify their version numbers or other ancillary software dependencies with versions. |
| Experiment Setup | Yes | The networks are trained using stochastic gradient descent (SGD) with a learning rate of 1e-3. We train the model for 200,000 updates with a batch size of 100. (A training-loop sketch with these settings follows the table.) |
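
The "Pseudocode" row refers to Algorithm 1, the DS3L learning framework, which learns per-instance weights on unlabeled data so that unseen-class examples cannot hurt performance on the labeled task. Below is a minimal PyTorch sketch of that bi-level reweighting idea, not the authors' released code: the toy model, the weight network `weight_net`, the perturbation-consistency SSL loss, and all shapes are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.func import functional_call

# Toy MNIST-shaped classifier; the paper uses deep CNNs.
model = nn.Sequential(nn.Flatten(), nn.Linear(784, 10))
# w(u; alpha): maps model outputs to a per-instance weight in (0, 1). Assumed form.
weight_net = nn.Sequential(nn.Linear(10, 16), nn.ReLU(),
                           nn.Linear(16, 1), nn.Sigmoid())

opt_model = torch.optim.SGD(model.parameters(), lr=1e-3)   # lr as reported
opt_weight = torch.optim.SGD(weight_net.parameters(), lr=1e-3)
inner_lr = 1e-3

def weighted_ssl_loss(params, x_l, y_l, x_u, w):
    """Labeled cross-entropy plus w-weighted perturbation-consistency loss."""
    logits_u = functional_call(model, params, (x_u,))
    logits_n = functional_call(model, params, (x_u + 0.1 * torch.randn_like(x_u),))
    consistency = F.mse_loss(logits_n.softmax(1), logits_u.softmax(1),
                             reduction='none').mean(1)
    labeled = F.cross_entropy(functional_call(model, params, (x_l,)), y_l)
    return labeled + (w * consistency).mean()

def meta_step(x_l, y_l, x_u):
    params = dict(model.named_parameters())
    # Per-instance weights from the current model's (detached) unlabeled outputs.
    w = weight_net(functional_call(model, params, (x_u,)).detach()).squeeze(1)

    # Inner step: one differentiable SGD update of the model on the weighted loss.
    inner = weighted_ssl_loss(params, x_l, y_l, x_u, w)
    grads = torch.autograd.grad(inner, list(params.values()), create_graph=True)
    updated = {k: p - inner_lr * g for (k, p), g in zip(params.items(), grads)}

    # Outer step: the updated model should do well on labeled data;
    # backpropagate through the inner update into weight_net.
    meta = F.cross_entropy(functional_call(model, updated, (x_l,)), y_l)
    opt_weight.zero_grad(); opt_model.zero_grad()
    meta.backward()
    opt_weight.step()

    # Finally, update the model itself with the (now fixed) weights.
    loss = weighted_ssl_loss(params, x_l, y_l, x_u, w.detach())
    opt_model.zero_grad(); loss.backward(); opt_model.step()
```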
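
And a usage sketch matching the reported experiment setup (batch size 100, 200,000 updates), reusing `meta_step` from above; the random tensors are stand-ins for real labeled/unlabeled MNIST loaders.

```python
def labeled_batch(batch_size=100):
    # Dummy labeled batch standing in for a real DataLoader.
    return torch.randn(batch_size, 784), torch.randint(0, 10, (batch_size,))

def unlabeled_batch(batch_size=100):
    # Dummy unlabeled batch, possibly containing unseen-class instances.
    return torch.randn(batch_size, 784)

for update in range(200_000):          # 200,000 updates, as reported
    x_l, y_l = labeled_batch()
    x_u = unlabeled_batch()
    meta_step(x_l, y_l, x_u)
```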