Semi-supervised Learning with Support Isolation by Small-Paced Self-Training
Authors: Zheng Xie, Hui Sun, Ming Li
AAAI 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Experiments on both benchmark and pneumonia diagnosis tasks show that our method is effective. |
| Researcher Affiliation | Academia | Zheng Xie, Hui Sun, Ming Li; National Key Laboratory for Novel Software Technology, Nanjing University, Nanjing 210023, China; {xiez,sunh,lim}@lamda.nju.edu.cn |
| Pseudocode | Yes | Algorithm 1: Small-Paced Self-Training Framework ... Algorithm 2: Small-Paced Self-Training Algorithm |
| Open Source Code | No | The paper does not provide any statements about releasing source code or links to a code repository. |
| Open Datasets | Yes | We compare the methods on the commonly used CIFAR10 and CIFAR100 datasets (Krizhevsky 2009) and a real-world X-ray pneumonia identification task (Kermany et al. 2018) |
| Dataset Splits | No | The paper mentions using well-known datasets but does not explicitly provide the training, validation, or test split percentages or sample counts used for its experiments. |
| Hardware Specification | No | The paper does not provide specific details about the hardware (e.g., GPU models, CPU types, memory) used to run the experiments. |
| Software Dependencies | No | The paper mentions using ResNet-50 as a backbone but does not specify any software dependencies with version numbers. |
| Experiment Setup | Yes | For linear base models f(x) = w^T x + b, we optimize ramp loss with ℓ2 regularization... For deep models, we use Large Margin Deep Networks (Elsayed et al. 2018) as the base models... Generally, for datasets with normalized features, we search δ in [0.1, 0.5]... All methods adopt ResNet-50 pre-trained on ImageNet as the backbone... those predictions with large margin \|f^(t)(x_U)\| > θ will be accepted as pseudo-labeled data D_U^(t); if \|D_U^(t)\| = 0, then decrease θ (sketched in code below the table). |
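
The acceptance rule quoted in the Experiment Setup row is the part of the setup concrete enough to illustrate: predictions on unlabeled data are accepted as pseudo-labels only when their margin exceeds θ, and θ is decreased whenever the accepted set comes back empty. Since the paper releases no code, the following is a minimal sketch under stated assumptions: the function `accept_pseudo_labels`, the geometric `decay` factor, the `min_theta` floor, and the logistic-regression stand-in for the base model (the paper optimizes a ramp loss, i.e., a clipped hinge, with ℓ2 regularization) are all illustrative, not the authors' implementation.

```python
# Hedged sketch of the margin-thresholded pseudo-labeling step quoted in
# the Experiment Setup row. The paper releases no code; all names and the
# decay schedule here are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression


def accept_pseudo_labels(margins, theta, decay=0.5, min_theta=1e-3):
    """Accept unlabeled points whose margin |f^(t)(x_U)| exceeds theta.

    If no point qualifies (|D_U^(t)| = 0), decrease theta, as the quoted
    setup prescribes, until something is accepted or theta hits a floor.
    """
    while theta > min_theta:
        idx = np.flatnonzero(np.abs(margins) > theta)
        if idx.size > 0:
            return idx, theta
        theta *= decay  # |D_U^(t)| = 0 -> decrease theta and retry
    return np.flatnonzero(np.abs(margins) > min_theta), min_theta


# Toy usage: one self-training round with a linear stand-in base model.
rng = np.random.default_rng(0)
X_l = rng.normal(size=(40, 5))
y_l = (X_l[:, 0] > 0).astype(int)          # labeled data
X_u = rng.normal(size=(200, 5))            # unlabeled pool

f = LogisticRegression().fit(X_l, y_l)
margins = f.decision_function(X_u)         # signed margins w^T x_U + b
idx, theta = accept_pseudo_labels(margins, theta=2.0)
pseudo_labels = (margins[idx] > 0).astype(int)  # pseudo-labeled set D_U^(t)
print(f"accepted {idx.size} of {len(X_u)} points at theta = {theta:.3f}")
```

Geometric decay is one arbitrary reading of "decrease θ"; the paper's Algorithm 2 may use a different schedule or step size.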