Self-Supervised Learning with an Information Maximization Criterion
Authors: Serdar Ozsoy, Shadi Hamdan, Sercan Arik, Deniz Yuret, Alper Erdogan
NeurIPS 2022
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Numerical experiments demonstrate that CorInfoMax achieves better or competitive performance relative to state-of-the-art SSL approaches. Section 5 provides the numerical experiments illustrating the performance of the proposed approach. |
| Researcher Affiliation | Collaboration | ¹KUIS AI Center, Koç University, Turkey; ²EEE Department, Koç University, Turkey; ³CE Department, Koç University, Turkey; ⁴Google Cloud AI Research, Sunnyvale, CA |
| Pseudocode | Yes | (See Appendix C for CorInfoMax pseudocode.) |
| Open Source Code | Yes | CorInfoMax's source code is publicly available at https://github.com/serdarozsoy/corinfomax-ssl |
| Open Datasets | Yes | Datasets: We perform experiments on CIFAR-10, CIFAR-100 [40], Tiny ImageNet [41], COCO [42], ImageNet-100 and ImageNet-1K [43] datasets. |
| Dataset Splits | Yes | Then, we obtain the test accuracy results for the trained linear classifier based on the validation dataset. (A minimal sketch of this linear-evaluation protocol appears after the table.) |
| Hardware Specification | Yes | For ImageNet-100 and ImageNet-1K, we pretrain our model on up to 8 A100 Cloud GPUs. The remaining datasets are trained using a single T4 or V100 Cloud GPU. |
| Software Dependencies | No | The paper implicitly relies on deep learning software (e.g., for model training) but does not name specific libraries, frameworks, or languages, nor any version numbers. |
| Experiment Setup | Yes | For pretraining, we use 1000 epochs with a batch size of 512 for CIFAR datasets, and 800 epochs with a batch size of 1024 for Tiny ImageNet. [...] We use the SGD optimizer with a momentum of 0.9 and a weight decay of 1 × 10⁻⁴. The initial learning rate is 0.5 for CIFAR datasets and Tiny ImageNet, 1.0 for ImageNet-100, and 0.2 for ImageNet-1K. (See the optimizer sketch after the table.) |
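
As a concrete reading of the Experiment Setup row, here is a minimal PyTorch sketch of the quoted optimizer configuration. This is a sketch under stated assumptions, not the authors' implementation (that lives in the repository linked above): `make_optimizer` and the `CONFIGS` table are hypothetical names, and the ImageNet epoch/batch-size values elided by the quote ("[...]") are left unset rather than guessed.

```python
import torch
import torch.nn as nn

# Pretraining hyperparameters quoted in the Experiment Setup row.
# ImageNet-100/1K epochs and batch sizes are elided in the quote, so left as None.
CONFIGS = {
    "cifar10":       {"epochs": 1000, "batch_size": 512,  "lr": 0.5},
    "cifar100":      {"epochs": 1000, "batch_size": 512,  "lr": 0.5},
    "tiny_imagenet": {"epochs": 800,  "batch_size": 1024, "lr": 0.5},
    "imagenet100":   {"epochs": None, "batch_size": None, "lr": 1.0},
    "imagenet1k":    {"epochs": None, "batch_size": None, "lr": 0.2},
}

def make_optimizer(model: nn.Module, dataset: str) -> torch.optim.SGD:
    """SGD with momentum 0.9 and weight decay 1e-4, as quoted from the paper."""
    return torch.optim.SGD(
        model.parameters(),
        lr=CONFIGS[dataset]["lr"],
        momentum=0.9,
        weight_decay=1e-4,
    )
```

Usage would look like `opt = make_optimizer(backbone, "cifar10")` for a chosen encoder. The quote does not specify a learning-rate schedule, so none is shown here.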
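
The Dataset Splits row describes the standard SSL linear-evaluation protocol: a linear classifier is trained on frozen pretrained features and its test accuracy is reported on the validation split. Below is a minimal PyTorch sketch of that protocol, assuming a pretrained `backbone` that maps images to feature vectors; `feat_dim`, `num_classes`, the data loaders, and the probe's optimizer settings are hypothetical placeholders, not values from the paper.

```python
import torch
import torch.nn as nn

@torch.no_grad()
def embed(backbone: nn.Module, x: torch.Tensor) -> torch.Tensor:
    """Features from the frozen backbone; no gradients flow into it."""
    backbone.eval()
    return backbone(x)

def linear_eval(backbone, train_loader, val_loader, feat_dim, num_classes, epochs=100):
    clf = nn.Linear(feat_dim, num_classes)
    opt = torch.optim.SGD(clf.parameters(), lr=0.1, momentum=0.9)  # hypothetical probe settings
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for x, y in train_loader:
            loss = loss_fn(clf(embed(backbone, x)), y)
            opt.zero_grad()
            loss.backward()
            opt.step()
    # Test accuracy of the trained linear classifier on the validation split,
    # as in the quoted row.
    clf.eval()
    correct = total = 0
    with torch.no_grad():
        for x, y in val_loader:
            preds = clf(embed(backbone, x)).argmax(dim=1)
            correct += (preds == y).sum().item()
            total += y.numel()
    return correct / total
```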