Beyond probability partitions: Calibrating neural networks with semantic aware grouping
Authors: Jia-Qi Yang, De-Chuan Zhan, Le Gan
NeurIPS 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Experimental results demonstrate that our approach achieves significant performance improvements across multiple datasets and network architectures, thus highlighting the importance of the partitioning function for calibration. |
| Researcher Affiliation | Academia | Jia-Qi Yang, De-Chuan Zhan, Le Gan; State Key Laboratory for Novel Software Technology, Nanjing University, Nanjing, 210023, China |
| Pseudocode | Yes | Algorithm 1: Train group calibration with temperature scaling (GC+TS). A hedged sketch of such a calibrator follows the table. |
| Open Source Code | Yes | Code and Appendix are available at https://github.com/ThyrixYang/group_calibration |
| Open Datasets | Yes | To evaluate the performance of our method under various circumstances, we selected three datasets: CIFAR10, CIFAR100 [31], and ImageNet [1]. |
| Dataset Splits | Yes | We randomly partitioned a validation set Dval from the standard training set: CIFAR10 and CIFAR100 adopted 10% of the data for validation, while ImageNet utilized 5%. (A split sketch follows the table.) |
| Hardware Specification | No | The paper does not provide specific details about the hardware used for the experiments (e.g., GPU models, CPU types). |
| Software Dependencies | No | The paper does not provide specific version numbers for software dependencies used in the experiments. |
| Experiment Setup | Yes | The hyperparameters of the comparative methods were tuned with 5-fold cross-validation on the CIFAR10-ResNet152 setting, following the corresponding literature. We fixed the number of groups at K = 2 and the number of partitions at U = 20, although 20 is not necessarily the optimal value. The regularization strength was set to λ = 0.1, tuned in the same way as for the comparative methods. (An ensembling sketch follows the table.) |
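
The Pseudocode row refers to the paper's Algorithm 1 (GC+TS). The sketch below is a rough illustration only: the class name `GroupTemperatureScaler`, the soft-assignment partition head, and the fitting loop are assumptions, not the authors' implementation (their actual code is in the linked repository).

```python
# Illustrative sketch only; not the authors' implementation.
import torch
import torch.nn.functional as F

class GroupTemperatureScaler(torch.nn.Module):
    """Hypothetical group calibration head: a learned partition assigns each
    sample a soft group membership, and each group gets its own temperature."""

    def __init__(self, feature_dim: int, num_groups: int = 2):
        super().__init__()
        self.partition = torch.nn.Linear(feature_dim, num_groups)    # grouping function
        self.log_temps = torch.nn.Parameter(torch.zeros(num_groups)) # T = 1 at init

    def forward(self, logits, features):
        group_probs = F.softmax(self.partition(features), dim=-1)  # (N, K)
        temps = self.log_temps.exp()                               # (K,)
        scaled = logits.unsqueeze(1) / temps.view(1, -1, 1)        # (N, K, C)
        probs = F.softmax(scaled, dim=-1)
        # Mix group-wise calibrated probabilities by soft membership.
        return (group_probs.unsqueeze(-1) * probs).sum(dim=1)      # (N, C)

def fit_on_validation(scaler, logits, features, labels, lam=0.1, steps=200):
    """Fit on held-out validation logits by NLL. The L2 form of the
    regularizer is a guess; only its strength (lambda = 0.1) is from the paper."""
    opt = torch.optim.Adam(scaler.parameters(), lr=1e-2)
    for _ in range(steps):
        opt.zero_grad()
        mixed = scaler(logits, features)
        loss = F.nll_loss(mixed.clamp_min(1e-12).log(), labels)
        loss = loss + lam * scaler.partition.weight.pow(2).mean()
        loss.backward()
        opt.step()
    return scaler
```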
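The Dataset Splits row states that the validation set is carved out of the standard training set (10% for CIFAR10/CIFAR100, 5% for ImageNet). A minimal sketch of such a split with `torch.utils.data.random_split`; the fixed seed is an assumption, since the paper only specifies the fractions.

```python
import torch
from torch.utils.data import random_split

def split_train_val(train_set, val_fraction, seed=0):
    # seed=0 is an assumption; the paper only states the split fractions.
    n_val = int(len(train_set) * val_fraction)
    lengths = [len(train_set) - n_val, n_val]
    return random_split(train_set, lengths,
                        generator=torch.Generator().manual_seed(seed))

# Per the paper: 10% validation for CIFAR10/CIFAR100, 5% for ImageNet, e.g.:
# train_part, val_part = split_train_val(cifar10_train, val_fraction=0.10)
```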
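The Experiment Setup row fixes K = 2 groups, U = 20 partitions, and λ = 0.1. One plausible reading, sketched below as an assumption rather than the paper's exact procedure, is that predictions from U independently fitted partition functions are averaged:

```python
import torch

def ensemble_over_partitions(scalers, logits, features):
    # `scalers` would hold U = 20 independently fitted GroupTemperatureScaler
    # instances (K = 2 groups each, as in the sketch above); averaging their
    # outputs is an assumed combination rule, not the authors' stated one.
    probs = torch.stack([s(logits, features) for s in scalers])  # (U, N, C)
    return probs.mean(dim=0)                                     # (N, C)
```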