$H$-Consistency Bounds for Pairwise Misranking Loss Surrogates
Authors: Anqi Mao, Mehryar Mohri, Yutao Zhong
ICML 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | In this section, we provide empirical results for general pairwise ranking with abstention on the CIFAR-10 dataset (Krizhevsky, 2009). |
| Researcher Affiliation | Collaboration | ¹Courant Institute of Mathematical Sciences, New York, NY; ²Google Research, New York, NY. Correspondence to: Anqi Mao <aqmao@cims.nyu.edu>, Mehryar Mohri <mohri@google.com>, Yutao Zhong <yutao@cims.nyu.edu>. |
| Pseudocode | No | The paper presents mathematical derivations and theoretical concepts, but does not include any pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not contain any explicit statement about releasing source code or a link to a code repository for the methodology described. |
| Open Datasets | Yes | In this section, we provide empirical results for general pairwise ranking with abstention on the CIFAR-10 dataset (Krizhevsky, 2009). |
| Dataset Splits | No | The paper mentions that pairs are randomly sampled from CIFAR-10 for training and 10,000 pairs from the test data for evaluation, but it does not specify explicit training/validation/test splits (e.g., percentages or counts for each split). (A hypothetical pair-sampling sketch appears after this table.) |
| Hardware Specification | No | The paper mentions using "ResNet-34" but does not provide any specific details about the hardware (e.g., GPU model, CPU, memory) used for the experiments. |
| Software Dependencies | No | The paper mentions models (ResNet-34), optimizers (Stochastic Gradient Descent with Nesterov momentum), and learning rate schedules (cosine decay), but it does not provide specific software versions for libraries or frameworks (e.g., PyTorch 1.9, TensorFlow 2.x). |
| Experiment Setup | Yes | We set the batch size, weight decay, and initial learning rate to 1,024, $10^{-4}$, and 0.1, respectively. We adopted the cosine decay learning rate schedule (Loshchilov & Hutter, 2016) for a total of 200 epochs. (A hedged setup sketch appears after this table.) |
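The Dataset Splits row notes that training pairs are drawn at random from CIFAR-10 and that 10,000 pairs are drawn from the test data, without further detail. Below is a minimal, hypothetical sketch of one way such pairs could be constructed; the `sample_pairs` helper and the uniform-sampling choice are assumptions, not the authors' procedure.

```python
import numpy as np

def sample_pairs(num_examples: int, num_pairs: int, seed: int = 0) -> np.ndarray:
    """Draw random index pairs (i, j) with i != j from a dataset of
    num_examples items. Uniform sampling is an assumption; the paper
    only states that pairs are randomly sampled."""
    rng = np.random.default_rng(seed)
    pairs = rng.integers(0, num_examples, size=(num_pairs, 2))
    # Resample the second index wherever a pair is degenerate (i == j).
    while np.any(mask := pairs[:, 0] == pairs[:, 1]):
        pairs[mask, 1] = rng.integers(0, num_examples, size=int(mask.sum()))
    return pairs

# Example: 10,000 evaluation pairs over the 10,000-image CIFAR-10 test set.
test_pairs = sample_pairs(num_examples=10_000, num_pairs=10_000)
```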
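The Software Dependencies and Experiment Setup rows together fix the model, optimizer family, schedule, and hyperparameters, but not a framework. The sketch below restates that configuration in PyTorch as an assumption; the momentum coefficient of 0.9 is likewise assumed, since the paper only says Nesterov momentum is used.

```python
import torch
import torchvision

# ResNet-34 backbone, as reported; CIFAR-10 has 10 classes.
model = torchvision.models.resnet34(num_classes=10)

# Reported: batch size 1,024, weight decay 1e-4, initial learning rate 0.1,
# SGD with Nesterov momentum. The momentum coefficient 0.9 is an assumption.
optimizer = torch.optim.SGD(
    model.parameters(),
    lr=0.1,
    momentum=0.9,
    nesterov=True,
    weight_decay=1e-4,
)

# Reported: cosine decay learning-rate schedule (Loshchilov & Hutter, 2016)
# over a total of 200 epochs.
scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=200)

for epoch in range(200):
    # ... train for one epoch on randomly sampled pairs (batch size 1,024) ...
    scheduler.step()
```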