Tight Rates in Supervised Outlier Transfer Learning
Authors: Mohammadreza Mousavi Kalan, Samory Kpotufe
ICLR 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | In this work, we adopt the traditional framework of Neyman-Pearson classification, which formalizes supervised outlier detection, with the added assumption that one has access to some related but imperfect outlier data. Our main results are as follows: We first determine the information-theoretic limits of the problem under a measure of discrepancy that extends some existing notions from traditional balanced classification; interestingly, unlike in balanced classification, seemingly very dissimilar sources can provide much information about a target, thus resulting in fast transfer. We then show that, in principle, these information-theoretic limits are achievable by adaptive procedures, i.e., procedures with no a priori information on the discrepancy between source and target outlier distributions. |
| Researcher Affiliation | Academia | Mohammadreza M. Kalan, Statistics, Columbia University (mm6244@columbia.edu); Samory Kpotufe, Statistics, Columbia University (samory@columbia.edu) |
| Pseudocode | No | The paper describes a "Procedure (4.4)" in Section 4.8, but it is described in prose and mathematical notation rather than a structured pseudocode or algorithm block. |
| Open Source Code | No | The paper is theoretical and does not mention or provide access to any open-source code for the described methodology. |
| Open Datasets | No | The paper focuses on theoretical analysis using abstract distributions (e.g., µ0, µ1,S, µ1,T) and does not refer to specific, real-world datasets or their public availability. |
| Dataset Splits | No | The paper is theoretical and does not describe experiments with specific datasets that would require training, validation, or test splits. It focuses on theoretical bounds and achievability. |
| Hardware Specification | No | The paper is theoretical and focuses on mathematical derivations and proofs. It does not mention any hardware used for running experiments. |
| Software Dependencies | No | The paper is theoretical and does not mention any software dependencies or versions used for conducting experiments or implementing the described methods. |
| Experiment Setup | No | The paper is theoretical and does not describe an experimental setup with specific hyperparameters, training configurations, or system-level settings, as it does not conduct empirical experiments. |
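For context on the framework referenced in the Research Type row: Neyman-Pearson classification seeks a classifier that minimizes the miss rate (type-II error) under the outlier distribution subject to a false-alarm (type-I error) constraint under the nominal distribution µ0. The sketch below is an illustrative toy in that spirit, not the paper's adaptive procedure; the Gaussian distributions and the identity score function are assumptions made purely for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy Neyman-Pearson setup: control the false-alarm rate (type-I error)
# under the nominal distribution mu_0 at level alpha, then measure the
# miss rate (type-II error) under an outlier distribution mu_1.
# The two Gaussians below are illustrative assumptions, not from the paper.
alpha = 0.05
nominal = rng.normal(0.0, 1.0, size=10_000)   # samples from mu_0
outlier = rng.normal(2.5, 1.0, size=10_000)   # samples from a target mu_1,T

# Flag x as an outlier when x > threshold. Choosing the threshold as the
# empirical (1 - alpha) quantile of nominal samples makes the empirical
# type-I error approximately alpha.
threshold = np.quantile(nominal, 1.0 - alpha)

type_I = np.mean(nominal > threshold)    # false-alarm rate, close to alpha
type_II = np.mean(outlier <= threshold)  # miss rate, the quantity to minimize

print(f"threshold={threshold:.3f}  type_I={type_I:.4f}  type_II={type_II:.4f}")
```

In the paper's transfer setting, the learner additionally sees samples from a source outlier distribution µ1,S and must do well against µ1,T without knowing the discrepancy between the two in advance; this toy uses target samples directly only to illustrate the constrained objective.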