Notice: The reproducibility variables underlying each score are classified using an automated LLM-based pipeline, validated against a manually labeled dataset. LLM-based classification introduces uncertainty and potential bias; scores should be interpreted as estimates. Full accuracy metrics and methodology are described in [1].
Information Processing Equalities and the Information–Risk Bridge
Authors: Robert C. Williamson, Zac Cranko
JMLR 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | We introduce two new classes of measures of information for statistical experiments which generalise and subsume φ-divergences, integral probability metrics, N-distances (MMD), and (f, Γ) divergences between two or more distributions. This enables us to derive a simple geometrical relationship between measures of information and the Bayes risk of a statistical decision problem, thus extending the variational φ-divergence representation to multiple distributions in an entirely symmetric manner. The new families of divergence are closed under the action of Markov operators which yields an information processing equality which is a refinement and generalisation of the classical information processing inequality. (Standard definitions of the objects named here are recapped after the table.) |
| Researcher Affiliation | Academia | Robert C. Williamson, University of Tübingen and Tübingen AI Center, Germany; Zac Cranko, Sydney, Australia |
| Pseudocode | No | The paper contains extensive mathematical derivations, definitions, theorems, and proofs (e.g., Proposition 4, Lemma 10, Theorem 23), but no sections explicitly labeled as 'Pseudocode' or 'Algorithm,' nor any structured algorithmic steps. |
| Open Source Code | No | The paper does not contain any explicit statements regarding the release of source code for the methodology described, nor does it provide links to code repositories. The license information provided refers to the paper itself, not associated software. |
| Open Datasets | No | The paper is a theoretical work that defines and explores properties of information measures and their relationship to Bayes risk. It does not conduct empirical studies that would require specific datasets. Concepts like 'φ-divergence' and 'Kullback-Leibler divergence' are discussed as mathematical tools rather than being applied to specific datasets for experimental evaluation. |
| Dataset Splits | No | The paper does not describe any experimental evaluation using datasets; therefore, there is no mention of dataset splits such as training, validation, or test sets. |
| Hardware Specification | No | This theoretical paper does not describe any computational experiments or implementations; consequently, there are no details provided regarding hardware specifications such as GPU models, CPU types, or computational resources used. |
| Software Dependencies | No | Because this is a theoretical paper focused on mathematical frameworks, it mentions no specific software dependencies with version numbers (e.g., programming languages, libraries, or solvers) relevant to experimental reproducibility. |
| Experiment Setup | No | The paper is purely theoretical, introducing and analyzing mathematical concepts and relationships. It does not describe any empirical experimental setup, hyperparameter configurations, or training details for models. |
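For context, the abstract quoted in the Research Type row references several standard objects. The following is a brief recap in generic notation (not the paper's own symbols), stated under the usual regularity conditions; it sketches the classical background, not the paper's new constructions.

```latex
% Standard definitions, generic notation; not taken from the paper itself.

% A \varphi-divergence between distributions P and Q,
% for convex \varphi with \varphi(1) = 0:
\[
  D_\varphi(P \,\|\, Q)
    = \int \varphi\!\left(\frac{\mathrm{d}P}{\mathrm{d}Q}\right) \mathrm{d}Q .
\]

% Its variational representation (\varphi^* the convex conjugate of \varphi),
% which the paper extends symmetrically to multiple distributions:
\[
  D_\varphi(P \,\|\, Q)
    = \sup_{g} \; \mathbb{E}_P[g(X)] - \mathbb{E}_Q[\varphi^*(g(X))] .
\]

% An integral probability metric over a function class \mathcal{F};
% MMD is the case where \mathcal{F} is the unit ball of an RKHS:
\[
  \gamma_{\mathcal{F}}(P, Q)
    = \sup_{f \in \mathcal{F}} \bigl| \mathbb{E}_P[f(X)] - \mathbb{E}_Q[f(X)] \bigr| .
\]

% The classical information processing (data processing) inequality:
% for any Markov operator (channel) K, post-processing cannot
% increase divergence:
\[
  D_\varphi(KP \,\|\, KQ) \le D_\varphi(P \,\|\, Q) .
\]
```

Per the abstract, the paper's new divergence families are closed under the action of Markov operators, which refines the final inequality above into an information processing equality.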