Statistical Foundations of Prior-Data Fitted Networks
Author: Thomas Nagler
ICML 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | The main contribution of this work is establishing the theoretical foundation for PFNs and identifying statistical mechanisms explaining their empirical behavior. ... 6.5. Numerical Validation |
| Researcher Affiliation | Academia | Department of Statistics, LMU Munich, Munich, Germany; Munich Center for Machine Learning, Munich, Germany. |
| Pseudocode | No | The paper does not contain structured pseudocode or algorithm blocks. |
| Open Source Code | Yes | An R script to reproduce the results can be found at https://gist.github.com/tnagler/62f6ce1f996333c799c81f1aef147e72. |
| Open Datasets | No | The paper uses simulated data sets and does not provide concrete access information for a publicly available or open dataset. |
| Dataset Splits | No | The paper describes data generation for evaluation but does not specify traditional training/validation/test dataset splits. |
| Hardware Specification | No | The paper does not provide specific hardware details used for running its experiments. |
| Software Dependencies | Yes | We run the pre-trained TabPFN of Hollmann et al. (2022, pip version 0.1.8). |
| Experiment Setup | Yes | We simulate 500 data sets $D_n$ from the model $p_0(1 \mid X) = 1/2 + \sin(\mathbf{1}^\top X)/2$ with $Y \in \{0, 1\}$, $X \sim N(0, I_5)$, and run the pre-trained TabPFN of Hollmann et al. (2022, pip version 0.1.8). ... Figure 1 also shows the results of a localized version of TabPFN (as in Section 6.4 with $k_n = \min\{500, n^{4/(d+4)}\}$). |
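
The experiment-setup row describes a fully simulated data-generating process, so it can be re-implemented directly. The authors' own reproduction script is in R (linked under Open Source Code); the Python sketch below only illustrates the same setup, assuming the scikit-learn-style `TabPFNClassifier` interface of the `tabpfn` pip package (version 0.1.x). The sample sizes, random seed, and device setting are illustrative assumptions, not the paper's exact configuration.

```python
# Sketch of the data-generating process and TabPFN call from the
# "Experiment Setup" row. Sample sizes and seed are assumptions.
import numpy as np
from tabpfn import TabPFNClassifier  # pip install tabpfn==0.1.8

rng = np.random.default_rng(0)
d = 5          # feature dimension, X ~ N(0, I_5)
n_train = 200  # training-set size (assumed; the paper varies n)
n_test = 1000  # number of evaluation points (assumed)

def simulate(n):
    """Draw (X, Y) with P(Y = 1 | X) = 1/2 + sin(1'X)/2."""
    X = rng.standard_normal((n, d))
    p1 = 0.5 + 0.5 * np.sin(X.sum(axis=1))  # sin(1^T X) lies in [-1, 1]
    Y = rng.binomial(1, p1)
    return X, Y, p1

X_train, y_train, _ = simulate(n_train)
X_test, _, p1_true = simulate(n_test)

# Pre-trained TabPFN with its scikit-learn-style interface (pip 0.1.x).
clf = TabPFNClassifier(device="cpu")
clf.fit(X_train, y_train)
p1_hat = clf.predict_proba(X_test)[:, 1]

# Compare TabPFN's posterior predictive P(Y = 1 | x) with the true model.
print("mean absolute error of P(Y=1|X):", np.abs(p1_hat - p1_true).mean())
```

The localized variant mentioned in the same row would additionally, for each test point, refit TabPFN on its $k_n = \min\{500, n^{4/(d+4)}\}$ nearest training points before predicting; that per-point loop is omitted from the sketch for brevity.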