High-Dimensional Analysis for Generalized Nonlinear Regression: From Asymptotics to Algorithm
Authors: Jian Li, Yong Liu, Weiping Wang
AAAI 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We conduct experiments to explore the impacts of nonlinear feature mappings and subsampling, respectively. We leave the proofs and more experiments in the appendix. Our contributions can be summarized as follows: ... Finally, we validate our theoretical findings and the proposed algorithm through several experiments. |
| Researcher Affiliation | Academia | Jian Li¹, Yong Liu²*, Weiping Wang¹; ¹Institute of Information Engineering, Chinese Academy of Sciences; ²Gaoling School of Artificial Intelligence, Renmin University of China |
| Pseudocode | No | The paper describes the algorithm RFRed and its optimization steps but does not include a formally structured pseudocode block or an 'Algorithm' section. |
| Open Source Code | Yes | Code: https://github.com/superlj666/Nonlinear HDA |
| Open Datasets | Yes | The training examples n = 100 are randomly drawn from the MNIST dataset (LeCun et al. 1998). |
| Dataset Splits | No | The paper mentions 'training examples' and 'test errors' but does not provide specific details on how the dataset was split into training, validation, and test sets (e.g., percentages, sample counts, or explicit mention of a validation set). |
| Hardware Specification | No | The paper does not provide any specific details about the hardware (e.g., GPU/CPU models, memory) used for running the experiments. |
| Software Dependencies | No | Our implementation is based on PyTorch, and we fine-tune the hyperparameters through a grid search approach, exploring values for σ² in the range of {0.01, ..., 1000} and λ in the range of {0.1, ..., 10^-5}. (Mentions PyTorch but no version number.) |
| Experiment Setup | Yes | We fix n = 100, S = I_n, m = n and vary the random features dimension p ∈ [10, 400]. ... We set the same hyperparameter σ² = 0.1. ... We fine-tune the hyperparameters through a grid search approach, exploring values for σ² in the range of {0.01, ..., 1000} and λ in the range of {0.1, ..., 10^-5}. |
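The experiment setup above (random features of dimension p, grid search over the kernel bandwidth σ² and the ridge penalty λ) can be sketched in code. This is a minimal illustration, not the authors' RFRed implementation: it assumes random Fourier features for a Gaussian kernel and a closed-form ridge solver, and the function names (`rff_map`, `grid_search_rfred`) and the validation-split protocol are hypothetical, since the paper does not document them.

```python
import numpy as np

def rff_map(X, W, b, p):
    # Random Fourier feature map approximating a Gaussian kernel:
    # phi(x) = sqrt(2/p) * cos(W^T x + b)
    return np.sqrt(2.0 / p) * np.cos(X @ W + b)

def grid_search_rfred(X_tr, y_tr, X_val, y_val, p=100, seed=0):
    """Grid search over (sigma2, lam) as described in the report.

    Grids follow the reported ranges {0.01, ..., 1000} for sigma2
    and {0.1, ..., 1e-5} for lam (intermediate values assumed)."""
    sigma2_grid = [0.01, 0.1, 1.0, 10.0, 100.0, 1000.0]
    lam_grid = [0.1, 1e-2, 1e-3, 1e-4, 1e-5]
    d = X_tr.shape[1]
    best = (np.inf, None, None)
    for sigma2 in sigma2_grid:
        # Draw W, b once per bandwidth so train/val share the same features.
        rng = np.random.default_rng(seed)
        W = rng.normal(0.0, 1.0 / np.sqrt(sigma2), size=(d, p))
        b = rng.uniform(0.0, 2.0 * np.pi, size=p)
        Phi_tr = rff_map(X_tr, W, b, p)
        Phi_val = rff_map(X_val, W, b, p)
        for lam in lam_grid:
            # Closed-form ridge regression in the random feature space.
            w = np.linalg.solve(Phi_tr.T @ Phi_tr + lam * np.eye(p),
                                Phi_tr.T @ y_tr)
            mse = np.mean((Phi_val @ w - y_val) ** 2)
            if mse < best[0]:
                best = (mse, sigma2, lam)
    return best  # (validation MSE, sigma2, lam)
```

The sketch keeps n in the reported regime (n = 100 training examples) small enough that the p x p ridge system is solved directly; the paper's subsampling matrix S is omitted here (it is fixed to the identity in the quoted setup).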