Stable Learning via Sparse Variable Independence
Authors: Han Yu, Peng Cui, Yue He, Zheyan Shen, Yong Lin, Renzhe Xu, Xingxuan Zhang
AAAI 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Experiments on both synthetic and real-world datasets demonstrate the improvement of covariate-shift generalization performance brought by SVI. |
| Researcher Affiliation | Academia | Han Yu¹*, Peng Cui¹, Yue He¹, Zheyan Shen¹, Yong Lin², Renzhe Xu¹, Xingxuan Zhang¹ (¹Tsinghua University, ²Hong Kong University of Science and Technology) |
| Pseudocode | Yes | Algorithm 1: Sparse Variable Independence (SVI); an illustrative sketch of such a loop follows the table. |
| Open Source Code | No | The paper does not provide any concrete access information to source code, such as a repository link or an explicit code release statement. |
| Open Datasets | No | The paper mentions "synthetic and real-world datasets" for experiments, but for the real-world datasets (House Price Prediction, People Income Prediction), it does not provide specific links, DOIs, repositories, or formal citations for public access. |
| Dataset Splits | No | The paper mentions training on one environment and testing on multiple environments, with hyperparameters tuned by grid search on validation data, but it does not provide specific split percentages (e.g., 80/10/10) or sample counts for the train/validation/test sets. |
| Hardware Specification | No | The paper does not provide specific hardware details (e.g., exact GPU/CPU models, memory amounts) used for running its experiments. |
| Software Dependencies | No | The paper does not provide specific ancillary software details, such as library names with version numbers, needed to replicate the experiment. |
| Experiment Setup | No | The paper mentions that hyperparameters are tuned by grid search, but it does not provide concrete hyperparameter values (e.g., learning rate, batch size, number of epochs) or specific training configurations in the main text. |
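
Because the paper provides pseudocode (Algorithm 1) but no released source code, the sketch below only illustrates the general shape of the loop that row describes: alternately learning sample weights that decorrelate the currently selected covariates and fitting a sparse weighted regression to re-select covariates. This is a minimal illustration under stated assumptions, not the authors' implementation; the decorrelation objective (penalizing the off-diagonal of the weighted covariance), the Lasso selection step, and every function name and hyperparameter below are choices made for this sketch only.

```python
# Illustrative sketch only (assumed design, NOT the SVI reference code).
import numpy as np
from scipy.optimize import minimize
from sklearn.linear_model import Lasso


def decorrelation_weights(X, reg=1.0):
    """Learn positive sample weights (mean ~1) that shrink the off-diagonal
    entries of the weighted covariance of X. Weights are parameterized as
    w = exp(theta) and fitted by L-BFGS on
        0.5 * ||offdiag(C_w)||_F^2 + reg * (mean(w) - 1)^2.
    """
    n, p = X.shape
    if p < 2:
        return np.ones(n)
    Xc = X - X.mean(axis=0)  # center once (unweighted, for simplicity)

    def objective(theta):
        w = np.exp(theta)
        C = (w[:, None] * Xc).T @ Xc / n           # weighted covariance (p x p)
        off = C - np.diag(np.diag(C))              # off-diagonal part
        loss = 0.5 * np.sum(off ** 2) + reg * (w.mean() - 1.0) ** 2
        grad_w = (np.einsum('ij,jk,ik->i', Xc, off, Xc) / n
                  + 2.0 * reg * (w.mean() - 1.0) / n)
        return loss, grad_w * w                    # chain rule: dw/dtheta = w

    res = minimize(objective, np.zeros(n), jac=True, method='L-BFGS-B')
    w = np.exp(res.x)
    return w * (n / w.sum())                       # renormalize to mean 1


def sparse_variable_independence_sketch(X, y, alpha=0.1, n_rounds=5):
    """Alternate (a) weighted Lasso variable selection and (b) re-learning
    decorrelation weights on the selected variables (illustration only)."""
    n, p = X.shape
    selected = np.arange(p)
    w = np.ones(n)
    for _ in range(n_rounds):
        model = Lasso(alpha=alpha)
        model.fit(X[:, selected], y, sample_weight=w)
        keep = np.flatnonzero(np.abs(model.coef_) > 1e-6)
        if keep.size == 0:
            break
        selected = selected[keep]
        w = decorrelation_weights(X[:, selected])
    model = Lasso(alpha=alpha).fit(X[:, selected], y, sample_weight=w)
    return selected, w, model


if __name__ == "__main__":
    # Toy data: one stable covariate, one spuriously correlated covariate,
    # and one pure-noise covariate.
    rng = np.random.default_rng(0)
    n = 2000
    s = rng.normal(size=n)                         # stable covariate
    v = 0.9 * s + 0.3 * rng.normal(size=n)         # spuriously correlated
    X = np.column_stack([s, v, rng.normal(size=n)])
    y = 2.0 * s + 0.1 * rng.normal(size=n)
    selected, w, model = sparse_variable_independence_sketch(X, y)
    print("selected columns:", selected, "coefficients:", model.coef_)
```

In this sketch, decorrelation is re-learned only on the covariates that survive the sparsity step, which keeps the reweighting problem low-dimensional; that alternation is the design choice the example is meant to make concrete.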