Approximate Manifold Regularization: Scalable Algorithm and Generalization Analysis
Authors: Jian Li, Yong Liu, Rong Yin, Weiping Wang
IJCAI 2019 | Conference PDF | Archive PDF | Plain Text | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Extensive empirical results reveal that our method achieves state-of-the-art performance in a short time, even with limited computing resources. |
| Researcher Affiliation | Academia | Institute of Information Engineering, Chinese Academy of Sciences; School of Cyber Security, University of Chinese Academy of Sciences |
| Pseudocode | Yes | Algorithm 1: Nyström LapRLS with PCG (Nyström-PCG); a hedged sketch of the idea follows the table. |
| Open Source Code | No | The paper does not provide concrete access to source code for the methodology described. |
| Open Datasets | No | The paper mentions datasets like 'space_ga', 'phishing', 'a8a', 'w7a', 'a9a', 'ijcnn1', 'cod-rna', 'connect-4', 'skin_nonskin', 'Year Prediction' but does not provide specific links, DOIs, repositories, or formal citations (author and year) to confirm their public availability or how to access them. |
| Dataset Splits | Yes | Using the parameters chosen by 10-fold cross-validation, we run all methods 30 times, randomly selecting 70% of each dataset for training and 30% for testing (this protocol is sketched below the table). |
| Hardware Specification | No | The paper does not provide specific hardware details (e.g., exact GPU/CPU models, memory amounts) used for running its experiments. |
| Software Dependencies | No | The paper does not provide specific ancillary software details with version numbers (e.g., library or solver names with version numbers like Python 3.8, CPLEX 12.4) needed to replicate the experiment. |
| Experiment Setup | Yes | We choose the kernel parameter $\sigma$ and the regularization parameters ($\lambda$ in standard RLS; $\lambda_A$, $\lambda_I$ in LapRLS methods) from $2^i$, $i \in \{-15, -14, \dots, 14, 15\}$, by minimizing test error via 10-fold cross-validation. For each dataset, we use the Gaussian kernel $K(x_i, x_j) = \exp(-\|x_i - x_j\|^2 / (2\sigma^2))$ (this grid search is sketched below). |
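
The Pseudocode row refers to Algorithm 1 (Nyström-PCG), which combines a Nyström low-rank approximation of the kernel matrix with a preconditioned conjugate gradient solver for the LapRLS linear system. The Python sketch below illustrates that idea under stated assumptions: uniform landmark sampling, a symmetrized form of the LapRLS optimality condition so that plain CG applies, and `lam_A * n` / `lam_I * n` regularization scaling. None of these details are taken from the paper, and the paper's preconditioner is omitted.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

def gaussian_kernel(A, B, sigma):
    """K[i, j] = exp(-||a_i - b_j||^2 / (2 sigma^2))."""
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (2.0 * sigma ** 2))

def nystrom_cg_laprls(X, y, L, m=100, sigma=1.0, lam_A=1e-3, lam_I=1e-3, seed=0):
    """Sketch of Nystrom-approximated LapRLS solved with conjugate gradient.

    K is approximated by K_nm @ pinv(K_mm) @ K_nm.T from m uniformly sampled
    landmarks, then the symmetrized LapRLS optimality condition
        (K @ K + lam_A * n * K + lam_I * n * K @ L @ K) alpha = K @ y
    is solved matrix-free, so each CG iteration costs O(n * m) plus one
    multiplication by the graph Laplacian L.
    """
    n = X.shape[0]
    idx = np.random.default_rng(seed).choice(n, size=m, replace=False)
    K_nm = gaussian_kernel(X, X[idx], sigma)                 # n x m
    K_mm_inv = np.linalg.pinv(gaussian_kernel(X[idx], X[idx], sigma))

    def K_approx(v):                                         # ~K @ v in O(n * m)
        return K_nm @ (K_mm_inv @ (K_nm.T @ v))

    def matvec(v):                                           # symmetric PSD system
        Kv = K_approx(v)
        return K_approx(Kv) + lam_A * n * Kv + lam_I * n * K_approx(L @ Kv)

    A = LinearOperator((n, n), matvec=matvec)
    alpha, info = cg(A, K_approx(y), maxiter=500)            # plain CG; the paper adds a preconditioner
    return alpha, idx
```

The symmetrized system is used here because textbook CG assumes a symmetric positive semi-definite operator, whereas the raw LapRLS system $(K + \lambda_A n I + \lambda_I L K)$ is not symmetric.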
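
The evaluation protocol from the Dataset Splits row (30 runs, each on a fresh random 70%/30% train/test split) is straightforward to mirror. A minimal sketch, assuming NumPy only; the seed and the commented-out caller are illustrative, not the paper's code:

```python
import numpy as np

def repeated_splits(X, y, n_runs=30, train_frac=0.7, seed=0):
    """Yield the paper's protocol: 30 random 70%/30% train/test splits."""
    rng = np.random.default_rng(seed)
    n_train = int(train_frac * len(y))
    for _ in range(n_runs):
        perm = rng.permutation(len(y))
        tr, te = perm[:n_train], perm[n_train:]
        yield X[tr], y[tr], X[te], y[te]

# for X_tr, y_tr, X_te, y_te in repeated_splits(X, y):
#     ...fit on (X_tr, y_tr), record the test error on (X_te, y_te)...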
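
The hyperparameter grid in the Experiment Setup row spans powers of two from $2^{-15}$ to $2^{15}$, scored by 10-fold cross-validation. The sketch below tunes $(\sigma, \lambda)$ for plain kernel RLS only, dropping the Laplacian term for brevity; the fold assignment and the $\lambda n$ ridge scaling are assumptions, not taken from the paper.

```python
import numpy as np

# Candidate values 2^i, i in {-15, ..., 15}, as in the paper's grid.
grid = [2.0 ** i for i in range(-15, 16)]

def gaussian_kernel(A, B, sigma):
    """K[i, j] = exp(-||a_i - b_j||^2 / (2 sigma^2))."""
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (2.0 * sigma ** 2))

def cv_error(X, y, sigma, lam, k=10, seed=0):
    """k-fold CV mean squared error of plain kernel RLS.

    Each training fold solves (K + lam * n * I) alpha = y; the Laplacian
    term of LapRLS is omitted to keep the sketch short.
    """
    folds = np.array_split(np.random.default_rng(seed).permutation(len(y)), k)
    errs = []
    for i in range(k):
        te = folds[i]
        tr = np.concatenate([folds[j] for j in range(k) if j != i])
        K_tr = gaussian_kernel(X[tr], X[tr], sigma)
        alpha = np.linalg.solve(K_tr + lam * len(tr) * np.eye(len(tr)), y[tr])
        pred = gaussian_kernel(X[te], X[tr], sigma) @ alpha
        errs.append(np.mean((pred - y[te]) ** 2))
    return float(np.mean(errs))

# Exhaustive grid search (31 x 31 = 961 CV runs):
# best_sigma, best_lam = min(((s, l) for s in grid for l in grid),
#                            key=lambda p: cv_error(X, y, *p))
```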