Stay on path: PCA along graph paths

Authors: Megasthenis Asteris, Anastasios Kyrillidis, Alex Dimakis, Han-Gyol Yi, Bharath Chandrasekaran

ICML 2015 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

| Reproducibility Variable | Result | LLM Response |
| --- | --- | --- |
| Research Type | Experimental | "We empirically evaluate our schemes on synthetic and real datasets." |
| Researcher Affiliation | Academia | Department of Electrical and Computer Engineering, The University of Texas at Austin; Department of Communication Sciences & Disorders, The University of Texas at Austin |
| Pseudocode | Yes | Algorithm 1 (Graph-Truncated Power Method) and Algorithm 2 (Low-Dimensional Sample and Project); see the sketch after this table. |
| Open Source Code | No | The paper contains no statement or link indicating that source code for the described methodology is publicly available. |
| Open Datasets | No | The paper uses S&P 500 index data from Yahoo! Finance and fMRI data from the Human Connectome Project. Although these are well-known sources, it provides no concrete access information (specific links, DOIs, or repository identifiers) for the exact datasets used, so the data cannot be retrieved from the paper alone. |
| Dataset Splits | No | The paper refers to "n samples" but does not specify any train/validation/test splits (e.g., percentages or counts). |
| Hardware Specification | No | The paper does not describe the hardware (e.g., CPU or GPU models) used to run the experiments. |
| Software Dependencies | No | The paper does not list software dependencies or library version numbers used in the experiments. |
| Experiment Setup | No | The paper describes aspects of synthetic data generation (e.g., k = log p, d = p/k) but does not give specific hyperparameter values or system-level settings for the algorithms themselves (e.g., iteration counts or convergence thresholds). |
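For context on the two routines named in the Pseudocode row, the minimal Python sketch below shows the generic building blocks they resemble: a truncated power iteration and a low-dimensional sample-and-project heuristic, both specialized here to plain k-sparse PCA. This is not the paper's method: Algorithms 1 and 2 project onto graph-path supports rather than the top-k truncation used below, and the function names, iteration counts, and test data are illustrative assumptions.

```python
import numpy as np


def truncated_power_method(A, k, iters=100, seed=0):
    """Generic truncated power iteration for k-sparse PCA (sketch only).

    The paper's Algorithm 1 (Graph-Truncated Power Method) replaces the
    top-k truncation below with a projection onto supports that form
    paths in a given graph; that projection is NOT implemented here.
    """
    rng = np.random.default_rng(seed)
    p = A.shape[0]
    x = rng.standard_normal(p)
    x /= np.linalg.norm(x)
    for _ in range(iters):
        y = A @ x                           # power step
        idx = np.argsort(np.abs(y))[-k:]    # keep the k largest-magnitude entries
        x = np.zeros(p)
        x[idx] = y[idx]
        x /= np.linalg.norm(x)              # renormalize to the unit sphere
    return x


def sample_and_project(A, k, d=2, num_samples=500, seed=0):
    """Generic low-dimensional sample-and-project heuristic (sketch only).

    Candidates are drawn from the span of the top-d eigenvectors of A and
    each is projected onto the k-sparse set; the best candidate under the
    quadratic objective x^T A x is returned. The paper's Algorithm 2
    instead projects onto graph-path supports, which is not reproduced here.
    """
    rng = np.random.default_rng(seed)
    _, vecs = np.linalg.eigh(A)
    V = vecs[:, -d:]                        # top-d eigenvectors (eigh returns ascending order)
    best_x, best_val = None, -np.inf
    for _ in range(num_samples):
        c = rng.standard_normal(d)
        c /= np.linalg.norm(c)
        y = V @ c                           # candidate in the low-dimensional span
        idx = np.argsort(np.abs(y))[-k:]    # project: keep the k largest entries
        x = np.zeros_like(y)
        x[idx] = y[idx]
        x /= np.linalg.norm(x)
        val = x @ A @ x
        if val > best_val:
            best_x, best_val = x, val
    return best_x


# Illustrative usage on a random sample covariance matrix.
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 50))
A = X.T @ X / X.shape[0]
v1 = truncated_power_method(A, k=5)
v2 = sample_and_project(A, k=5, d=2)
print(v1 @ A @ v1, v2 @ A @ v2)             # explained variance of each estimate
```

Both routines return a unit-norm, k-sparse estimate of the leading sparse principal component; in the paper the sparsity pattern is additionally constrained to lie on a path of an underlying graph, which is the step omitted in this sketch.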