Cross-Space Active Learning on Graph Convolutional Networks

Authors: Yufei Tao, Hao Wu, Shiyuan Deng

ICML 2022 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | This paper formalizes cross-space active learning on a graph convolutional network (GCN). The objective is to attain the most accurate hypothesis available in any of the instance spaces generated by the GCN. Subject to this objective, the challenge is to minimize the label cost, measured as the number of vertices whose labels are requested. The study covers both budget algorithms, which terminate after a designated number of label requests, and verifiable algorithms, which terminate only after having found an accurate hypothesis. A new separation in label complexity between the two algorithm types is established; the separation is unique to GCNs. No empirical results are presented in the paper. (A toy sketch contrasting the two stopping rules appears after this table.)
Researcher Affiliation | Academia | Department of Computer Science and Engineering, Chinese University of Hong Kong, Hong Kong, China. Correspondence to: Yufei Tao <taoyf@cse.cuhk.edu.hk>.
Pseudocode | Yes | Budget-CSAL(L, δ, ℋ)
1. for t ← 1 to T do
2.     h_t ← output of Act-Learn-B(L/(2T), δ/2, ℋ) with D = U_t
3. H = {(t, h_t) | t ∈ [T]}
4. g_(t̂,ĥ) ← output of Act-Learn-B(L/2, δ/2, G_H) with D = U   /* see (9) for G_H */
5. return (t̂, ĥ)
(A hedged Python sketch of this control flow appears after the table.)
Open Source Code | No | The paper does not contain any explicit statements about releasing code or providing links to a code repository for the described methodology.
Open Datasets | No | The paper is theoretical and does not conduct experiments requiring a dataset. It references general datasets in related work, but provides no specific access information for data used in its own analysis.
Dataset Splits | No | This paper is theoretical and does not report on empirical experiments; therefore, it does not provide details on training, validation, or test dataset splits.
Hardware Specification | No | This paper is theoretical and does not report on empirical experiments; therefore, it does not provide specific hardware specifications used for running experiments.
Software Dependencies | No | This paper is theoretical and focuses on algorithm design and proofs, without providing implementation details or specific software dependencies with version numbers.
Experiment Setup | No | This paper is theoretical and focuses on algorithm design and proofs, without providing details about an experimental setup, hyperparameters, or system-level training settings.
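
Illustration (not from the paper): the distinction between the two algorithm types discussed under "Research Type" can be made concrete with a toy one-dimensional example. The Python sketch below learns a hidden threshold on [0, 1]; the budget variant stops after exactly L label requests with no accuracy guarantee, while the verifiable variant keeps requesting labels until the surviving version space certifies error at most epsilon. The task and all names are illustrative assumptions, not the paper's algorithms.

# Hedged toy example: budget vs. verifiable stopping rules for active learning
# of a 1-D threshold on [0, 1]. This is NOT the paper's GCN setting; it only
# illustrates the two termination criteria described in the abstract.
import random

def budget_threshold_learner(oracle, L):
    # Binary-search-style querying; terminate after exactly L label requests.
    lo, hi = 0.0, 1.0
    for _ in range(L):
        mid = (lo + hi) / 2
        if oracle(mid):          # one label request
            hi = mid             # hidden threshold lies at or below mid
        else:
            lo = mid
    return (lo + hi) / 2         # returned hypothesis; accuracy is not certified

def verifiable_threshold_learner(oracle, epsilon):
    # Terminate only once the version-space interval certifies error <= epsilon
    # under the uniform distribution on [0, 1].
    lo, hi = 0.0, 1.0
    queries = 0
    while hi - lo > epsilon:
        mid = (lo + hi) / 2
        if oracle(mid):          # one label request
            hi = mid
        else:
            lo = mid
        queries += 1
    return (lo + hi) / 2, queries

if __name__ == "__main__":
    hidden_theta = random.random()
    oracle = lambda x: x >= hidden_theta   # labeling oracle for the hidden threshold
    print("budget (L = 5):         ", budget_threshold_learner(oracle, 5))
    print("verifiable (eps = 0.01):", verifiable_threshold_learner(oracle, 0.01))
    print("hidden threshold:       ", hidden_theta)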
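
Separately, the Budget-CSAL pseudocode reported above can be sketched as plain control flow. In the sketch below, act_learn_b stands in for the paper's Act-Learn-B subroutine (its budget, confidence, hypothesis-class, and distribution arguments are read off the pseudocode), and build_G_H stands in for the construction of G_H in the paper's equation (9), which is not reproduced here; both are placeholder assumptions supplied by the caller, and the phase-2 learner is assumed to return the index pair (t̂, ĥ) of its chosen function, a simplification of steps 4-5.

# Hedged sketch of the Budget-CSAL control flow (placeholder names throughout).
def budget_csal(L, delta, hypothesis_class, layer_distributions, base_distribution,
                act_learn_b, build_G_H):
    # Phase 1: spend budget L/(2T) per instance space U_1, ..., U_T to learn one
    # hypothesis per GCN layer, sharing confidence delta/2 across the layers.
    T = len(layer_distributions)
    H = []                                   # the pair set {(t, h_t) | t in [T]}
    for t, U_t in enumerate(layer_distributions, start=1):
        h_t = act_learn_b(L / (2 * T), delta / 2, hypothesis_class, U_t)
        H.append((t, h_t))

    # Phase 2: spend the remaining budget L/2 on the induced class G_H over the
    # base distribution U, and return the selected (layer, hypothesis) pair.
    G_H = build_G_H(H)                       # construction given by equation (9)
    t_hat, h_hat = act_learn_b(L / 2, delta / 2, G_H, base_distribution)
    return t_hat, h_hat

# Smoke test with stub components (purely illustrative).
if __name__ == "__main__":
    stub_learner = lambda budget, delta, cls, D: cls[0]   # always picks the first element
    stub_build_G_H = lambda H: H                          # pretends G_H is the pair set itself
    print(budget_csal(100, 0.05, ["h_a", "h_b"], ["U_1", "U_2", "U_3"], "U",
                      stub_learner, stub_build_G_H))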