Unsupervised Nonlinear Feature Selection from High-Dimensional Signed Networks

Authors: Qiang Huang, Tingyu Xia, Huiyan Sun, Makoto Yamada, Yi Chang

AAAI 2020, pp. 4182-4189 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Through experiments on two real world datasets, we show that the proposed algorithm is superior to existing linear unsupervised feature selection methods. (...) In this section, we report experiments on two real world signed network datasets that compared the framework proposed in this paper with five state-of-the-art unsupervised feature selection methods.
Researcher Affiliation | Academia | College of Computer Science and Technology, Jilin University, China; School of Artificial Intelligence, Jilin University, China; Key Laboratory of Symbolic Computation and Knowledge Engineering of Ministry of Education, China; Kyoto University, Kyoto, Japan; RIKEN AIP, Kyoto, Japan
Pseudocode | No | No structured pseudocode or algorithm blocks were found in the paper.
Open Source Code | No | The paper does not provide an explicit statement about the availability of open-source code for the described methodology or a link to a code repository.
Open Datasets | Yes | We use two real world signed network datasets: Epinions and Wiki-RfA. Epinions: Epinions is a product review website on which users can share their comments on different products. (...) (http://www.cse.msu.edu/~tangjili/trust.html) Wiki-RfA: Wiki-RfA is a signed network dataset concerning the election of editors on Wikipedia. (...) (http://snap.stanford.edu/data/wiki-RfA.html) (A loading sketch for these datasets appears after the table.)
Dataset Splits | No | The paper does not provide specific details on training, validation, or test dataset splits for the main experiments.
Hardware Specification | Yes | All codes are implemented by Python and we use Intel(R) Xeon(R) CPU E7-4870 @ 2.40GHz and 64GB memory.
Software Dependencies | No | The paper states 'All codes are implemented by Python' but does not provide specific version numbers for Python or any other key software libraries or solvers.
Experiment Setup | Yes | For the framework proposed in this paper, we set the block size as B = 50, the dimension of latent representation vector as c = 10, the controllers of similarity as η = 1 and η0 = 1, the number of hidden layers as N = 3, and select the top 10, 15, 20, 25, 30, 35 and 40 features, respectively. (...) Therefore we set epoch = 200 to calculate the time consumption of Signed Lasso. We next compare the run time and peak memory usage of Signed Lasso with those of NetFS and SignedFS. We set the number of training steps for NetFS and SignedFS as 100 and 200, respectively. (A configuration and timing sketch appears after the table.)
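
The Epinions and Wiki-RfA links in the Open Datasets row point at raw signed-network dumps rather than ready-made feature matrices. As a minimal sketch only, assuming the data has already been exported to a whitespace-separated edge list with one "source target sign" triple per line (the file name, column order, and 0-based node ids here are assumptions, not something the paper specifies), a sparse signed adjacency matrix could be assembled like this:

```python
import numpy as np
from scipy.sparse import coo_matrix


def load_signed_edgelist(path):
    """Build a sparse signed adjacency matrix from a 'source target sign' edge list.

    The file format (whitespace-separated triples, signs in {+1, -1}) is an
    assumption for illustration; the raw Epinions / Wiki-RfA dumps need their
    own preprocessing before they look like this.
    """
    edges = np.loadtxt(path, dtype=np.int64)          # shape: (num_edges, 3)
    src, dst, sign = edges[:, 0], edges[:, 1], edges[:, 2]
    n = int(max(src.max(), dst.max())) + 1            # assumes 0-based node ids
    adj = coo_matrix((sign.astype(np.float64), (src, dst)), shape=(n, n))
    return adj.tocsr()


if __name__ == "__main__":
    adj = load_signed_edgelist("epinions_edges.txt")  # hypothetical file name
    pos = (adj.data > 0).sum()
    neg = (adj.data < 0).sum()
    print(f"{adj.shape[0]} nodes, {pos} positive and {neg} negative edges")
```

Note that duplicate (source, target) pairs are summed when the COO matrix is converted to CSR; if the raw dump contains repeated ratings and that behaviour is not wanted, deduplicate the edge list first.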
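
The Experiment Setup row lists concrete hyperparameters (B = 50, c = 10, η = η0 = 1, N = 3, top-k in {10, ..., 40}, epoch = 200) and says run time and peak memory were compared across methods. Below is a minimal, hedged harness for that kind of measurement; the `run_feature_selection` callable and the configuration key names are placeholders for whatever implementation is being timed, not the authors' code.

```python
import time
import tracemalloc

# Hyperparameters as reported in the paper's experiment setup; the dict keys
# themselves are illustrative names, not identifiers from the paper.
CONFIG = {
    "block_size": 50,        # B
    "latent_dim": 10,        # c
    "eta": 1.0,              # similarity controller η
    "eta0": 1.0,             # similarity controller η0
    "hidden_layers": 3,      # N
    "top_k_grid": [10, 15, 20, 25, 30, 35, 40],
    "epochs": 200,
}


def measure(run_feature_selection, *args, **kwargs):
    """Return (result, wall_clock_seconds, peak_memory_mb) for one run.

    tracemalloc only tracks Python-level allocations, so the peak for
    NumPy/C-extension-heavy code is a lower bound on true memory usage.
    """
    tracemalloc.start()
    start = time.perf_counter()
    result = run_feature_selection(*args, **kwargs)
    elapsed = time.perf_counter() - start
    _, peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    return result, elapsed, peak / (1024 * 1024)


if __name__ == "__main__":
    # Placeholder workload standing in for a feature-selection run.
    dummy = lambda n: sorted(range(n), reverse=True)[: CONFIG["top_k_grid"][0]]
    _, secs, peak_mb = measure(dummy, 200_000)
    print(f"run time: {secs:.3f}s, peak traced memory: {peak_mb:.1f} MB")
```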