ZINB-Based Graph Embedding Autoencoder for Single-Cell RNA-Seq Interpretations

Authors: Zhuohan Yu, Yifu Lu, Yunhe Wang, Fan Tang, Ka-Chun Wong, Xiangtao Li (pp. 4671-4679)

AAAI 2022

Reproducibility Variable | Result | LLM Response
Research Type: Experimental. "Extensive experiments on 16 single-cell RNA-seq datasets from diverse yet representative single-cell sequencing platforms demonstrate the superiority of scTAG over various state-of-the-art clustering methods."
Researcher Affiliation: Academia. "1 School of Artificial Intelligence, Jilin University, Jilin, China; 2 School of Artificial Intelligence, Hebei University of Technology, Tianjin, China; 3 Department of Computer Science, City University of Hong Kong, Hong Kong SAR"
Pseudocode: No. The paper describes its methods using prose and mathematical equations but does not include any explicitly labeled pseudocode or algorithm blocks.
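Since the paper states its ZINB reconstruction objective only through prose and equations, a minimal NumPy sketch of the standard zero-inflated negative binomial negative log-likelihood (the usual form in ZINB autoencoders; the function and argument names here are my own, not the authors') may help illustrate it:

```python
import numpy as np
from scipy.special import gammaln

def zinb_nll(x, mu, theta, pi, eps=1e-10):
    """Per-entry negative log-likelihood of a zero-inflated negative
    binomial with mean mu, dispersion theta, and dropout probability pi."""
    x, mu, theta, pi = (np.asarray(a, dtype=float) for a in (x, mu, theta, pi))
    # Log-likelihood of the negative binomial component
    nb_log = (gammaln(x + theta) - gammaln(theta) - gammaln(x + 1.0)
              + theta * np.log(theta / (theta + mu) + eps)
              + x * np.log(mu / (theta + mu) + eps))
    # Non-zero counts can only come from the NB component
    nonzero_case = np.log(1.0 - pi + eps) + nb_log
    # Zeros come either from dropout (pi) or from NB mass at zero
    zero_nb = np.power(theta / (theta + mu), theta)
    zero_case = np.log(pi + (1.0 - pi) * zero_nb + eps)
    return -np.where(x < 1e-8, zero_case, nonzero_case)
```

With `pi = 0` this reduces to the plain negative binomial likelihood, which is one quick sanity check for any reimplementation.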
Open Source Code: No. The paper does not provide any specific links or explicit statements about the availability of the source code for the described methodology.
Open Datasets: Yes. "We compared the performance of our model with other baseline methods on 16 real-world scRNA-seq datasets from several representative sequencing platforms. ... All 16 datasets are from different species, including mouse and human, as well as from different organs, such as brain, lung, and kidney. ... a large mouse retina dataset called Macosko (Macosko et al. 2015)"
Dataset Splits: No. The paper mentions hyperparameters and training parameters, but it does not specify explicit validation splits (e.g., percentages or counts for a held-out set) or a distinct validation phase for model selection.
Hardware Specification: Yes. "We conducted our experiments on a Ubuntu server with NVIDIA GTX 2080Ti GPU with 24 GB memory size."
Software Dependencies: No. The paper mentions "Tensorflow" but does not specify a version number or list other key software components with their versions.
Experiment Setup: Yes. In the proposed scTAG method, the cell graph was constructed using the KNN algorithm with nearest-neighbor parameter k=15. In the graph autoencoder, TAG was set as two layers of 128 and 15 nodes, and the three hidden layers of the fully connected decoder were set at 128, 256, and 512 nodes. The algorithm consists of pre-training and formal training: pre-training ran for 1000 epochs and formal training for 300. The model was optimized with the Adam algorithm (Kingma and Ba 2015), with a learning rate of 5e-4 in pre-training and 1e-4 in formal training.
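Since the authors' code is not released, the reported setup can only be sketched. The following NumPy fragment illustrates the KNN cell-graph construction and collects the stated hyperparameters; the symmetric-normalization convention, the dictionary keys, and all function names are my assumptions, not the paper's implementation:

```python
import numpy as np

# Hyperparameters as reported in the paper (dict keys are my own naming)
CONFIG = {
    "knn_k": 15,
    "encoder_dims": [128, 15],              # two TAG layers
    "decoder_dims": [128, 256, 512],        # fully connected decoder
    "pretrain_epochs": 1000, "pretrain_lr": 5e-4,
    "train_epochs": 300, "train_lr": 1e-4,  # Adam in both phases
}

def build_cell_graph(expr, k=15):
    """KNN cell graph from a cells-by-genes matrix, returned as the
    symmetrically normalized adjacency D^-1/2 (A + I) D^-1/2."""
    # Pairwise squared Euclidean distances between cells
    d2 = ((expr[:, None, :] - expr[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)            # a cell is not its own neighbor
    nbrs = np.argsort(d2, axis=1)[:, :k]    # k nearest neighbors per cell
    n = expr.shape[0]
    adj = np.zeros((n, n))
    adj[np.repeat(np.arange(n), k), nbrs.ravel()] = 1.0
    adj = np.maximum(adj, adj.T)            # symmetrize the directed KNN graph
    adj += np.eye(n)                        # self-loops
    d_inv_sqrt = 1.0 / np.sqrt(adj.sum(axis=1))
    return adj * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
```

The symmetric normalization with self-loops is the standard preprocessing for graph-convolutional encoders; its largest eigenvalue is exactly 1, which keeps propagation through stacked graph layers numerically stable.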