BScNets: Block Simplicial Complex Neural Networks
Authors: Yuzhou Chen, Yulia R. Gel, H. Vincent Poor
AAAI 2022, pp. 6333-6341
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Our experiments indicate that BScNets outperforms the state-of-the-art models by a significant margin while maintaining low computation costs. |
| Researcher Affiliation | Academia | Yuzhou Chen (1), Yulia R. Gel (2), H. Vincent Poor (1); (1) Department of Electrical and Computer Engineering, Princeton University; (2) Department of Mathematical Sciences, University of Texas at Dallas |
| Pseudocode | No | The paper presents an architecture diagram (Figure 2) and describes the model components in text, but it does not include a formal pseudocode block or algorithm. |
| Open Source Code | Yes | Our datasets and codes are available on https://github.com/BScNets/BScNets.git. |
| Open Datasets | Yes | We experiment on three types of networks (i) citation networks: Cora and PubMed (Sen et al. 2008); (ii) social networks: (1) flight network: Airport (Chami et al. 2019), (2) criminal networks: Meetings and Phone Calls (Cavallaro et al. 2020), and (3) contact networks: High School network and Staff Community (Salathé et al. 2010; Eletreby et al. 2020); (iii) disease propagation tree: Disease (Chami et al. 2019). |
| Dataset Splits | Yes | Following (Chami et al. 2019), for all datasets, we randomly split edges into 85%/5%/10% for training, validation, and testing. |
| Hardware Specification | Yes | We implement our proposed BScNets with the PyTorch framework on two NVIDIA RTX 3090 GPUs with 24 GiB RAM. |
| Software Dependencies | No | The paper mentions implementing the model with the 'Pytorch framework' but does not specify its version or any other software dependencies with version numbers. |
| Experiment Setup | Yes | For all datasets, BScNets is trained by the Adam optimizer with the Cross Entropy Loss function. More details about the experimental setup and hyperparameters are in Appendix B.2. |
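The 85%/5%/10% edge split reported above (following Chami et al. 2019) can be sketched in plain Python. This is an illustrative sketch, not code from the released repository; the function name, seeding, and rounding behavior are assumptions:

```python
import random

def split_edges(edges, train_frac=0.85, val_frac=0.05, seed=0):
    """Randomly shuffle edges and split them into train/validation/test sets.

    The remaining fraction (here 10%) becomes the test set. Fractions are
    applied with int() truncation, so the test set absorbs any remainder.
    """
    edges = list(edges)
    random.Random(seed).shuffle(edges)  # deterministic shuffle for reproducibility
    n = len(edges)
    n_train = int(train_frac * n)
    n_val = int(val_frac * n)
    train = edges[:n_train]
    val = edges[n_train:n_train + n_val]
    test = edges[n_train + n_val:]
    return train, val, test

# Example: 100 edges split into 85 train, 5 validation, 10 test
edges = [(i, i + 1) for i in range(100)]
train, val, test = split_edges(edges)
```

The three returned lists partition the original edge set, so every edge appears in exactly one split.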