Deep Neural Networks via Complex Network Theory: A Perspective
Authors: Emanuele La Malfa, Gabriele La Malfa, Giuseppe Nicosia, Vito Latora
IJCAI 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | In this section, we conduct experiments to assess to what extent CNT identifies patterns in DNNs: we define three complementary levels of analysis. The first level (I) aims to distinguish dominating CNT patterns for architecturally similar networks: we train three-layer FCs, CNNs, RNNs, and AEs on MNIST and CIFAR10, equipped with the same activation functions and a comparable number of parameters (a sketch of such an architecture follows after the table). |
| Researcher Affiliation | Academia | ¹University of Oxford, ²King's College London, ³University of Catania, ⁴Queen Mary University of London |
| Pseudocode | No | The paper does not contain structured pseudocode or algorithm blocks. |
| Open Source Code | Yes | Results for the other architectures on MNIST and CIFAR10, for five and nine layers, are reported in the code repository. |
| Open Datasets | Yes | We conduct all the experiments on two standard datasets in pattern recognition and computer vision, namely MNIST and CIFAR10 [Lecun and Bengio, 1995; Krizhevsky et al., 2010]. (A loading sketch follows after the table.) |
| Dataset Splits | No | The paper mentions using MNIST and CIFAR10 datasets but does not provide specific train/validation/test dataset splits or percentages. |
| Hardware Specification | No | The paper does not provide specific hardware details (e.g., GPU/CPU models, memory) used for running the experiments. |
| Software Dependencies | No | The paper does not provide specific ancillary software details with version numbers (e.g., library or solver names with version numbers) needed to replicate the experiment. |
| Experiment Setup | Yes | We initialise the weights of each DNN via sampling from a Gaussian distribution of known variance between 0.05 (MNIST) and 0.5 (CIFAR10). Results for the other architectures on MNIST and CIFAR10, for five and nine layers, are reported in the code repository. (An initialisation sketch follows below.) |
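The "Research Type" row above quotes the paper training three-layer FCs, CNNs, RNNs, and AEs with the same activation functions and comparable parameter counts. As an illustration only, here is a minimal PyTorch sketch of the fully connected variant; the hidden width and the ReLU activation are hypothetical choices, since the excerpt does not specify them:

```python
import torch.nn as nn

class ThreeLayerFC(nn.Module):
    """Three-layer fully connected network for MNIST-sized inputs.
    Hidden width (128) and ReLU are illustrative assumptions,
    not values reported in the paper's excerpt."""
    def __init__(self, in_dim: int = 28 * 28, hidden: int = 128, n_classes: int = 10):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),
            nn.Linear(in_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, hidden),
            nn.ReLU(),
            nn.Linear(hidden, n_classes),
        )

    def forward(self, x):
        return self.net(x)
```

The comparable-parameter-count constraint would then be met by tuning the hidden widths of the CNN, RNN, and AE variants to match this model's total parameter count.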
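The "Open Datasets" row confirms MNIST and CIFAR10, while the "Dataset Splits" row notes that no explicit splits are reported. A plausible reconstruction uses torchvision's standard train/test partitions, which is an assumption rather than a detail from the paper:

```python
from torchvision import datasets, transforms

to_tensor = transforms.ToTensor()

# Standard torchvision splits: 60k/10k for MNIST, 50k/10k for CIFAR10.
# The paper does not state its splits; these defaults are an assumption.
mnist_train = datasets.MNIST(root="data", train=True, download=True, transform=to_tensor)
mnist_test  = datasets.MNIST(root="data", train=False, download=True, transform=to_tensor)
cifar_train = datasets.CIFAR10(root="data", train=True, download=True, transform=to_tensor)
cifar_test  = datasets.CIFAR10(root="data", train=False, download=True, transform=to_tensor)
```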
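The "Experiment Setup" row states that weights are sampled from a Gaussian of known variance, 0.05 for MNIST and 0.5 for CIFAR10. A minimal sketch, assuming PyTorch and zero mean (the excerpt states only the variance); note that `nn.init.normal_` expects a standard deviation, hence the square root:

```python
import math
import torch.nn as nn

def init_gaussian(model: nn.Module, variance: float) -> None:
    """Initialise weight tensors from N(0, variance).
    Zero mean and zeroed biases are assumptions; the paper's
    excerpt specifies only the variance of the Gaussian."""
    std = math.sqrt(variance)
    for module in model.modules():
        if isinstance(module, (nn.Linear, nn.Conv2d)):
            nn.init.normal_(module.weight, mean=0.0, std=std)
            if module.bias is not None:
                nn.init.zeros_(module.bias)

# Variance 0.05 for MNIST models, 0.5 for CIFAR10 models.
mnist_model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 10))
init_gaussian(mnist_model, variance=0.05)
```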