The Effect of Manifold Entanglement and Intrinsic Dimensionality on Learning

Authors: Daniel Kienitz, Ekaterina Komendantskaya, Michael Lones (pp. 7160-7167)

AAAI 2022 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | We empirically investigate the effect of class manifold entanglement and the intrinsic and extrinsic dimensionality of the data distribution on the sample complexity of supervised classification with deep ReLU networks. We separate the effect of entanglement and intrinsic dimensionality and show statistically for artificial and real-world image datasets that the intrinsic dimensionality and the entanglement have an interdependent effect on the sample complexity. (An intrinsic-dimension estimation sketch follows the table.)
Researcher Affiliation | Academia | Heriot-Watt University, Edinburgh, UK
Pseudocode | No | The paper includes mathematical equations and descriptions of procedures, but no explicitly labeled 'Pseudocode' or 'Algorithm' blocks.
Open Source Code | No | The paper does not provide any explicit statement about releasing source code or a link to a code repository.
Open Datasets | Yes | In this section we test the two entanglement measures introduced in the previous section on the real-world image benchmarks MNIST (LeCun et al. 1990), FASHION (Xiao, Rasul, and Vollgraf 2017), SVHN (Netzer et al. 2011) and CIFAR-10. (A loading sketch for these benchmarks follows the table.)
Dataset Splits | No | The paper mentions a 'train set' and 'test set' for measuring sample complexity, but it does not explicitly describe a validation set or give split percentages for all three components (train/validation/test). (An illustrative split appears after the table.)
Hardware Specification | No | The paper does not provide specific hardware details such as GPU models, CPU types, or cloud computing specifications used for running experiments.
Software Dependencies | No | The paper mentions training with the 'Adam optimizer' and 'batchnormalization' but does not specify software dependencies with version numbers (e.g., 'PyTorch 1.x' or 'TensorFlow 2.x').
Experiment Setup | Yes | We train a fully-connected neural network on spiral datasets with independently changed Σ_Arch ∈ [1.0, 1.25, 1.5, 1.75, 2.0], I ∈ [1, 2, ..., 11] and E ∈ [2, 3, ..., 12] and measure the sample complexity ς. ... We train a convolutional neural network with batchnormalization (Ioffe and Szegedy 2015) on the binary classification tasks for Σ_Real ∈ [0.1, 0.5, 1.0] and I_add ∈ [0, 5, 10, 15, 30, 60, 90, 120, 150]... (Sketches of the spiral and CNN setups follow the table.)
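
The 'Research Type' row centers on the intrinsic dimensionality of data manifolds. As a point of reference, here is a minimal sketch of one standard estimator, TwoNN (Facco et al. 2017); the report does not quote which estimator the authors used, so the choice of TwoNN is an assumption.

```python
# Minimal TwoNN intrinsic-dimension estimator (Facco et al. 2017).
# Assumption: the paper's own estimator is not quoted in this report,
# so TwoNN is used purely as an illustration.
import numpy as np
from sklearn.neighbors import NearestNeighbors

def twonn_dimension(X):
    """Maximum-likelihood intrinsic-dimension estimate from the ratio of
    each point's second- to first-nearest-neighbor distance."""
    # n_neighbors=3 because column 0 is each point's distance to itself
    dist, _ = NearestNeighbors(n_neighbors=3).fit(X).kneighbors(X)
    mu = dist[:, 2] / dist[:, 1]  # assumes no duplicate points (dist > 0)
    return len(X) / np.sum(np.log(mu))

# Sanity check: data sampled from a 5-D Gaussian should come out near 5.
print(twonn_dimension(np.random.randn(2000, 5)))
```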
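
All four benchmarks named in the 'Open Datasets' row are publicly downloadable. A minimal loading sketch, assuming torchvision; the paper does not state its data pipeline, so the library choice and the `./data` root are assumptions.

```python
# Hypothetical data-loading setup; the paper does not specify its pipeline.
import torchvision
import torchvision.transforms as T

root, tf = "./data", T.ToTensor()  # assumed download location and transform

mnist   = torchvision.datasets.MNIST(root, train=True, download=True, transform=tf)
fashion = torchvision.datasets.FashionMNIST(root, train=True, download=True, transform=tf)
svhn    = torchvision.datasets.SVHN(root, split="train", download=True, transform=tf)
cifar10 = torchvision.datasets.CIFAR10(root, train=True, download=True, transform=tf)
```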
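
Because the 'Dataset Splits' row finds no validation split reported, the following only illustrates a conventional three-way split; the 80/10/10 proportions and the fixed seed are assumptions, not values from the paper.

```python
# Illustrative train/validation/test split; percentages are NOT from the paper.
import torch
from torch.utils.data import random_split

n = len(mnist)  # reusing the MNIST object from the loading sketch above
n_train, n_val = int(0.8 * n), int(0.1 * n)
train_set, val_set, test_set = random_split(
    mnist, [n_train, n_val, n - n_train - n_val],
    generator=torch.Generator().manual_seed(0),  # fixed seed for repeatability
)
```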
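
The first quoted experiment trains a fully-connected ReLU network on spiral datasets. The sketch below is a rough stand-in under stated assumptions: the spiral generator, the `offset` knob (loosely playing the role of the entanglement parameter Σ_Arch), the random linear embedding into E ambient dimensions, and the architecture are all illustrative rather than the authors' code.

```python
# Illustrative spiral-classification setup; generator, architecture and the
# entanglement knob `offset` are assumptions, not the authors' procedure.
import numpy as np
import torch
import torch.nn as nn

def make_spirals(n_per_class, offset=1.5, extrinsic_dim=2, seed=0):
    """Two interleaved 2-D spirals; a smaller `offset` makes the two class
    manifolds more entangled. The 2-D manifold is optionally embedded into
    a higher ambient (extrinsic) dimension by a random linear map."""
    rng = np.random.default_rng(seed)
    t = rng.uniform(0.25, 3.0, size=n_per_class) * np.pi
    x0 = np.stack([t * np.cos(t), t * np.sin(t)], axis=1)
    x1 = np.stack([(t + offset) * np.cos(t), (t + offset) * np.sin(t)], axis=1)
    X = np.concatenate([x0, x1]).astype(np.float32)
    y = np.concatenate([np.zeros(n_per_class), np.ones(n_per_class)]).astype(np.int64)
    if extrinsic_dim > 2:
        proj = rng.standard_normal((2, extrinsic_dim)).astype(np.float32)
        X = X @ proj
    return torch.from_numpy(X), torch.from_numpy(y)

X, y = make_spirals(500, offset=1.5, extrinsic_dim=12)
model = nn.Sequential(nn.Linear(X.shape[1], 64), nn.ReLU(),
                      nn.Linear(64, 64), nn.ReLU(),
                      nn.Linear(64, 2))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)  # Adam, as quoted above
loss_fn = nn.CrossEntropyLoss()
for _ in range(200):  # full-batch training, kept short for brevity
    opt.zero_grad()
    loss_fn(model(X), y).backward()
    opt.step()
```

Measuring the sample complexity ς would then amount to repeating this training at increasing sample sizes until held-out accuracy crosses a chosen threshold; the paper's exact criterion is not quoted in this report.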
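
For the second quoted setup, a minimal convolutional network with batch normalization (Ioffe and Szegedy 2015) for a binary task; layer counts and widths are assumptions, since the report does not quote the architecture.

```python
# Illustrative CNN with batch normalization for binary classification;
# the exact architecture is not quoted in this report.
import torch.nn as nn

cnn = nn.Sequential(
    nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.BatchNorm2d(32), nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.BatchNorm2d(64), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(64, 2),  # two outputs for the binary classification tasks
)
```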