Manifold structure in graph embeddings

Authors: Patrick Rubin-Delanchy

NeurIPS 2020

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Figure 1 shows point clouds obtained by adjacency spectral embedding of graphs from three latent position network models on ℝ (with n = 5000, D̂ = 3). For experimental parameters a = 5, b = 2, normal(0, 1), the kernel f(x, y) = 1 - exp(-2xy) (seen earlier), D̂ = 100 and n = 5000, split into 3,000 training and 2,000 test examples, the out-of-sample mean square error (MSE) of four methods is compared: a feedforward neural network (using the default R keras configuration with obvious adjustments for input dimension and loss function; MSE 1.25); the random forest [10] (default configuration of the R package randomForest; MSE 1.11); the Lasso [67] (default R glmnet configuration; MSE 1.58); and least squares (MSE 1.63). (Illustrative R sketches of the embedding and regression steps follow the table.)
Researcher Affiliation | Academia | Patrick Rubin-Delanchy, University of Bristol, patrick.rubin-delanchy@bristol.ac.uk
Pseudocode | No | No pseudocode or algorithm blocks were found in the paper.
Open Source Code | No | The paper does not provide an explicit statement or link to open-source code for the methodology described. It mentions using existing R packages (e.g., igraph, keras, randomForest, glmnet, ks, intrinsicDimension) but does not release its own implementation code.
Open Datasets | Yes | Figure 2: Non-linear dimension reduction of spectral embeddings. a) Graph of computer-to-computer network flow events on the Los Alamos National Laboratory network, from the publicly available dataset [36] ... c) graph of consumer-restaurant ratings ... extracted from the publicly available Yelp dataset.
Dataset Splits | No | The paper mentions 'split into 3,000 training and 2,000 test examples' but does not explicitly state a separate validation split or subset.
Hardware Specification | No | The paper does not provide any specific hardware details (e.g., GPU/CPU models, memory) used for running its experiments.
Software Dependencies | No | The paper mentions several R packages (e.g., keras, randomForest, glmnet, igraph, ks, intrinsicDimension) but does not specify their version numbers. (A sketch for recording package versions follows the table.)
Experiment Setup | Yes | For experimental parameters a = 5, b = 2, normal(0, 1), the kernel f(x, y) = 1 - exp(-2xy) (seen earlier), D̂ = 100 and n = 5000, split into 3,000 training and 2,000 test examples, the out-of-sample mean square error (MSE) of four methods is compared: a feedforward neural network (using the default R keras configuration with obvious adjustments for input dimension and loss function; MSE 1.25); the random forest [10] (default configuration of the R package randomForest; MSE 1.11); the Lasso [67] (default R glmnet configuration; MSE 1.58); and least squares (MSE 1.63).
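
The setup above has two stages: embedding a simulated graph, then regressing a response on the embedding. The following is a minimal R sketch of the first stage, not the author's code: it simulates a latent position graph with kernel f(x, y) = 1 - exp(-2xy) and computes its adjacency spectral embedding into D̂ = 3 dimensions. The uniform latent positions and the full eigendecomposition are assumptions; the paper's exact simulation settings are not reproduced here.

## Simulate a latent position graph and embed it (reduce n for a quick run).
set.seed(1)
n     <- 5000
D_hat <- 3

x <- runif(n)                          # assumed latent positions on [0, 1]
P <- 1 - exp(-2 * outer(x, x))         # edge probabilities f(x_i, x_j)

A <- matrix(rbinom(n * n, 1, P), n, n) # sample edges
A[lower.tri(A)] <- t(A)[lower.tri(A)]  # symmetrise (undirected graph)
diag(A) <- 0                           # no self-loops

## Adjacency spectral embedding: eigenvectors of the D_hat largest-magnitude
## eigenvalues, scaled by the square roots of their absolute values.
eig   <- eigen(A, symmetric = TRUE)
idx   <- order(abs(eig$values), decreasing = TRUE)[1:D_hat]
X_hat <- eig$vectors[, idx] %*% diag(sqrt(abs(eig$values[idx])))
## X_hat is the n x D_hat point cloud of the kind shown in Figure 1.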
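
The second stage compares out-of-sample MSE on a 3,000/2,000 train-test split of a 100-dimensional embedding. Below is a minimal sketch using randomForest, glmnet (lasso) and least squares; the keras network is omitted, the placeholder embedding and response are purely illustrative (the excerpt does not give the response-generating mechanism), and choosing the lasso penalty by cross-validation is an assumption on top of "default glmnet configuration".

library(randomForest)
library(glmnet)

set.seed(1)
n <- 5000; D_hat <- 100

## Hypothetical stand-ins for the paper's embedding X_hat and response y.
X_hat <- matrix(rnorm(n * D_hat), n, D_hat,
                dimnames = list(NULL, paste0("dim", 1:D_hat)))
y     <- X_hat[, 1]^2 + rnorm(n)

train <- sample(n, 3000)                        # 3,000 training, 2,000 test
mse   <- function(pred) mean((y[-train] - pred)^2)

## Random forest, default randomForest settings.
rf     <- randomForest(X_hat[train, ], y[train])
mse_rf <- mse(predict(rf, X_hat[-train, ]))

## Lasso (glmnet defaults to alpha = 1); penalty chosen by cross-validation.
cv        <- cv.glmnet(X_hat[train, ], y[train])
mse_lasso <- mse(predict(cv, X_hat[-train, ], s = "lambda.min"))

## Least squares.
df_train <- data.frame(y = y[train], X_hat[train, ])
ols      <- lm(y ~ ., data = df_train)
mse_ols  <- mse(predict(ols, data.frame(X_hat[-train, ])))

c(random_forest = mse_rf, lasso = mse_lasso, least_squares = mse_ols)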
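
Since the report flags missing package versions, a small sketch for recording them in R (package names taken from the report, not from any released code):

pkgs <- c("keras", "randomForest", "glmnet", "igraph", "ks", "intrinsicDimension")
sapply(pkgs, function(p) as.character(packageVersion(p)))  # versions of installed packages
## sessionInfo() captures a fuller snapshot (R version, platform, loaded packages).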