Optimal Query Complexities for Dynamic Trace Estimation

Authors: David Woodruff, Fred Zhang, Richard Zhang

NeurIPS 2022

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "We experimentally validate our algorithmic results. We compare Algorithm 1 with the following procedures on both synthetic and real datasets."
Researcher Affiliation | Collaboration | David P. Woodruff (Carnegie Mellon University, dwoodruf@cs.cmu.edu); Fred Zhang (UC Berkeley, z0@berkeley.edu); Qiuyi (Richard) Zhang (Google Brain, qiuyiz@google.com)
Pseudocode | Yes | Algorithm 1: Improved Dynamic Trace Estimation; Algorithm 2: SUMTREE, a helper function for traversing the binary tree
Open Source Code | No | The paper's checklist states that code is included in the supplemental material or via a URL, but the main text provides no direct link or repository for the methodology it describes.
Open Datasets | Yes | "We use two arXiv collaboration networks with 5,242 and 9,877 nodes [17]. ... Both are available at https://sparse.tamu.edu/SNAP. ... We train the network on the MNIST dataset via mini-batch SGD"
Dataset Splits | No | The paper names the datasets used (SNAP collaboration networks, MNIST) and describes training, but gives no explicit training/validation/test split details.
Hardware Specification | No | The paper's checklist states that the total amount of compute and the types of resources used are reported, but the main text specifies no concrete hardware details such as GPU/CPU models or processor types.
Software Dependencies | No | The paper does not provide specific software dependencies or version numbers for the libraries, frameworks, or programming languages used in the experiments.
Experiment Setup | No | The paper's checklist states that training details and hyperparameters are specified, but these details do not appear in the main text; they are presumably in an appendix or the supplementary material.
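For context on what the paper's Algorithm 1 improves upon: dynamic trace estimation builds on classical stochastic trace estimation, where tr(A) is approximated from matrix-vector products alone. The sketch below is a minimal static Hutchinson-style estimator — the standard baseline, not the paper's algorithm — with the function name and parameters chosen here for illustration.

```python
import numpy as np

def hutchinson_trace(matvec, n, num_queries=100, seed=None):
    """Estimate tr(A) using only matrix-vector products with A.

    Hutchinson's estimator: for a Rademacher vector g (entries +/-1
    uniformly at random), E[g^T A g] = tr(A), so averaging the
    quadratic form over num_queries samples gives an unbiased estimate.
    """
    rng = np.random.default_rng(seed)
    total = 0.0
    for _ in range(num_queries):
        g = rng.choice([-1.0, 1.0], size=n)  # Rademacher query vector
        total += g @ matvec(g)               # one matrix-vector query
    return total / num_queries
```

Note that for a diagonal A the estimate is exact regardless of the number of queries, since g_i**2 = 1 makes every sample equal to the trace; the variance comes entirely from off-diagonal entries.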