Space-Time Local Embeddings

Authors: Ke Sun, Jun Wang, Alexandros Kalousis, Stephane Marchand-Maillet

NeurIPS 2015 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Empirical results on nonmetric datasets show that more information can be preserved in space-time.
Researcher Affiliation | Collaboration | (1) Viper Group, Computer Vision and Multimedia Laboratory, University of Geneva (sunk.edu@gmail.com, Stephane.Marchand-Maillet@unige.ch); (2) Expedia, Switzerland (jwang1@expedia.com); (3) Business Informatics Department, University of Applied Sciences, Western Switzerland (Alexandros.Kalousis@hesge.ch)
Pseudocode | No | The paper does not contain any structured pseudocode or algorithm blocks.
Open Source Code | No | The paper does not provide any explicit statement about the release of source code or a link to a code repository for the described methodology.
Open Datasets | Yes | NIPS22 contains a 4197 × 3624 author-document matrix from NIPS 1988 to 2009 [2]. GrQc is an arXiv co-authorship graph [16]. W5000 gives the semantic similarities among 5000 English words, arranged in a 5000 × 5000 matrix [2, 17].
Dataset Splits | No | The paper does not specify distinct training, validation, or test dataset splits. The evaluation is primarily based on KL divergence on the input similarity matrix, which represents the entire dataset.
Hardware Specification | No | The paper does not provide specific details about the hardware (e.g., CPU, GPU models, memory) used for running the experiments.
Software Dependencies | No | The paper does not specify any software dependencies with version numbers.
Experiment Setup | Yes | During gradient descent, the space coordinates {y_i^s} are updated by the delta-bar-delta scheme used in t-SNE [13], where each scalar parameter has its own adaptive learning rate initialized to γ_s > 0; the time coordinates {y_i^t} are updated with one global adaptive learning rate initialized to γ_t > 0. The learning of time should be more cautious, because p_ij(Y) is more sensitive to time variations by eq. (7); therefore the ratio γ_t/γ_s should be very small, e.g. 1/100. The minimal KL that we have achieved within 5000 epochs is shown.
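
The experiment-setup row above describes the optimizer only in prose. The sketch below shows what one such update step could look like, assuming the t-SNE-style delta-bar-delta rule (per-parameter gains that grow while the gradient keeps its sign and shrink when it flips). The function name, momentum value, gain constants (0.2, 0.8, min_gain), and the default γ_s = 100, γ_t = 1 (keeping γ_t/γ_s ≈ 1/100) are illustrative assumptions, not details taken from the paper, which does not provide code.

```python
import numpy as np

def delta_bar_delta_step(Y_s, Y_t, grad_s, grad_t, state=None,
                         gamma_s=100.0, gamma_t=1.0,
                         momentum=0.5, min_gain=0.01):
    """One hedged gradient-descent step for a space-time embedding.

    Space coordinates Y_s get per-parameter delta-bar-delta gains
    (as in common t-SNE implementations); time coordinates Y_t share
    a single, much smaller global step size gamma_t.
    """
    if state is None:
        state = {
            "gains_s": np.ones_like(Y_s),  # per-parameter gains for space
            "step_s": np.zeros_like(Y_s),  # previous space update
            "step_t": np.zeros_like(Y_t),  # previous time update
        }

    # Delta-bar-delta heuristic: raise a gain when the gradient sign
    # disagrees with the previous update direction (consistent descent),
    # shrink it when they agree (oscillation).
    agree = np.sign(grad_s) == np.sign(state["step_s"])
    state["gains_s"] = np.where(agree,
                                state["gains_s"] * 0.8,
                                state["gains_s"] + 0.2)
    state["gains_s"] = np.maximum(state["gains_s"], min_gain)

    # Space update: individual adaptive rates initialized around gamma_s.
    state["step_s"] = (momentum * state["step_s"]
                       - gamma_s * state["gains_s"] * grad_s)
    Y_s = Y_s + state["step_s"]

    # Time update: one global, deliberately small rate gamma_t, because
    # p_ij(Y) is more sensitive to time variations (eq. (7) in the paper).
    state["step_t"] = momentum * state["step_t"] - gamma_t * grad_t
    Y_t = Y_t + state["step_t"]

    return Y_s, Y_t, state
```

In use, Y_s would hold the spatial coordinates (e.g., shape (n, 2)) and Y_t the time coordinates (shape (n, 1)), with grad_s and grad_t the gradients of the KL objective with respect to each; the step would be called once per epoch, keeping the returned state between calls.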