Riemannian Neural SDE: Learning Stochastic Representations on Manifolds

Authors: Sung Woo Park, Hyomin Kim, Kyungjae Lee, Junseok Kwon

NeurIPS 2022 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | In this section, we compared the proposed RNSDE to baseline methods for various tasks, including generative modeling, interpolation, and reconstruction. In Tables 1, 2, and 3, we estimated the 2-Wasserstein distance (i.e., W_2) between the empirical target measure ν_t and the model measure µ_t^θ to evaluate the model performance. (A hedged sketch of this W_2 evaluation follows the table.)
Researcher Affiliation | Collaboration | 1,2,4 School of Computer Science and Engineering, Chung-Ang University, Korea; 1,3,4 Artificial Intelligence Graduate School, Chung-Ang University, Korea; 1 LG AI Research
Pseudocode | Yes | Algorithm 1: Learning Representations on Manifolds with the proposed RNSDE. (A hedged training-loop sketch in the spirit of this algorithm follows the table.)
Open Source Code | No | 3. If you ran experiments... (a) Did you include the code, data, and instructions needed to reproduce the main experimental results (either in the supplemental material or as a URL)? [No]
Open Datasets | Yes | As the third application, we modeled the density of a volcano eruption dataset (19), which has been explored in previous studies (10; 17; 24). NOAA. Global significant volcano database. 'https://data.nodc.noaa.gov/cgi-bin/iso?id=gov.noaa.ngdc.mgg.hazards:G10147'.
Dataset Splits | No | The paper's checklist states that training details, including data splits, were specified, but the main experimental text (Section 6) does not give percentages or counts for train/validation/test splits for any of the datasets (synthetic shapes, volcano, vessel route). For generative modeling, it describes the target densities but not the data splits used for training and validation.
Hardware Specification | No | The paper's checklist indicates that the type of computing resources was reported, but the main body does not give specific hardware details such as GPU models (e.g., 'NVIDIA A100'), CPU models, or cloud instance specifications.
Software Dependencies | No | The paper does not list software dependencies with version numbers (e.g., 'PyTorch 1.9' or 'Python 3.8') needed to replicate the experiments.
Experiment Setup | Yes | The paper specifies training details such as: 'Every method takes the initial states of the stochastic trajectories as the mean-zero standard Gaussian in the local coordinate, X_0^θ ∼ µ_0 := ψ_# N(0, I_n)', 'To train EMSRE, the number of transformation modules and radial components were set to N_T = 24 and K = 5, respectively.', and 'The total interpolation time (T) was set to 0.1, and 32 intermediate samples were taken in the sequence.' Algorithm 1 also lists 'learning rate κ, time interval t, stopping time threshold c1'. (These quoted values are collected into a configuration sketch after the table.)
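
The W_2 evaluation quoted under Research Type can be approximated from samples. Below is a minimal sketch using the POT (Python Optimal Transport) package, assuming uniform empirical measures; it is not the authors' evaluation code, and the sample arrays are hypothetical stand-ins for draws from ν_t and µ_t^θ.

```python
# Minimal sketch: empirical 2-Wasserstein distance between two sample clouds.
# Requires `pip install pot numpy`; not the paper's evaluation script.
import numpy as np
import ot  # POT: Python Optimal Transport


def wasserstein2(x, y):
    """Empirical W_2 between point clouds x of shape (n, d) and y of shape (m, d)."""
    a = np.full(len(x), 1.0 / len(x))            # uniform weights on x
    b = np.full(len(y), 1.0 / len(y))            # uniform weights on y
    cost = ot.dist(x, y, metric="sqeuclidean")   # pairwise squared Euclidean costs
    return np.sqrt(ot.emd2(a, b, cost))          # exact OT cost, then square root


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    target_samples = rng.normal(size=(512, 3))           # hypothetical draws from nu_t
    model_samples = rng.normal(loc=0.1, size=(512, 3))   # hypothetical draws from mu_t^theta
    print(f"W2 ~= {wasserstein2(target_samples, model_samples):.4f}")
```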
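
Algorithm 1 itself is not reproduced in this report. As a rough orientation for readers, the sketch below shows the generic pattern such an algorithm typically follows (simulate trajectories with Euler–Maruyama in local coordinates, then update the drift and diffusion networks by gradient descent); the networks, the unquoted hyperparameters, and the placeholder objective are assumptions rather than the paper's definitions.

```python
# Hedged sketch of a neural-SDE training loop in local coordinates.
# This is NOT the authors' Algorithm 1; drift_net, diff_net and the loss are placeholders.
import torch
import torch.nn as nn

n = 2  # local-coordinate dimension (assumed)
drift_net = nn.Sequential(nn.Linear(n + 1, 64), nn.Tanh(), nn.Linear(64, n))
diff_net = nn.Sequential(nn.Linear(n + 1, 64), nn.Tanh(), nn.Linear(64, n))
params = list(drift_net.parameters()) + list(diff_net.parameters())
optimizer = torch.optim.Adam(params, lr=1e-3)  # learning rate kappa (value assumed)


def simulate(x0, total_time=0.1, steps=32):
    """Euler--Maruyama integration of dX = f_theta dt + g_theta dW in local coordinates."""
    dt = total_time / steps
    x, path = x0, [x0]
    for k in range(steps):
        t = torch.full((x.shape[0], 1), k * dt)
        inp = torch.cat([x, t], dim=-1)
        x = x + drift_net(inp) * dt + diff_net(inp) * torch.randn_like(x) * dt ** 0.5
        path.append(x)
    return torch.stack(path)


for step in range(1000):
    x0 = torch.randn(256, n)             # X_0 ~ N(0, I_n) in the local coordinate
    trajectory = simulate(x0)
    loss = trajectory[-1].pow(2).mean()  # placeholder objective, NOT the paper's loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```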
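
The hyperparameters quoted under Experiment Setup can be collected into a single configuration for a reproduction attempt. The sketch below only transcribes the values named in the paper; the batch size and the chart ψ (an inverse stereographic projection onto S^2) are hypothetical choices for illustration.

```python
# Configuration transcribed from the quoted experiment setup; values marked
# "assumption" are not stated in the quoted text.
import numpy as np

config = {
    "num_transform_modules": 24,    # N_T = 24
    "num_radial_components": 5,     # K = 5
    "total_interp_time": 0.1,       # T = 0.1
    "num_intermediate_samples": 32,
    "batch_size": 256,              # assumption: not stated in the quoted text
}


def psi(z):
    """Hypothetical chart: inverse stereographic projection from R^2 onto the unit sphere S^2."""
    s = np.sum(z ** 2, axis=1, keepdims=True)
    return np.concatenate([2 * z, s - 1], axis=1) / (s + 1)


def sample_initial_states(chart, n, batch_size):
    """X_0^theta ~ mu_0 := psi_# N(0, I_n): push a standard Gaussian through the chart."""
    z = np.random.standard_normal((batch_size, n))
    return chart(z)


x0 = sample_initial_states(psi, n=2, batch_size=config["batch_size"])
print(x0.shape, np.linalg.norm(x0, axis=1)[:3])  # points lie on S^2 (unit norm)
```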