Graph Geometry-Preserving Autoencoders
Authors: Jungbin Lim, Jihwan Kim, Yonghyeon Lee, Cheongjae Jang, Frank C. Park
ICML 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Through extensive experiments, we show that our method outperforms existing state-of-the-art geometry-preserving and graph-based autoencoders with respect to learning accurate latent structures that preserve the graph geometry, and is particularly effective in learning dynamics in the latent space. |
| Researcher Affiliation | Academia | (1) Department of Mechanical Engineering, Seoul National University, Seoul, Republic of Korea; (2) Center for AI and Natural Sciences, Korea Institute for Advanced Study, Seoul, Republic of Korea; (3) The AI Institute, Hanyang University, Seoul, Republic of Korea. |
| Pseudocode | Yes | In this section, we provide Python-style pseudocode for Batch-Kernel and Laplacian-Slicing. Batch-Kernel chooses a submatrix of the kernel matrix to compute the batch Laplacian, whereas Laplacian-Slicing computes the Laplacian first and then chooses a submatrix of the Laplacian matrix. The functions Equation7, Equation8, and Equation18 represent the corresponding equations from the main text. Listing 1: pseudocode for Batch-Kernel. Listing 2: pseudocode for Laplacian-Slicing. (A hedged sketch of the two batching strategies follows the table.) |
| Open Source Code | Yes | Code is available at https://github.com/JungbinLim/GGAE-public. |
| Open Datasets | Yes | The Swiss Roll dataset consists of randomly sampled points on a two-dimensional, spiral-shaped manifold in R3, as shown in Figure 2. The dSprites dataset (Matthey et al., 2017) consists of 2D synthetic images generated from six ground truth factors of variation: color, shape, rotation, scale, and the x and y positions of a 2D shape. Our Rotating MNIST dataset consists of 36-frame videos of the handwritten digit "3", with each frame created by rotating an image of "3" by 10° per step. |
| Dataset Splits | No | The paper does not explicitly state training, validation, and test splits with percentages or absolute counts. It mentions a 'mini-batch B ⊂ X of b := |B| ≪ N data points' in the context of approximation but not formal dataset splits. |
| Hardware Specification | Yes | The experiments utilized an i9-10900K CPU and a GeForce RTX 4090 GPU. For the computation of shortest path distances on the graph, we employ the Dijkstra algorithm, implemented in the scipy Python library. The time complexity of Dijkstra's algorithm is O(VE + V² log V), where V and E represent the number of nodes and edges in the graph, respectively. As an example, for the Swiss Roll dataset with 10k data points and a k-nearest neighbor graph constructed with k = 10, resulting in approximately 50k edges, it takes approximately 23.5 seconds to compute the shortest path distances between all nodes using an AMD Ryzen 9 7950X CPU. (A graph-distance sketch follows the table.) |
| Software Dependencies | No | The paper mentions 'scipy Python library' for Dijkstra algorithm but does not provide specific version numbers for Python or other libraries/frameworks like PyTorch or TensorFlow. |
| Experiment Setup | No | The paper does not explicitly provide specific hyperparameter values (e.g., learning rate, number of epochs, optimizer settings) or other detailed system-level training configurations in the main text, though it mentions `alpha > 0 is a weight parameter` for the loss function. |
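The appendix pseudocode quoted in the Pseudocode row contrasts two ways of obtaining a batch Laplacian. Below is a minimal NumPy sketch of that contrast only; since Equations 7, 8, and 18 are not reproduced on this page, a plain unnormalized graph Laplacian (L = D - K) stands in for the paper's construction, and all function names here are illustrative rather than the authors' code.

```python
import numpy as np

# Sketch of the two batching strategies named in the paper's appendix.
# The paper builds its Laplacian from Equations 7, 8, and 18, which are not
# reproduced on this page; an unnormalized graph Laplacian is used as a
# stand-in purely to show where the batch slicing happens.

def laplacian(K):
    """Unnormalized graph Laplacian of a kernel (affinity) matrix K."""
    return np.diag(K.sum(axis=1)) - K

def batch_kernel(K, batch_idx):
    """Batch-Kernel: slice the kernel matrix first, then build the Laplacian."""
    K_b = K[np.ix_(batch_idx, batch_idx)]   # b x b submatrix of the kernel
    return laplacian(K_b)                   # Laplacian of the batch only

def laplacian_slicing(K, batch_idx):
    """Laplacian-Slicing: build the full Laplacian, then slice it."""
    L = laplacian(K)                        # N x N Laplacian
    return L[np.ix_(batch_idx, batch_idx)]  # b x b submatrix of the Laplacian

# The two orderings generally differ, because the degree terms obtained by
# Laplacian-Slicing still see edges to points outside the batch.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
K = np.exp(-np.sum((X[:, None] - X[None]) ** 2, axis=-1))
idx = rng.choice(100, size=16, replace=False)
print(np.allclose(batch_kernel(K, idx), laplacian_slicing(K, idx)))  # False in general
```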
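The Hardware Specification row also describes precomputing all-pairs shortest-path distances on a k-nearest-neighbor graph with scipy's Dijkstra implementation. The sketch below reproduces that step under stated assumptions: the paper only names scipy's Dijkstra and k = 10 for Swiss Roll, so the scikit-learn k-NN graph construction and the reduced point count are illustrative choices, not the authors' exact pipeline.

```python
import numpy as np
from sklearn.datasets import make_swiss_roll
from sklearn.neighbors import kneighbors_graph
from scipy.sparse.csgraph import dijkstra

# Build a k-NN graph over Swiss Roll points and compute all-pairs
# shortest-path (graph geodesic) distances with scipy's Dijkstra.
# The paper's example uses 10k points with k = 10; a smaller sample is
# used here so the sketch runs quickly.
n_points, k = 2000, 10
X, _ = make_swiss_roll(n_samples=n_points, random_state=0)

# Sparse adjacency matrix with Euclidean edge weights between k-NN pairs.
A = kneighbors_graph(X, n_neighbors=k, mode="distance")

# Treat the k-NN graph as undirected and run all-pairs Dijkstra.
D = dijkstra(A, directed=False)             # (n_points, n_points) distances

print(D.shape, np.isfinite(D).all())        # finite iff the graph is connected
```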