Isometric Manifold Learning Using Hierarchical Flow

Authors: Ziqi Pan, Jianfu Zhang, Li Niu, Liqing Zhang

AAAI 2023 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable Result LLM Response

Research Type | Experimental
Experimental results justify our theoretical analysis, demonstrate the superiority of our dimensionality reduction algorithm in terms of training-time cost, and verify the effect of the aforementioned properties in improving performance on downstream tasks such as anomaly detection.

Researcher Affiliation | Academia
Ziqi Pan, Jianfu Zhang*, Li Niu*, Liqing Zhang. MoE Key Lab of Artificial Intelligence, Department of Computer Science and Engineering, Shanghai Jiao Tong University, Shanghai, China. {panziqi_ai, c.sis, ustcnewly}@sjtu.edu.cn, zhang-lq@cs.sjtu.edu.cn

Pseudocode | Yes
Algorithm 1: Two-stage Dimensionality Reduction (a hedged sketch of one possible flow-based two-stage procedure follows below).
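
The paper's Algorithm 1 is not reproduced in this report. As a rough, hedged illustration of what a flow-based two-stage dimensionality reduction can look like, the sketch below trains a small RealNVP-style coupling flow by maximum likelihood (stage one) and then keeps a subset of latent coordinates as low-dimensional codes (stage two). Every name in it (the AffineCoupling and TinyFlow classes, the standard-normal prior, the keep-first-d-coordinates reduction) is an illustrative assumption, not the authors' method.

```python
import torch
import torch.nn as nn

class AffineCoupling(nn.Module):
    """RealNVP-style coupling layer on R^2: the first coordinate
    conditions an affine transform of the second coordinate."""
    def __init__(self, hidden=64):
        super().__init__()
        # Maps the conditioning coordinate to (log_scale, shift).
        self.net = nn.Sequential(nn.Linear(1, hidden), nn.Tanh(),
                                 nn.Linear(hidden, 2))

    def forward(self, x):
        x1, x2 = x[:, :1], x[:, 1:]
        log_s, t = self.net(x1).chunk(2, dim=1)
        z2 = x2 * torch.exp(log_s) + t
        # Second return value: log-determinant of this layer's Jacobian.
        return torch.cat([x1, z2], dim=1), log_s.sum(dim=1)

class TinyFlow(nn.Module):
    def __init__(self, n_layers=4):
        super().__init__()
        self.layers = nn.ModuleList(AffineCoupling() for _ in range(n_layers))

    def forward(self, x):
        log_det = torch.zeros(x.shape[0])
        for layer in self.layers:
            x, ld = layer(x)
            log_det = log_det + ld
            x = x.flip(dims=[1])  # swap coordinate roles between layers
        return x, log_det

def train(flow, X, steps=2000, lr=1e-3):
    """Stage one: fit the flow by maximum likelihood under a
    standard-normal prior on the latent space."""
    opt = torch.optim.Adam(flow.parameters(), lr=lr)
    for _ in range(steps):
        z, log_det = flow(X)
        nll = 0.5 * (z ** 2).sum(dim=1) - log_det  # -log p(x) up to a constant
        loss = nll.mean()
        opt.zero_grad()
        loss.backward()
        opt.step()

def reduce_dim(flow, X, d=1):
    """Stage two: map data to latent space and keep d coordinates
    (here simply the first d, an arbitrary illustrative choice)."""
    with torch.no_grad():
        z, _ = flow(X)
    return z[:, :d]

# Hypothetical usage on a 2-D dataset X (e.g., the synthetic curve
# sampled in the sketch further below):
#   flow = TinyFlow()
#   train(flow, torch.as_tensor(X, dtype=torch.float32))
#   codes = reduce_dim(flow, torch.as_tensor(X, dtype=torch.float32), d=1)
```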

Open Source Code | No
The paper does not provide an explicit statement or link to open-source code for the methodology described.

Open Datasets | No
The paper describes using a 'synthetic manifold' for intuitive illustration: 'We use a 1-dimensional manifold M = {(cos θ, sin θ) | θ ≥ π/6} (i.e., a curve) residing in the 2-dimensional Euclidean space. We draw N = 1,000 samples from a Gaussian density N(π/2, 1) that is restricted to M, obtaining a training dataset X = {x^(i) ~ N(π/2, 1) | x^(i) ∈ M, i = 1, …, N} which we use to train different models.' This is a generated dataset for which no public access information is provided. While the paper mentions 'natural image datasets' in the supplementary material, no specific public access details for those are given in the main text. (A sampling sketch for this synthetic dataset follows below.)
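
Assuming the quoted construction means that the angle θ is drawn from N(π/2, 1) and kept only when it lands on the curve (the restriction is read here as θ ≥ π/6, a reconstruction of the garbled inequality in the extraction), the synthetic dataset can be regenerated in a few lines of NumPy. The function name sample_manifold_dataset and the rejection-sampling loop are illustrative choices, not from the paper.

```python
import numpy as np

def sample_manifold_dataset(n=1000, seed=0):
    """Draw n angles from a Gaussian N(pi/2, 1) restricted (by rejection)
    to the assumed constraint theta >= pi/6, then embed each angle on the
    unit circle as (cos theta, sin theta)."""
    rng = np.random.default_rng(seed)
    thetas = []
    while len(thetas) < n:
        t = rng.normal(loc=np.pi / 2, scale=1.0)
        if t >= np.pi / 6:  # assumed reading of the manifold restriction
            thetas.append(t)
    theta = np.array(thetas)
    return np.stack([np.cos(theta), np.sin(theta)], axis=1)

X = sample_manifold_dataset()
print(X.shape)  # (1000, 2)
```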

Dataset Splits | No
The paper mentions obtaining a 'training dataset' but does not specify any training, validation, or test splits (e.g., percentages, counts, or a predefined split citation) in the main text.

Hardware Specification | No
The paper does not explicitly describe the specific hardware (e.g., GPU/CPU models, memory) used to run its experiments.

Software Dependencies | No
The paper does not provide specific software names with version numbers for reproducibility.

Experiment Setup | No
The paper states 'All the implementation details for our experiments can also be found in supplementary', but it does not provide specific experimental setup details, such as hyperparameter values or system-level training settings, within the main text.