Optimizing persistent homology based functions

Authors: Mathieu Carrière, Frédéric Chazal, Marc Glisse, Yuichi Ike, Hariprasad Kannan, Yuhei Umeda

ICML 2021

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "...as well as some experiments showcasing the versatility of our approach."
Researcher Affiliation | Collaboration | "¹Université Côte d'Azur, Inria, France; ²Université Paris-Saclay, CNRS, Inria, Laboratoire de Mathématiques d'Orsay, France; ³Fujitsu Ltd., Kanagawa, Japan."
Pseudocode | Yes | "Algorithm 1: Persistence pairs computation (sketch)"
Open Source Code | Yes | "It is publicly available at https://github.com/MathieuCarriere/difftda."
Open Datasets | Yes | "We classify images from the MNIST data set. Scores do not have standard deviations since we use the train/test splits of the mnist.load_data function in TensorFlow 2."
Dataset Splits | Yes | "Scores do not have standard deviations since we use the train/test splits of the mnist.load_data function in TensorFlow 2."
Hardware Specification | No | The paper does not provide specific hardware details (e.g., GPU/CPU models, processor types, or memory amounts) used for running its experiments.
Software Dependencies | No | The paper mentions Gudhi and TensorFlow 2 but does not provide version numbers for these software dependencies, which are necessary for full reproducibility.
Experiment Setup | Yes | "In this experiment, we start with a point cloud X sampled uniformly from the unit square S = [0,1]², and then optimize the point coordinates so that the loss L(X) = P(X) + T(X) is minimized." [...] "We then train an autoencoder made of four fully-connected layers with 32 neurons and ReLU activations, using the first five persistence landscapes with resolution 100."
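
As a rough illustration of the point-cloud experiment quoted above, the following NumPy sketch minimizes a loss of the form L(X) = P(X) + T(X) by plain gradient descent. Both terms are stand-ins and not the paper's definitions: the persistence term P(X) is replaced by a simple pairwise-spread surrogate with a closed-form gradient, and T(X) is assumed to be a quadratic penalty for leaving the unit square. The authors' actual implementation differentiates through the persistence computation itself (Gudhi with TensorFlow 2).

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
X = rng.uniform(0.0, 1.0, size=(n, 2))  # point cloud sampled from S = [0, 1]^2

def grad_P(X):
    # Surrogate (NOT the paper's persistence term): spread points apart by
    # minimizing P(X) = -(1/n^2) * sum_{i,j} ||x_i - x_j||^2.
    # Closed-form gradient: -(4/n^2) * (n * x_i - sum_j x_j).
    return -(4.0 / n**2) * (n * X - X.sum(axis=0))

def grad_T(X):
    # Assumed containment term: T(X) = lam * sum_i dist(x_i, [0,1]^2)^2,
    # which pulls any point that leaves the unit square back inside.
    lam = 10.0
    return 2.0 * lam * (X - np.clip(X, 0.0, 1.0))

lr = 0.05
for _ in range(200):  # gradient descent on L(X) = P(X) + T(X)
    X = X - lr * (grad_P(X) + grad_T(X))
```

In the authors' pipeline, grad_P would instead be obtained by back-propagating through the persistence pairs produced by Algorithm 1, which is what makes the persistence-based loss differentiable in the first place.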