Log-Hilbert-Schmidt metric between positive definite operators on Hilbert spaces

Authors: Minh Ha Quang, Marco San Biagio, Vittorio Murino

NeurIPS 2014

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "Empirically, we apply our formulation to the task of multi-category image classification, where each image is represented by an infinite-dimensional RKHS covariance operator. On several challenging datasets, our method significantly outperforms approaches based on covariance matrices computed directly on the original input features, including those using the Log-Euclidean metric, Stein and Jeffreys divergences, achieving new state of the art results." (A finite-dimensional Log-Euclidean sketch follows the table.)
Researcher Affiliation | Academia | "Hà Quang Minh, Marco San Biagio, Vittorio Murino. Istituto Italiano di Tecnologia, Via Morego 30, Genova 16163, ITALY. {minh.haquang,marco.sanbiagio,vittorio.murino}@iit.it"
Pseudocode | No | The paper does not contain any structured pseudocode or algorithm blocks.
Open Source Code | No | The paper does not provide an explicit statement about the release of source code for the methodology or a link to a code repository.
Open Datasets | Yes | Kylberg texture dataset [13], KTH-TIPS2b dataset [6], Fish Recognition dataset [5]
Dataset Splits | Yes | "For all experiments, the kernel parameters were chosen by cross validation, while the regularization parameters were fixed to be γ = µ = 10⁻⁸. We randomly selected 5 images in each class for training and used the remaining ones as test data, repeating the entire procedure 10 times." (The split protocol is sketched after the table.)
Hardware Specification | No | The paper does not provide specific hardware details (e.g., CPU/GPU models, memory) used for running the experiments.
Software Dependencies | No | The paper mentions using LIBSVM [7] but does not provide specific version numbers for any software dependencies.
Experiment Setup | Yes | "For all experiments, the kernel parameters were chosen by cross validation, while the regularization parameters were fixed to be γ = µ = 10⁻⁸." (A parameter-selection sketch follows the table.)
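For context on the method description quoted in the Research Type row, the snippet below is a minimal sketch of the finite-dimensional Log-Euclidean distance between regularized covariance matrices, i.e. the baseline metric the paper compares against; the paper's Log-HS metric generalizes this construction to infinite-dimensional RKHS covariance operators and is not reproduced here. The feature matrices are synthetic placeholders; only the regularizer value 10⁻⁸ is taken from the quoted setup.

```python
import numpy as np

def spd_log(mat):
    # Matrix logarithm of a symmetric positive definite matrix
    # computed from its eigendecomposition: V diag(log(w)) V^T.
    eigvals, eigvecs = np.linalg.eigh(mat)
    return (eigvecs * np.log(eigvals)) @ eigvecs.T

def log_euclidean_distance(cov_a, cov_b, gamma=1e-8):
    # ||log(A + gamma*I) - log(B + gamma*I)||_F between two regularized
    # covariance matrices (finite-dimensional Log-Euclidean baseline).
    eye = np.eye(cov_a.shape[0])
    diff = spd_log(cov_a + gamma * eye) - spd_log(cov_b + gamma * eye)
    return np.linalg.norm(diff, "fro")

# Toy usage with covariance matrices of two random feature sets.
rng = np.random.default_rng(0)
feats_a = rng.normal(size=(200, 5))        # 200 feature vectors, 5 dims
feats_b = rng.normal(size=(200, 5)) + 0.3
print(log_euclidean_distance(np.cov(feats_a.T), np.cov(feats_b.T)))
```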
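The split protocol quoted in the Dataset Splits row (5 randomly selected training images per class, remaining images as test data, 10 repetitions) can be sketched as follows. The class counts, seeds, and the classifier step are hypothetical placeholders, not values from the paper.

```python
import numpy as np

def split_per_class(labels, n_train=5, seed=0):
    # Randomly pick n_train images per class for training; the rest are test.
    rng = np.random.default_rng(seed)
    labels = np.asarray(labels)
    train = []
    for cls in np.unique(labels):
        idx = np.flatnonzero(labels == cls)
        train.extend(rng.choice(idx, size=n_train, replace=False))
    train = np.sort(np.array(train))
    test = np.setdiff1d(np.arange(labels.size), train)
    return train, test

# Hypothetical label vector: 4 classes with 20 images each.
labels = np.repeat(np.arange(4), 20)
for trial in range(10):                      # 10 random repetitions
    train_idx, test_idx = split_per_class(labels, n_train=5, seed=trial)
    # ... fit the classifier on train_idx, evaluate on test_idx,
    # and accumulate the per-trial accuracy ...
```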
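The Experiment Setup row states that the regularization parameters are held fixed while the kernel parameters are chosen by cross validation. A hedged sketch of that selection step is below, using scikit-learn's SVC (which wraps LIBSVM, the solver the paper cites) as a stand-in; the descriptors, parameter grid, and number of folds are illustrative assumptions, not the paper's values.

```python
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

# The metric's regularization parameters are held fixed (γ = µ = 10⁻⁸ in the
# quoted setup); only the kernel parameters are cross-validated.
METRIC_REG = 1e-8  # not tuned

# Hypothetical descriptors: one vectorized log-covariance per training image.
rng = np.random.default_rng(0)
X = rng.normal(size=(80, 25))
y = np.repeat(np.arange(4), 20)

# Note: SVC's "gamma" is the RBF kernel width, unrelated to METRIC_REG above.
param_grid = {"gamma": np.logspace(-4, 1, 6), "C": [1.0, 10.0, 100.0]}
search = GridSearchCV(SVC(kernel="rbf"), param_grid, cv=5)
search.fit(X, y)
print("kernel parameters chosen by cross validation:", search.best_params_)
```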