Hybrid-MST: A Hybrid Active Sampling Strategy for Pairwise Preference Aggregation
Authors: Jing Li, Rafal Mantiuk, Junle Wang, Suiyi Ling, Patrick Le Callet
NeurIPS 2018
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | The proposed method has been validated on both simulated and real-world datasets, where it shows higher preference aggregation ability than the state-of-the-art methods. |
| Researcher Affiliation | Collaboration | Jing Li, LS2N/IPI Lab, University of Nantes (jingli.univ@gmail.com); Rafal K. Mantiuk, Computer Laboratory, University of Cambridge (rkm38@cam.ac.uk); Junle Wang, Turing Lab, Tencent Games (wangjunle@gmail.com); Suiyi Ling and Patrick Le Callet, LS2N/IPI Lab, University of Nantes (suiyi.ling, patrick.lecallet@univ-nantes.fr) |
| Pseudocode | Yes | Algorithm 1 Hybrid-MST sampling algorithm |
| Open Source Code | Yes | Source code is publicly available on GitHub: https://github.com/jingnantes/hybrid-mst |
| Open Datasets | Yes | Video Quality Assessment (VQA) dataset: a complete and balanced pairwise dataset from [38]. ... Image Quality Assessment (IQA) dataset: a complete but imbalanced dataset from [26]. |
| Dataset Splits | No | The paper does not provide conventional training/validation/test splits. It evaluates preference aggregation via Monte Carlo simulations and on complete real-world datasets, rather than training a model on partitioned data. |
| Hardware Specification | Yes | All computations are done using MATLAB R2014b on a MacBook Pro laptop, with 2.5GHz Intel Core i5, 8GB memory. |
| Software Dependencies | Yes | All computations are done using MATLAB R2014b on a Mac Book Pro laptop... |
| Experiment Setup | No | The paper describes aspects of the experimental setup, such as the number of simulated objects, noise distribution, and the threshold for switching between GM and MST methods. However, it does not provide specific hyperparameter values like learning rates, batch sizes, or optimizer settings, which are common details in experimental setup descriptions. |
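The table above references Algorithm 1 (the Hybrid-MST sampling algorithm) and a threshold for switching between global-maximum (GM) and maximum-spanning-tree (MST) pair selection. As a rough illustration of the MST step only, the sketch below scores candidate pairs with a simplified stand-in utility (the paper uses expected information gain under a Bradley-Terry-style model; the `pair_utility` function here is a hypothetical placeholder, not the paper's formula) and selects a batch of pairs via Kruskal's algorithm:

```python
# Hypothetical sketch of MST-based batch selection in the spirit of
# Hybrid-MST: score every candidate pair by an informativeness
# utility, then pick the batch as a maximum spanning tree over it.
import itertools
import math


def pair_utility(mu_i, mu_j, n_ij):
    # Stand-in utility: pairs with similar current scores and few
    # prior comparisons are treated as more informative. The real
    # method computes expected information gain instead.
    return math.exp(-abs(mu_i - mu_j)) / (1.0 + n_ij)


def mst_batch(scores, counts):
    """Select a batch of pairs as a maximum spanning tree (Kruskal).

    scores: current quality-score estimate per object.
    counts: dict mapping (i, j) to the number of prior comparisons.
    Returns n - 1 pairs that together span all n objects.
    """
    n = len(scores)
    edges = sorted(
        ((pair_utility(scores[i], scores[j], counts.get((i, j), 0)), i, j)
         for i, j in itertools.combinations(range(n), 2)),
        reverse=True)

    parent = list(range(n))  # union-find forest for cycle detection

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    batch = []
    for _, i, j in edges:  # greedily add highest-utility acyclic edges
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj
            batch.append((i, j))
    return batch
```

In the full method, this MST batch would be used in early trials (when many comparisons are still informative) and the selection would fall back to the single globally best pair (GM) once the paper's switching threshold is reached.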