RANSAC versus CS-RANSAC

Authors: Geun-Sik Jo, Kee-Sung Lee, Devy Chandra, Chol-Hee Jang, Myung-Hyun Ga

AAAI 2015

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | The experimental results show that the proposed CS-RANSAC algorithm can outperform most of the variants of RANSAC without sacrificing its execution time. Images from the UKBench dataset (Nister and Stewenius 2006) are used for evaluation.
Researcher Affiliation | Academia | Department of Computer & Information Engineering, INHA University, Incheon, Korea; gsjo@inha.ac.kr, lee.ks@outlook.kr, wiy_ch4n@hotmail.com, orange@eslab.inha.ac.kr, gagaman7777@eslab.inha.ac.kr
Pseudocode | Yes | Algorithm 1: CS-RANSAC.
Open Source Code | No | The paper does not provide any explicit statement or link indicating that the source code for the described methodology is open source or publicly available.
Open Datasets | Yes | Images from the UKBench dataset (Nister and Stewenius 2006) are used for evaluation.
Dataset Splits | No | The paper uses images from the UKBench dataset for evaluation but does not specify training, validation, or test splits or a cross-validation setup.
Hardware Specification | Yes | Processing time is the time required by each algorithm to generate a homography matrix for each image pair, measured on an Intel Core i5 CPU at 1.80 GHz.
Software Dependencies | No | The paper mentions using SURF features and calculating homography matrices but does not provide specific software dependencies with version numbers (e.g., Python, OpenCV, specific libraries).
Experiment Setup | Yes | Input: extracted features from the input image, the maximum number of sampling iterations, and the error threshold. The maximum number of sampling iterations is set to 100 to prevent the algorithm from looping forever, and the probability of all sampled features being inliers is set so as to obtain high accuracy. These results therefore suggest four as the optimal number of samples and 17 × 17 as the optimal grid size, as shown in Table 2.
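
To give a concrete feel for the setup quoted above, the sketch below shows a minimal RANSAC-style homography loop with a grid-based sampling check in the spirit of CS-RANSAC: matched features are binned into a 17 × 17 grid and a candidate 4-point sample is rejected if its points do not spread over enough distinct cells. In standard RANSAC the iteration count is often derived from N = log(1 − p) / log(1 − w^s), with p the desired confidence, w the inlier ratio, and s the sample size; the quoted setup instead caps it at 100. All function names, the specific degeneracy test, and parameter values (max_iters=100, error_threshold, grid_size=17) are illustrative assumptions based on the quoted setup, not the authors' exact algorithm.

```python
# Minimal sketch of a RANSAC-style homography estimator with a grid-based
# sampling constraint (CS-RANSAC-inspired). Illustrative only; the constraint
# test and parameter defaults are assumptions, not the paper's formulation.
import numpy as np

def homography_dlt(src, dst):
    """Estimate a 3x3 homography from 4+ point pairs via the DLT."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]

def reprojection_error(H, src, dst):
    """Forward reprojection error of each correspondence under H."""
    pts = np.hstack([src, np.ones((len(src), 1))])
    proj = pts @ H.T
    proj = proj[:, :2] / proj[:, 2:3]
    return np.linalg.norm(proj - dst, axis=1)

def grid_cells(points, image_size, grid_size=17):
    """Map each point to a cell index of a grid_size x grid_size grid."""
    w, h = image_size
    cols = np.clip((points[:, 0] / w * grid_size).astype(int), 0, grid_size - 1)
    rows = np.clip((points[:, 1] / h * grid_size).astype(int), 0, grid_size - 1)
    return rows * grid_size + cols

def cs_ransac_homography(src, dst, image_size, max_iters=100,
                         error_threshold=3.0, grid_size=17, rng=None):
    """RANSAC homography with a simple spatial constraint on each sample."""
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    rng = rng or np.random.default_rng()
    cells = grid_cells(src, image_size, grid_size)
    best_H, best_inliers = None, np.zeros(len(src), dtype=bool)
    for _ in range(max_iters):
        idx = rng.choice(len(src), size=4, replace=False)
        # Constraint check: skip degenerate samples whose four points fall
        # into fewer than four distinct grid cells.
        if len(set(cells[idx])) < 4:
            continue
        H = homography_dlt(src[idx], dst[idx])
        inliers = reprojection_error(H, src, dst) < error_threshold
        if inliers.sum() > best_inliers.sum():
            best_H, best_inliers = H, inliers
    return best_H, best_inliers  # best_H is None if no valid sample was found
```

With matched keypoints (e.g., SURF matches between an image pair), src and dst would be the N × 2 arrays of corresponding coordinates in the two images, and image_size the width and height of the source image used to define the grid.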