Recursive Best-First Search with Bounded Overhead

Authors: Matthew Hatem, Scott Kiesel, Wheeler Ruml

AAAI 2015 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "We show empirically that this improves its performance in several domains, both for optimal and suboptimal search, and also yields a better linear-space anytime heuristic search. RBFSCR is the first linear space best-first search robust enough to solve a variety of domains with varying operator costs." From the Experiments section: "We compared the new algorithms to standard RBFS and WIDA* (and WIDA*CR) on a variety of domains with different edge costs."
Researcher Affiliation | Academia | Matthew Hatem, Scott Kiesel, and Wheeler Ruml; Department of Computer Science, University of New Hampshire, Durham, NH 03824 USA; mhatem, skiesel, and ruml at cs.unh.edu
Pseudocode | Yes | "Figure 1: Pseudo-code for RBFS." "Figure 2: Pseudo-code for RBFSCR." (An illustrative Java sketch of RBFS follows the table.)
Open Source Code | No | The paper does not provide any explicit statements about open-sourcing its code or links to a code repository.
Open Datasets | Yes | "First, we evaluated RBFSϵ, RBFSkthrt and RBFSCR on Korf's 100 15-puzzles (Korf 1985) using unit edge costs and the Manhattan distance heuristic." (A sketch of the Manhattan distance heuristic follows the table.)
Dataset Splits | No | The paper describes using benchmark problem instances such as Korf's 100 15-puzzles, but does not describe any train/validation/test dataset splits.
Hardware Specification | Yes | "All of our experiments were run on a machine with a dual-core Core Duo 3.16 GHz processor and 8 GB of RAM running Linux."
Software Dependencies | Yes | "All algorithms were written in Java and compiled with Open JDK 1.6.0.24."
Experiment Setup | Yes | "RBFSϵ with ϵ = 16 and RBFSkthrt with k = 5 are 1.7 times faster than RBFSCR on average because they have less overhead and also return cheaper solutions than IDA*CR." "An initial suboptimality bound of 4.0 was selected to minimize time to first solution for both algorithms." "We used an initial bound of 1.2." (A brief illustration of applying a suboptimality bound follows the table.)
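
Since the paper's contribution centers on RBFS and its bounded-overhead variant RBFSCR, a minimal Java sketch of classic RBFS (Korf 1993) is given below for orientation. It is an illustration only, not the paper's pseudocode: it omits the threshold-adjustment machinery that distinguishes RBFSCR, and the Domain interface with its edgeCost and h methods is a hypothetical placeholder for a concrete search domain such as the 15-puzzle.

```java
import java.util.List;

/**
 * Minimal sketch of classic Recursive Best-First Search (Korf 1993).
 * Illustration only: RBFSCR additionally adjusts the backed-up bounds
 * to keep node re-expansion overhead bounded, which is not shown here.
 */
public class RbfsSketch<S> {

    /** Hypothetical domain interface (successors, costs, heuristic). */
    public interface Domain<S> {
        boolean isGoal(S s);
        List<S> successors(S s);
        double edgeCost(S from, S to);
        double h(S s); // admissible heuristic, e.g. Manhattan distance
    }

    private static final double INF = Double.POSITIVE_INFINITY;
    private final Domain<S> domain;
    private boolean solved = false;

    public RbfsSketch(Domain<S> domain) { this.domain = domain; }

    /** Returns the optimal solution cost, or +infinity if none exists. */
    public double search(S root) {
        solved = false;
        return rbfs(root, 0.0, domain.h(root), INF);
    }

    // n: current state, g: cost so far, F: backed-up f value of n, bound: local f bound
    private double rbfs(S n, double g, double F, double bound) {
        if (domain.isGoal(n)) { solved = true; return g; }

        List<S> succs = domain.successors(n);
        if (succs.isEmpty()) return INF;

        // Compute backed-up f values for the children; if n was expanded
        // before (F exceeds its static f), pass the larger value down.
        double hn = domain.h(n);
        double[] f = new double[succs.size()];
        for (int i = 0; i < f.length; i++) {
            S c = succs.get(i);
            double fc = g + domain.edgeCost(n, c) + domain.h(c);
            f[i] = (F > g + hn) ? Math.max(F, fc) : fc;
        }

        while (true) {
            int best = argMin(f);
            if (f[best] > bound || f[best] == INF) return f[best];
            double alternative = secondMin(f, best);
            S c = succs.get(best);
            double gc = g + domain.edgeCost(n, c);
            // Recurse on the best child with the tighter of the two bounds.
            f[best] = rbfs(c, gc, f[best], Math.min(bound, alternative));
            if (solved) return f[best]; // solution cost propagates up
        }
    }

    private static int argMin(double[] a) {
        int best = 0;
        for (int i = 1; i < a.length; i++) if (a[i] < a[best]) best = i;
        return best;
    }

    private static double secondMin(double[] a, int skip) {
        double m = INF;
        for (int i = 0; i < a.length; i++) if (i != skip && a[i] < m) m = a[i];
        return m;
    }
}
```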
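
The Manhattan distance heuristic quoted for the 15-puzzle experiments can be illustrated with the small sketch below. The row-major tile encoding and the goal layout (blank in the top-left, tiles 1-15 in order) are assumptions of this sketch, following the usual convention for Korf's instances rather than anything stated in this summary.

```java
/**
 * Sketch of the Manhattan distance heuristic for the 15-puzzle.
 * Assumes a row-major encoding where tiles[i] is the tile at board
 * position i, 0 denotes the blank, and tile t's goal position is i = t.
 */
public final class ManhattanDistance {

    /** Sum over all tiles of the grid distance from current to goal position. */
    public static int h(int[] tiles) {
        final int width = 4; // 4x4 board for the 15-puzzle
        int sum = 0;
        for (int pos = 0; pos < tiles.length; pos++) {
            int tile = tiles[pos];
            if (tile == 0) continue; // the blank is not counted
            sum += Math.abs(pos / width - tile / width)
                 + Math.abs(pos % width - tile % width);
        }
        return sum;
    }

    public static void main(String[] args) {
        int[] goal = {0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15};
        System.out.println(h(goal)); // prints 0
    }
}
```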
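
The quoted suboptimality bounds (4.0 and 1.2) are, by standard practice in weighted and anytime heuristic search, applied by weighting the heuristic term of the node evaluation. The snippet below is a generic illustration of that convention, not the paper's implementation, and the example values are chosen only for demonstration.

```java
/**
 * Generic illustration of applying a suboptimality bound w in weighted
 * heuristic search: nodes are ordered by f'(n) = g(n) + w * h(n), and
 * with an admissible h the solution returned costs at most w times the
 * optimal cost. Not the paper's code; the values below are examples only.
 */
public final class WeightedEvaluation {

    public static double fPrime(double g, double h, double w) {
        return g + w * h; // w = 1.0 recovers the usual f = g + h
    }

    public static void main(String[] args) {
        System.out.println(fPrime(10.0, 5.0, 4.0)); // prints 30.0
        System.out.println(fPrime(10.0, 5.0, 1.2)); // prints 16.0
    }
}
```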