NLocalSAT: Boosting Local Search with Solution Prediction

Authors: Wenjie Zhang, Zeyu Sun, Qihao Zhu, Ge Li, Shaowei Cai, Yingfei Xiong, Lu Zhang

IJCAI 2020

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | We evaluated NLocalSAT on five SLS solvers (CCAnr, Sparrow, CPSparrow, YalSAT, and probSAT) with instances in the random track of SAT Competition 2018. The experimental results show that solvers with NLocalSAT achieve 27%-62% improvement over the original SLS solvers.
Researcher Affiliation | Academia | 1) Key Laboratory of High Confidence Software Technologies (Peking University), MoE; Software Institute, Peking University, China; 2) State Key Laboratory of Computer Science, Institute of Software, Chinese Academy of Sciences, China; 3) School of Computer Science and Technology, University of Chinese Academy of Sciences, China
Pseudocode | Yes | Algorithm 1: algorithm for stochastic local search solvers; Algorithm 2: initialization of variables with NLocalSAT; Algorithm 3: algorithm for stochastic local search solvers with NLocalSAT (the underlined code marks the parts that differ from the original solvers). A minimal sketch of this modified loop appears after the table.
Open Source Code | Yes | The code is available at https://github.com/myxxxsquared/NLocalSAT
Open Datasets | Yes | Our model was trained on a dataset of generated small SAT instances (denoted as Dataset_small) and a dataset of instances from the random tracks of the SAT Competitions in 2012, 2013, 2014, 2016, and 2017 (denoted as Dataset_comp). Our model was evaluated on the instances in the random track of SAT Competition 2018 (denoted as Dataset_eval), 255 SAT instances in total.
Dataset Splits | Yes | We generated 200 batches with the same approach but different random seeds as validation data during pretraining. We used Dataset_comp as both the training dataset and the validation dataset.
Hardware Specification | Yes | Our experiments were performed on a workstation with an Intel Xeon E5-2620 CPU and a TITAN RTX GPU.
Software Dependencies | No | The paper mentions using the Adam optimizer and LSTMCell but does not provide specific version numbers for the software dependencies or libraries used in the implementation.
Experiment Setup | Yes | The embedding size d is set to 64. We apply 16 iterations of the GGCN layer to the initial values to obtain a vector containing structural information about each literal. A two-layer perceptron (MLP) with hidden size 512 is applied to the vector of each variable to extract classification information from the structural information. In our experiments, p0 is set to 0.9. We set a timeout limit of 1000 seconds. A hedged sketch of this architecture also follows the table.
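
For concreteness, here is a minimal Python sketch of the initialization and solving loop described by Algorithms 2 and 3. It assumes DIMACS-style clauses (lists of signed integers) and a precomputed vector of predicted truth values; p0 = 0.9 follows the paper, but the function names and the random-walk variable pick are hypothetical stand-ins for the solvers' actual CCAnr/Sparrow/probSAT heuristics.

```python
import random

def init_with_prediction(n_vars, predicted, p0=0.9):
    """Algorithm 2 (sketch): each variable follows the network's predicted
    value with probability p0 (0.9 in the paper), otherwise it is set
    uniformly at random."""
    return [predicted[i] if random.random() < p0 else random.random() < 0.5
            for i in range(n_vars)]

def sls_solve(clauses, n_vars, predicted, max_flips=10**6):
    """Algorithm 3 (sketch): a generic SLS loop in which only the
    initialization step (underlined in the paper) differs from the
    original solver."""
    assignment = init_with_prediction(n_vars, predicted)  # the changed step
    for _ in range(max_flips):
        unsat = [c for c in clauses
                 if not any(assignment[abs(l) - 1] == (l > 0) for l in c)]
        if not unsat:
            return assignment  # satisfying assignment found
        # Pick a variable to flip; a plain random walk stands in here for
        # the solvers' real scoring heuristics.
        var = abs(random.choice(random.choice(unsat))) - 1
        assignment[var] = not assignment[var]
    return None  # flip budget exhausted
```

The point of the modification is visible in the structure: everything after the first line of sls_solve is the unmodified SLS loop, so NLocalSAT only biases where the search starts.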
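And here is a hedged PyTorch sketch of the predictor as the Experiment Setup row describes it: 64-dimensional literal embeddings, 16 rounds of GGCN-style message passing between literals and clauses (with LSTMCell updates, which the paper mentions), and a two-layer MLP with hidden size 512 per variable. The layer names, the dense incidence-matrix encoding, and the exact message functions are assumptions; the released code at the GitHub link above is authoritative.

```python
import torch
import torch.nn as nn

D, ITERS, HIDDEN = 64, 16, 512  # embedding size, GGCN rounds, MLP hidden size

class NLocalSATSketch(nn.Module):
    """Hypothetical reconstruction of the predictor, not the released model."""

    def __init__(self):
        super().__init__()
        self.lit_msg = nn.Linear(D, D)        # literal -> clause messages
        self.cls_msg = nn.Linear(D, D)        # clause -> literal messages
        self.cls_update = nn.LSTMCell(D, D)   # paper mentions LSTMCell
        self.lit_update = nn.LSTMCell(2 * D, D)
        self.mlp = nn.Sequential(             # two-layer MLP, hidden size 512
            nn.Linear(2 * D, HIDDEN), nn.ReLU(), nn.Linear(HIDDEN, 1))

    def forward(self, adj, n_vars):
        # adj: dense 0/1 clause-literal incidence matrix of shape
        # (n_clauses, 2 * n_vars); literals ordered [x1..xn, -x1..-xn].
        n_lits = 2 * n_vars
        lit = torch.zeros(n_lits, D); lit_c = torch.zeros(n_lits, D)
        cls = torch.zeros(adj.size(0), D); cls_c = torch.zeros(adj.size(0), D)
        # index that maps each literal to its negation
        flip = torch.roll(torch.arange(n_lits).view(2, -1), 1, 0).reshape(-1)
        for _ in range(ITERS):  # 16 message-passing iterations
            cls, cls_c = self.cls_update(adj @ self.lit_msg(lit), (cls, cls_c))
            inp = torch.cat([adj.t() @ self.cls_msg(cls), lit[flip]], dim=-1)
            lit, lit_c = self.lit_update(inp, (lit, lit_c))
        pair = torch.cat([lit[:n_vars], lit[n_vars:]], dim=-1)  # per variable
        return torch.sigmoid(self.mlp(pair)).squeeze(-1)  # P(variable = True)
```

Under these assumptions, training would fit a binary cross-entropy loss between this per-variable output and a known satisfying assignment, and the output would serve as the `predicted` vector in the initialization sketch above; the paper's actual loss and graph encoding may differ.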