Learning Large Logic Programs By Going Beyond Entailment
Authors: Andrew Cropper, Sebastijan Dumančić
IJCAI 2020 | Conference PDF | Archive PDF | Plain Text | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Our experiments on three diverse program synthesis domains (robot planning, string transformations, and ASCII art) show that Brute can substantially outperform existing ILP systems, both in terms of predictive accuracies and learning times, and can learn programs 20 times larger than state-of-the-art systems. |
| Researcher Affiliation | Academia | University of Oxford; KU Leuven. andrew.cropper@cs.ox.ac.uk, sebastijan.dumancic@cs.kuleuven.be |
| Pseudocode | Yes | Algorithm 1 (Brute): def brute(e+, e−, bk, L): library = invent(bk); prog = search(e+, e−, bk, L, library); return prog. def search(e+, e−, bk, L, library): queue = empty_priority_queue(); initial_spec = {(x, y) \| p(x, y) ∈ e+}; initial_hypothesis = []; initial_state = (initial_spec, initial_hypothesis); initial_loss = Σ_{(x,y) ∈ initial_spec} L(x, y); queue.push(initial_loss, initial_state); while not queue.empty(): loss, state = queue.pop(); (spec, hypothesis) = state; if loss == 0 and consistent(hypothesis, e−, bk): return hypothesis + induce_target(hypothesis); for library_predicate in library: new_spec = apply(library_predicate, bk, spec); new_hypothesis = hypothesis + library_predicate; new_loss = Σ_{(x,y) ∈ new_spec} L(x, y); new_state = (new_spec, new_hypothesis); queue.push(new_loss, new_state); return []. (A runnable sketch of this search appears after the table.) |
| Open Source Code | No | The paper mentions that Brute is implemented in Prolog and cites the Metagol system's GitHub repository, but it does not provide an explicit statement or link for the open-source code of Brute itself. |
| Open Datasets | Yes | Our second experiment is on real-world string transformations, a domain often used to evaluate program synthesis systems [Lin et al., 2014; Balog et al., 2017; Ellis et al., 2019]. Materials We use 130 string transformation tasks from [Cropper, 2019]. Each task has 10 examples. |
| Dataset Splits | No | The paper specifies training and testing examples ('For each task and for each n in {1, 3, 5, 7, 9}, we sample uniformly without replacement n examples as training examples and use the other 10 − n examples as testing examples.'), but it does not mention a separate validation split. |
| Hardware Specification | No | The paper states 'Brute can learn programs in under 60 seconds using a standard single-core computer.' and 'REPL takes multiple days to train on a single domain using one P100 GPU.', but it does not specify the exact hardware (e.g., CPU or GPU model numbers) used for the authors' own experiments. |
| Software Dependencies | No | The paper mentions 'Brute is implemented in Prolog' and 'text2art library with the font 3x5' but does not specify version numbers for Prolog or the text2art library, which are necessary for full reproducibility. |
| Experiment Setup | Yes | We restrict Brute to only invent library predicates with at most two clauses, where each clause has at most three variables and at most two body literals. Metagol uses metarules [Muggleton et al., 2015], higher-order Horn clauses, to guide the proof search. We provide Metagol with the ident, precon, postcon, chain, and tailrec metarules, which are commonly used in the Metagol literature. We also force Metagol to learn functional logic programs [Lin et al., 2014], which ensures that for any induced program and for any input value there is exactly one output value. We enforce a learning timeout of 60 seconds per task. We repeat each experiment 10 times and plot 95% confidence intervals. |
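
To make the quoted pseudocode concrete, below is a minimal Python sketch of Brute's best-first search. It is not the authors' Prolog implementation: the library predicates, the `apply_pred` step, the per-example loss, the consistency check, and the `max_depth` bound are illustrative stand-ins introduced here, and only the search skeleton mirrors Algorithm 1 (which additionally induces a target clause from the returned hypothesis).

```python
import heapq
import itertools

def brute_search(pos_examples, neg_examples, library, apply_pred, loss_fn,
                 consistent_fn, max_depth=5):
    """Best-first search over sequences of library predicates (sketch of Algorithm 1).

    pos_examples: (input, output) pairs forming the initial specification.
    library: names of library predicates (stand-ins for Brute's invented predicates).
    apply_pred(pred, spec): applies a predicate to every input in the specification.
    loss_fn(x, y): per-example loss; 0 means the example is already solved.
    consistent_fn(hypothesis, neg_examples): rejects hypotheses covering negatives.
    max_depth: illustrative bound; the paper bounds search with a learning timeout instead.
    """
    counter = itertools.count()  # tie-breaker so heapq never compares hypothesis states
    initial_spec = tuple(pos_examples)
    initial_loss = sum(loss_fn(x, y) for x, y in initial_spec)
    queue = [(initial_loss, next(counter), (initial_spec, ()))]

    while queue:
        loss, _, (spec, hypothesis) = heapq.heappop(queue)

        # Goal test: every positive example is solved and no negative example is covered.
        if loss == 0 and consistent_fn(hypothesis, neg_examples):
            return list(hypothesis)  # Brute would also induce a target clause here

        if len(hypothesis) >= max_depth:
            continue

        # Expand: append each library predicate and push the resulting state.
        for pred in library:
            new_spec = tuple(apply_pred(pred, spec))
            new_loss = sum(loss_fn(x, y) for x, y in new_spec)
            heapq.heappush(queue, (new_loss, next(counter),
                                   (new_spec, hypothesis + (pred,))))
    return []

# Toy usage: learn "drop the first character, then uppercase" from two examples.
ops = {"tail": lambda s: s[1:], "upper": str.upper}
print(brute_search(
    pos_examples=[("hello", "ELLO"), ("brute", "RUTE")],
    neg_examples=[],
    library=["tail", "upper"],
    apply_pred=lambda pred, spec: [(ops[pred](x), y) for x, y in spec],
    loss_fn=lambda x, y: 0 if x == y else 1,  # 0/1 loss; the paper uses domain-specific losses
    consistent_fn=lambda hyp, negs: True,     # no negative examples in this toy task
))  # -> ['tail', 'upper']
```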