Bridging Machine Learning and Logical Reasoning by Abductive Learning

Authors: Wang-Zhou Dai, Qiuling Xu, Yang Yu, Zhi-Hua Zhou

NeurIPS 2019

Reproducibility Assessment (Variable / Result / LLM Response)
Research Type: Experimental
LLM Response: "Experimental results show that ABL generalise better than state-of-the-art deep learning models and can leverage learning and reasoning in a mutually beneficial way. Further experiments on a visual n-queens task shows that the ABL framework is flexible and can improve the performance of machine learning by taking advantage of classical symbolic AI systems such as Constraint Logic Programming [16]."
Researcher Affiliation: Academia
LLM Response: "Wang-Zhou Dai, Qiuling Xu, Yang Yu, Zhi-Hua Zhou; National Key Laboratory for Novel Software Technology, Nanjing University, Nanjing 210023, China; {daiwz, xuql, yuy, zhouzh}@lamda.nju.edu.cn"
Pseudocode: No
LLM Response: "Not found. The paper describes the ABL approach in text and with mathematical formulations but does not include any pseudocode or algorithm blocks."
Open Source Code: No
LLM Response: "Not found. The paper does not contain any statement about making the source code publicly available or provide a link to a code repository for the methodology described."
Open Datasets: Yes
LLM Response: "The Digital Binary Additive (DBA) equations were created with images from benchmark handwritten character datasets [22, 36], while the Random Symbol Binary Additive (RBA) equations were constructed from randomly selected character sets of the Omniglot dataset [21] and shared isomorphic structure with the equations in the DBA tasks."
Dataset Splits: No
LLM Response: "Not found. The paper states that 'All the neural networks are tuned with a held-out validation set randomly sampled from the training data.' but does not provide specific percentages or counts for this validation split, nor does it specify the method to reproduce the split beyond 'randomly sampled'."
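Since the paper reports no split sizes, a reproduction has to choose its own. A minimal sketch of the described procedure (a validation set randomly sampled from the training data); the 10% fraction and the fixed seed are illustrative assumptions, not values from the paper:

```python
import random

def holdout_split(data, val_fraction=0.1, seed=0):
    """Randomly sample a held-out validation set from the training data.

    The paper says only that the set is 'randomly sampled from the
    training data'; val_fraction and seed are assumed for illustration.
    """
    rng = random.Random(seed)
    indices = list(range(len(data)))
    rng.shuffle(indices)
    n_val = int(len(data) * val_fraction)
    val_idx, train_idx = indices[:n_val], indices[n_val:]
    return [data[i] for i in train_idx], [data[i] for i in val_idx]

train, val = holdout_split(list(range(100)), val_fraction=0.1, seed=42)
print(len(train), len(val))  # 90 10
```

Fixing the seed is what makes such a split reproducible, which is exactly the detail the paper omits.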
Hardware Specification: Yes
LLM Response: "All the experiment are repeated for 10 times and performed on a workstation with a 16 core Intel Xeon CPU @ 2.10GHz, 32 GB memory and a Nvidia Titan Xp GPU."
Software Dependencies: No
LLM Response: "Not found. The paper mentions using 'Prolog' and 'RACOS [40]' (a derivative-free optimization tool) but does not provide specific version numbers for these or any other software dependencies."
Experiment Setup: Yes
LLM Response: "Fig. 5 shows the architecture of our ABL implementation, which employs a convolutional neural network (CNN) [22] as the perception machine learning model. The CNN takes image pixels as input and is expected to output the symbols in the image. The logical abduction is realised by an Abductive Logic Program implemented with Prolog. The consistency optimisation problem in Eq. 5 is solved by a derivative-free optimisation tool RACOS [40]. The machine learning model of ABL consists of a two-layer CNN and a two-layer multiple-layer perceptron (MLP) followed by a softmax layer; the logical abduction will keep 50 calculation rule sets of bit-wise operations set as relational features; the decision model is a two-layer MLP."
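The setup above couples a neural perception model with logical abduction: when the CNN's perceived symbols violate the knowledge base, abduction revises as few labels as possible to restore consistency. A minimal pure-Python sketch of that step on a DBA-style binary-addition equation; the exhaustive search here stands in for the paper's Prolog abduction plus RACOS optimisation, and the symbol set and flip budget are assumptions for illustration:

```python
import itertools

def consistent(eq):
    """Check whether a symbol sequence encodes a correct binary addition,
    e.g. ['1', '+', '1', '=', '1', '0'] means 1 + 1 == 2 (true)."""
    s = ''.join(eq)
    try:
        lhs, rhs = s.split('=')
        a, b = lhs.split('+')
        return int(a, 2) + int(b, 2) == int(rhs, 2)
    except ValueError:  # malformed equation or empty operand
        return False

def abduce(perceived, max_flips=2, symbols='01+='):
    """Return a minimally revised label sequence consistent with binary
    addition, mimicking ABL's logical abduction of pseudo-labels."""
    for k in range(max_flips + 1):  # try 0 flips first, then 1, 2, ...
        for positions in itertools.combinations(range(len(perceived)), k):
            for repl in itertools.product(symbols, repeat=k):
                candidate = list(perceived)
                for pos, sym in zip(positions, repl):
                    candidate[pos] = sym
                if consistent(candidate):
                    return candidate
    return None  # no consistent revision within the flip budget

# The CNN misreads the second operand: '1+0=10' is inconsistent, and
# abduction revises the '0' to '1', recovering '1+1=10'.
print(abduce(list('1+0=10')))  # ['1', '+', '1', '=', '1', '0']
```

In the full framework these abduced labels are fed back as training targets for the CNN, which is how learning and reasoning benefit each other.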