DeepSaDe: Learning Neural Networks That Guarantee Domain Constraint Satisfaction
Authors: Kshitij Goyal, Sebastijan Dumancic, Hendrik Blockeel
AAAI 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | "Evaluation on various machine learning tasks demonstrates that our approach is flexible enough to enforce a wide variety of domain constraints and is able to guarantee them in neural networks."; "Afterward, we compare our approach to related work and evaluate it experimentally."; "We evaluate multiple use cases in various ML tasks with complex domain constraints." (Section 6, Experiments) |
| Researcher Affiliation | Academia | 1KU Leuven, Belgium 2Delft University of Technology, The Netherlands |
| Pseudocode | Yes | Algorithm 1: Deep Satisfiability Descent (DeepSaDe) |
| Open Source Code | No | No explicit statement about providing open-source code or a link to a code repository for the methodology described in this paper was found. |
| Open Datasets | Yes | "UC4: A multi-label classification problem of identifying the labels from a sequence of 4 MNIST images." |
| Dataset Splits | Yes | "...the data is split 70/20/10 into train/test/validation." |
| Hardware Specification | Yes | We ran experiments on an Intel(R) Xeon(R) Silver 4214 CPU @ 2.20GHz machine with 125 GB RAM. |
| Software Dependencies | No | The paper mentions the Z3 solver but does not specify its version. No other software dependencies are listed with specific versions. |
| Experiment Setup | No | "Refer to appendix A.2 for details on the architectures and hyper-parameters." The main text does not provide concrete hyperparameter values or detailed training configurations. |