Taming Binarized Neural Networks and Mixed-Integer Programs

Authors: Johannes Aspman, Georgios Korpas, Jakub Marecek

AAAI 2024 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | We show that binarized neural networks admit a tame representation by reformulating the problem of training binarized neural networks as a subadditive dual of a mixed-integer program, which we show to have nice properties. This makes it possible to use the framework of Bolte et al. for implicit differentiation, which opens a route to practical backpropagation for binarized neural networks. This approach could also be used for a broader class of mixed-integer programs, beyond the training of binarized neural networks, as encountered in symbolic approaches to AI and beyond. (A toy sketch of the underlying mixed-integer training problem appears after this table.)
Researcher Affiliation | Collaboration | Johannes Aspman (1), Georgios Korpas (1, 2), Jakub Marecek (1); (1) Department of Computer Science, Czech Technical University, Prague, Czechia; (2) HSBC Lab, Innovation & Ventures, HSBC Holdings, London, United Kingdom
Pseudocode | No | The paper does not contain structured pseudocode or algorithm blocks.
Open Source Code | No | The paper does not provide concrete access to source code for the described methodology.
Open Datasets | No | The paper presents a theoretical framework and uses an illustrative example rather than a specific public dataset for training.
Dataset Splits | No | The paper is theoretical and does not provide dataset splits for training, validation, or testing.
Hardware Specification | No | The paper does not specify the hardware used to run experiments.
Software Dependencies | No | The paper does not list ancillary software dependencies with version numbers.
Experiment Setup | No | The paper is theoretical and does not report experimental setup details such as hyperparameter values or training configurations.
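
To make the abstract's starting point concrete: training a binarized neural network exactly is a discrete optimization over weights in {-1, +1}, which can be written as a mixed-integer program. Below is a minimal, self-contained Python sketch of that primal training problem, with exhaustive enumeration standing in for a MIP solver. This is only an illustration, not the paper's method: the paper's contribution is to pass to the subadditive dual of such a program, whose tameness allows the implicit-differentiation framework of Bolte et al. to be applied. All identifiers here (sign, bnn_forward, the toy XOR data) are hypothetical choices made for this sketch.

```python
import itertools

import numpy as np


def sign(x):
    """Binarize to {-1, +1}; ties at 0 map to +1 by convention."""
    return np.where(x >= 0, 1, -1)


def bnn_forward(W1, w2, X):
    """Forward pass of a one-hidden-layer BNN with sign activations."""
    H = sign(X @ W1)      # hidden activations in {-1, +1}
    return sign(H @ w2)   # predictions in {-1, +1}


# Toy dataset: XOR-style labels on +-1 inputs.
X = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1]])
y = np.array([-1, 1, 1, -1])

n_in, n_hidden = X.shape[1], 2
n_w1 = n_in * n_hidden   # binary weights in layer 1
n_w2 = n_hidden          # binary weights in layer 2

best_loss, best_weights = np.inf, None
# Exhaustive search over {-1, +1}^(n_w1 + n_w2): a brute-force stand-in
# for the mixed-integer program that exact BNN training defines.  The
# paper instead works with the subadditive dual of this program, whose
# tameness lets implicit differentiation replace enumeration.
for bits in itertools.product((-1, 1), repeat=n_w1 + n_w2):
    W1 = np.array(bits[:n_w1]).reshape(n_in, n_hidden)
    w2 = np.array(bits[n_w1:])
    loss = int(np.sum(bnn_forward(W1, w2, X) != y))  # 0-1 training loss
    if loss < best_loss:
        best_loss, best_weights = loss, (W1, w2)

print("best 0-1 loss over all binary weight assignments:", best_loss)
```

Even at this toy scale the search space grows as 2 to the number of weights, which is why exact BNN training is normally handed to a MIP solver; the dual reformulation discussed in the paper is what makes gradient-style training, rather than enumeration, conceivable.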