Taming Binarized Neural Networks and Mixed-Integer Programs

8 citations · ranked #983 of 2289 papers in AAAI 2024

Abstract

There has been a great deal of recent interest in binarized neural networks, especially because of their explainability. At the same time, automatic differentiation algorithms such as backpropagation fail for binarized neural networks, which limits their applicability. By reformulating the problem of training binarized neural networks as a subadditive dual of a mixed-integer program, we show that binarized neural networks admit a tame representation. This, in turn, makes it possible to use the framework of Bolte et al. for implicit differentiation, which opens the door to a practical implementation of backpropagation for binarized neural networks. This approach could also be applied to a broader class of mixed-integer programs, beyond the training of binarized neural networks, as encountered in symbolic approaches to AI.
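The abstract's starting point — that backpropagation fails for binarized networks — can be illustrated directly. The following minimal sketch (not from the paper; the function names are illustrative) shows that the sign activation used in binarized networks is piecewise constant, so its derivative is zero almost everywhere and the chain rule propagates no gradient signal to the weights:

```python
# Why backpropagation fails for binarized networks: the sign
# activation is piecewise constant, so its derivative vanishes
# almost everywhere, and the chain rule yields zero weight updates.

def sign(x: float) -> float:
    """Binarized activation mapping a pre-activation to {-1, +1}."""
    return 1.0 if x >= 0.0 else -1.0

def finite_diff(f, x: float, h: float = 1e-6) -> float:
    """Central finite-difference estimate of f'(x)."""
    return (f(x + h) - f(x - h)) / (2.0 * h)

# Away from the single discontinuity at 0, the estimated
# gradient is exactly zero: no signal for gradient descent.
grads = [finite_diff(sign, x) for x in (-2.0, -0.5, 0.5, 2.0)]
print(grads)  # each entry is 0.0
```

This zero-gradient behavior is what motivates the paper's alternative route: rather than differentiating through the discontinuity, the training problem is recast as a mixed-integer program whose subadditive dual admits a tame representation, to which implicit differentiation applies.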

Citation History

Jan 28, 2026: 0
Feb 13, 2026: 8