A Variational Information Theoretic Approach to Out-of-Distribution Detection

ICML 2025 · 3 citations

Abstract

We present a theory for the construction of out-of-distribution (OOD) detection features for neural networks. We introduce random features for OOD detection through a novel information-theoretic loss functional consisting of two terms: the first, based on the KL divergence, separates the resulting in-distribution (ID) and OOD feature distributions; the second is the Information Bottleneck, which favors compressed features that retain the OOD information. We formulate a variational procedure to optimize the loss and obtain OOD features. Under assumptions on the OOD distribution, one can recover properties of existing OOD features, i.e., shaping functions. Furthermore, we show that our theory can predict a new shaping function that outperforms existing ones on OOD benchmarks. Our theory provides a general framework for constructing a variety of new features with clear explainability.
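
The abstract does not state the loss functional explicitly; the following is a minimal sketch of what a two-term objective of this kind could look like, and every symbol in it (the feature map T, the indicator Z, and the trade-off weights beta and gamma) is our assumption rather than notation taken from the paper:

    % Hypothetical form of the two-term loss (assumed, not quoted from the paper):
    % the KL term separates ID and OOD feature distributions; the bracketed
    % Information Bottleneck term compresses T(X) while retaining OOD information,
    % with Z an ID/OOD indicator variable.
    \min_{T}\; \mathcal{L}(T)
      = -\, D_{\mathrm{KL}}\!\big( p_{\mathrm{ID}}(T(X)) \,\big\|\, p_{\mathrm{OOD}}(T(X)) \big)
      \; + \; \beta \big[\, I(X; T(X)) \; - \; \gamma\, I(T(X); Z) \,\big]

Under such a form, the variational procedure the abstract mentions would presumably optimize tractable bounds on the divergence and mutual-information terms, which is the standard route in variational Information Bottleneck methods.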
