Abstract
Out-of-distribution (OOD) detection has emerged as a pivotal approach for enhancing the reliability of machine learning models, since test data may be sampled from classes disjoint from the in-distribution (ID) data used during model training. Detecting OOD data is typically cast as a distance-measurement problem: samples that deviate far from the training distribution in the learned feature space are flagged as OOD. Prior work has shown great success in prototype-based learning for feature-based OOD detection, where each ID class is represented by one or multiple prototypes. However, modeling with a finite number of prototypes fails to fully capture intra-class variations. In view of this, this paper extends the existing prototype-based learning paradigm to an infinite setting. We design two feasible formulations of the Distributional Prototype Learning (DPL) objective; to avoid the intractable computation and exploding parameter count that an infinite prototype set would entail, our key idea is to model the infinitely many discrete prototypes of each ID class with a class-wise continuous distribution. We theoretically analyze both alternatives and identify the version of the learning objective with more stable convergence. We further show that, by sampling prototypes from a mixture of class-conditioned Gaussian distributions, the objective can be computed efficiently in closed form, without resorting to computationally expensive Monte-Carlo approximation of the involved expectation terms. Extensive evaluations on mainstream OOD detection benchmarks demonstrate that the proposed DPL establishes a new state of the art across various OOD settings.
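The closed-form claim above is easy to verify for a single Gaussian component. Below is a minimal sketch (not the paper's code) under the assumption that the prototypes of a class c are drawn from an isotropic Gaussian N(m_c, sigma_c^2 I); all variable names are illustrative, not the paper's notation. For a feature z, the expected squared distance to a random prototype then has the closed form E||z - p||^2 = ||z - m_c||^2 + d * sigma_c^2, which the sketch checks against a Monte-Carlo estimate over explicitly sampled prototypes.

```python
# Illustrative sketch only: shows why a class-conditioned Gaussian prototype
# distribution admits a closed-form expectation, avoiding Monte-Carlo sampling.
# m_c, sigma_c, z, and d are hypothetical names, not the paper's notation.
import numpy as np

rng = np.random.default_rng(0)
d = 128                        # feature dimension
z = rng.normal(size=d)         # a feature embedding
m_c = rng.normal(size=d)       # mean of the class-c prototype distribution
sigma_c = 0.3                  # isotropic std of the prototype distribution

# Closed form: E||z - p||^2 = ||z - m_c||^2 + d * sigma_c^2 for p ~ N(m_c, sigma_c^2 I).
closed_form = np.sum((z - m_c) ** 2) + d * sigma_c ** 2

# Monte-Carlo estimate over 100k explicitly sampled prototypes, for comparison.
protos = m_c + sigma_c * rng.normal(size=(100_000, d))
mc_estimate = np.mean(np.sum((z - protos) ** 2, axis=1))

print(f"closed form: {closed_form:.3f}")
print(f"monte carlo: {mc_estimate:.3f}")  # agrees up to sampling noise
```

The same reasoning extends component-wise to a mixture of class-conditioned Gaussians, which is presumably what lets the full DPL objective be evaluated without sampling prototypes at all.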
Original language | English |
---|---|
Title of host publication | KDD '25 |
Subtitle of host publication | Proceedings of the 31st ACM SIGKDD Conference on Knowledge Discovery and Data Mining V.1 |
Place of Publication | New York |
Publisher | Association for Computing Machinery (ACM) |
Pages | 1104–1114 |
Number of pages | 11 |
ISBN (Print) | 9798400712456 |
DOIs | |
Publication status | Published - 20 Jul 2025 |
Event | 31st ACM SIGKDD Conference on Knowledge Discovery and Data Mining, Toronto, Canada |
Event duration | 3 Aug 2025 → 7 Aug 2025 |
Conference proceedings | https://dl.acm.org/doi/proceedings/10.1145/3690624 |
Conference website | https://kdd2025.kdd.org/ |
Publication series
Name | KDD: Proceedings of ACM SIGKDD Conference on Knowledge Discovery and Data Mining |
---|---|
Publisher | Association for Computing Machinery |
Conference
Conference | 31st ACM SIGKDD Conference on Knowledge Discovery and Data Mining |
---|---|
Abbreviated title | KDD 2025 |
Country/Territory | Canada |
City | Toronto |
Period | 3/08/25 → 7/08/25 |
Internet address | https://kdd2025.kdd.org/ |
User-Defined Keywords
- out-of-distribution detection
- prototypical learning