Distributional Prototype Learning for Out-of-distribution Detection

Bo Peng, Jie Lu, Yonggang Zhang, Guangquan Zhang, Zhen Fang

Research output: Chapter in book/report/conference proceeding › Conference proceeding › peer-review

Abstract

Out-of-distribution (OOD) detection has emerged as a pivotal approach for enhancing the reliability of machine learning models, since test data may be sampled from classes disjoint from the in-distribution (ID) data used during training. Detecting OOD data is typically cast as a distance-measurement problem: samples that deviate far from the training distribution in the learned feature space are considered OOD. Prior works have shown great success in learning with prototypes for feature-based OOD detection, where each ID class is represented by a single prototype or by multiple prototypes. However, modeling with a finite number of prototypes fails to fully capture intra-class variations. In view of this, this paper extends the existing prototype-based learning paradigm to an infinite setting. This motivates us to design two feasible formulations of the Distributional Prototype Learning (DPL) objective; to avoid the intractable computation and exploding parameter count that infinitely many prototypes would entail, our key idea is to model the infinite set of discrete prototypes of each ID class with a class-wise continuous distribution. We theoretically analyze both alternatives and identify the more stably converging version of the learning objective. We show that, by sampling prototypes from a mixture of class-conditional Gaussian distributions, the objective can be computed efficiently in closed form, without resorting to computationally expensive Monte Carlo approximation of the involved expectation terms. Extensive evaluations across mainstream OOD detection benchmarks demonstrate that the proposed DPL establishes a new state of the art in various OOD settings.
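To make the closed-form idea concrete, here is a minimal sketch (not the paper's actual method) of distance-based OOD scoring when each class's prototypes are drawn from an isotropic Gaussian N(mu_c, sigma_c^2 I): the expected squared distance from a feature z to a random prototype then has the exact closed form ||z - mu_c||^2 + d * sigma_c^2, so no Monte Carlo sampling is needed. All names (`expected_sq_dist`, `ood_score`, `class_means`, `class_stds`) are illustrative assumptions.

```python
import numpy as np

def expected_sq_dist(z, mu, sigma):
    # E_{p ~ N(mu, sigma^2 I)} ||z - p||^2 = ||z - mu||^2 + d * sigma^2,
    # where d is the feature dimension. Exact, no sampling required.
    d = z.shape[-1]
    return np.sum((z - mu) ** 2) + d * sigma ** 2

def ood_score(z, class_means, class_stds):
    # Distance to the nearest class-conditional prototype distribution:
    # a larger minimum expected distance suggests an OOD sample.
    return min(
        expected_sq_dist(z, mu, s) for mu, s in zip(class_means, class_stds)
    )

# Toy example with two ID classes in an 8-dimensional feature space.
d = 8
class_means = [np.zeros(d), np.full(d, 3.0)]
class_stds = [0.5, 0.5]

z_id = np.full(d, 0.1)    # near class 0 -> low score
z_ood = np.full(d, 10.0)  # far from both classes -> high score
assert ood_score(z_id, class_means, class_stds) < ood_score(z_ood, class_means, class_stds)
```

The same analytic trick is what lets an expectation over infinitely many prototypes be evaluated exactly; a Monte Carlo estimate of the same quantity would need many prototype samples per class to reach comparable accuracy.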
Original language: English
Title of host publication: KDD '25
Subtitle of host publication: Proceedings of the 31st ACM SIGKDD Conference on Knowledge Discovery and Data Mining V.1
Place of Publication: New York
Publisher: Association for Computing Machinery (ACM)
Pages: 1104–1114
Number of pages: 11
ISBN (Print): 9798400712456
DOIs
Publication status: Published - 20 Jul 2025
Event: 31st ACM SIGKDD Conference on Knowledge Discovery and Data Mining - Toronto, Canada
Duration: 3 Aug 2025 – 7 Aug 2025
https://dl.acm.org/doi/proceedings/10.1145/3690624 (Conference Proceedings)
https://kdd2025.kdd.org/ (Conference website)

Publication series

Name: KDD: Proceedings of ACM SIGKDD Conference on Knowledge Discovery and Data Mining
Publisher: Association for Computing Machinery

Conference

Conference: 31st ACM SIGKDD Conference on Knowledge Discovery and Data Mining
Abbreviated title: KDD 2025
Country/Territory: Canada
City: Toronto
Period: 3/08/25 – 7/08/25
Internet address

User-Defined Keywords

  • out-of-distribution detection
  • prototypical learning
