TY - JOUR
T1 - Characterizing Submanifold Region for Out-of-Distribution Detection
AU - Li, Xuhui
AU - Fang, Zhen
AU - Zhang, Yonggang
AU - Ma, Ning
AU - Bu, Jiajun
AU - Han, Bo
AU - Wang, Haishuai
N1 - This work was supported in part by the National Key R&D Program of China under Grant 2022ZD0160703, in part by the National Natural Science Foundation of China under Grants 62202422 and 62372408, and in part by Shanghai Artificial Intelligence Laboratory.
Publisher Copyright:
© 2024 IEEE.
PY - 2025/1
Y1 - 2025/1
AB - Detecting out-of-distribution (OOD) samples poses a significant safety challenge when deploying models in open-world scenarios. Recent works assume that OOD and in-distribution (ID) samples exhibit a distribution discrepancy, showing an encouraging direction in estimating uncertainty from embedding features or predicted outputs. Besides incorporating auxiliary outliers to shape the decision boundary, quantifying a 'meaningful distance' in embedding space as an uncertainty measure is a promising strategy. However, such distance-based approaches overlook the data structure and rely heavily on the high-dimensional features learned by deep neural networks, yielding unreliable distances due to the 'curse of dimensionality'. In this work, we propose a data structure-aware approach to mitigate the sensitivity of distances to the 'curse of dimensionality': leveraging the well-known manifold assumption, high-dimensional features are mapped to the manifold of ID samples. Specifically, we present a novel distance, termed tangent distance, which tackles the issue of generalizing meaningful distances to test samples for detecting OOD inputs. We are inspired by manifold learning for adversarial examples, where high-density adversarial regions lie close to directions orthogonal to the manifold, and by the observation that OOD and adversarial samples share a common characteristic: imperceptible perturbations with a distribution shift. We therefore propose that OOD samples lie relatively far from the ID manifold, and the tangent distance directly computes the Euclidean distance between a sample and the nearest submanifold, instantiated as a linear approximation of a local region of the manifold. We provide empirical and theoretical insights demonstrating the effectiveness of OOD uncertainty measurement on the low-dimensional subspace. Extensive experiments show that the tangent distance performs competitively with other post hoc OOD detection baselines on common and large-scale benchmarks, and our theoretical analysis supports the claim that ID samples are likely to reside in high-density regions, explaining the effectiveness of internal connections among ID data.
KW - Out-of-distribution detection (OOD)
KW - distance metric
KW - manifold
KW - tangent space
UR - https://www.scopus.com/record/display.uri?eid=2-s2.0-85205955996&origin=inward
U2 - 10.1109/TKDE.2024.3468629
DO - 10.1109/TKDE.2024.3468629
M3 - Journal article
SN - 2326-3865
VL - 37
SP - 130
EP - 147
JO - IEEE Transactions on Knowledge and Data Engineering
JF - IEEE Transactions on Knowledge and Data Engineering
IS - 1
ER -