Abstract
For data with long-tailed distributions, existing classification models tend to focus overwhelmingly on the head classes while neglecting the tail classes, resulting in poor generalization. To address this problem, we propose a new approach in which a key point sensitive (KPS) loss strongly regularizes the key points to improve the generalization performance of the classification model. To further improve performance on tail classes, the proposed KPS loss also assigns relatively large margins to them. In addition, we propose a gradient adjustment (GA) optimization strategy to re-balance the gradients of positive and negative samples for each class. Gradient analysis of the loss function reveals that the tail classes persistently receive negative signals during training, which biases tail predictions towards the head. The proposed GA strategy circumvents excessive negative signals on tail classes and further improves overall classification accuracy. Extensive experiments on long-tailed benchmarks show that the proposed method significantly improves classification accuracy on tail classes while maintaining competitive performance on head classes.
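As a rough illustration of the margin idea the abstract describes (larger margins for rarer classes), the sketch below assigns a per-class margin that grows as the class count shrinks, and subtracts each sample's target-class margin from its logit before the softmax. The `n^{-1/4}` margin schedule and the `max_margin` scale are assumptions for illustration only, not the paper's actual KPS formulation:

```python
import numpy as np

def class_margins(class_counts, max_margin=0.5):
    # Larger margins for rarer (tail) classes: m_j proportional to n_j^(-1/4),
    # rescaled so the rarest class gets max_margin. This schedule is an
    # assumption in the spirit of label-distribution-aware margins.
    m = 1.0 / np.power(class_counts, 0.25)
    return max_margin * m / m.max()

def margin_cross_entropy(logits, labels, margins):
    # Subtract the per-class margin from each sample's target logit, so that
    # tail classes must be predicted with a larger score gap to incur low loss.
    adjusted = logits.astype(float).copy()
    adjusted[np.arange(len(labels)), labels] -= margins[labels]
    # Numerically stable log-softmax.
    z = adjusted - adjusted.max(axis=1, keepdims=True)
    log_probs = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(labels)), labels].mean()
```

Because the margin is subtracted only from the target logit, the loss on a tail-class sample stays high until its score exceeds the competitors by at least that margin, which is one common way to push decision boundaries away from rare classes.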
Original language | English |
---|---|
Pages (from-to) | 4812-4825 |
Number of pages | 14 |
Journal | IEEE Transactions on Pattern Analysis and Machine Intelligence |
Volume | 45 |
Issue number | 4 |
Early online date | 3 Aug 2022 |
DOIs | |
Publication status | Published - 1 Apr 2023 |
Scopus Subject Areas
- Software
- Artificial Intelligence
- Applied Mathematics
- Computer Vision and Pattern Recognition
- Computational Theory and Mathematics
User-Defined Keywords
- Long-tailed classification
- class imbalance
- imbalance learning
- long-tailed visual recognition