TY - JOUR
T1 - Extended T: Learning with Mixed Closed-set and Open-set Noisy Labels
AU - Xia, Xiaobo
AU - Han, Bo
AU - Wang, Nannan
AU - Deng, Jiankang
AU - Li, Jiatong
AU - Mao, Yinian
AU - Liu, Tongliang
N1 - Funding Information:
The work of Bo Han was supported in part by the RGC Early Career Scheme under Grant 22200720, in part by the NSFC Young Scientists Fund under Grant 62006202, in part by the Guangdong Basic and Applied Basic Research Foundation under Grant 2022A1515011652, and in part by the HKBU CSD Departmental Incentive Grant. The work of Nannan Wang was supported in part by the National Natural Science Foundation of China under Grants 61922066 and 61876142, in part by the Technology Innovation Leading Program of Shaanxi under Grant 2022QFY01-15, in part by the Open Research Projects of Zhejiang Lab under Grant 2021KG0AB01. The work of Tongliang Liu was supported in part by Australian Research Council Project under Grants DE190101473, IC190100031, and DP220102121.
Publisher Copyright:
© 1979-2012 IEEE.
PY - 2023/3/1
Y1 - 2023/3/1
N2 - The noise transition matrix T, which reflects the probabilities that true labels flip into noisy ones, is of vital importance for modeling label noise and building statistically consistent classifiers. The traditional transition matrix is limited to modeling closed-set label noise, where noisy training data have true class labels within the noisy label set. It is unsuitable for modeling open-set label noise, where some true class labels lie outside the noisy label set. Therefore, in the more realistic situation where both closed-set and open-set label noise occur, prior works yield unreliable solutions. Besides, the traditional transition matrix is mostly limited to modeling instance-independent label noise, which may not perform well in practice. In this paper, we focus on learning with mixed closed-set and open-set noisy labels. We address the aforementioned issues by extending the traditional transition matrix to model mixed label noise, and further to a cluster-dependent transition matrix to better combat instance-dependent label noise in real-world applications. We term the proposed transition matrix the cluster-dependent extended transition matrix. An unbiased estimator (i.e., the extended T-estimator) is designed to estimate the cluster-dependent extended transition matrix by exploiting only the noisy data. Comprehensive experiments validate that our method copes better with realistic label noise, showing more robust performance than prior state-of-the-art label-noise learning methods.
AB - The noise transition matrix T, which reflects the probabilities that true labels flip into noisy ones, is of vital importance for modeling label noise and building statistically consistent classifiers. The traditional transition matrix is limited to modeling closed-set label noise, where noisy training data have true class labels within the noisy label set. It is unsuitable for modeling open-set label noise, where some true class labels lie outside the noisy label set. Therefore, in the more realistic situation where both closed-set and open-set label noise occur, prior works yield unreliable solutions. Besides, the traditional transition matrix is mostly limited to modeling instance-independent label noise, which may not perform well in practice. In this paper, we focus on learning with mixed closed-set and open-set noisy labels. We address the aforementioned issues by extending the traditional transition matrix to model mixed label noise, and further to a cluster-dependent transition matrix to better combat instance-dependent label noise in real-world applications. We term the proposed transition matrix the cluster-dependent extended transition matrix. An unbiased estimator (i.e., the extended T-estimator) is designed to estimate the cluster-dependent extended transition matrix by exploiting only the noisy data. Comprehensive experiments validate that our method copes better with realistic label noise, showing more robust performance than prior state-of-the-art label-noise learning methods.
KW - deep clustering
KW - instance-dependent label noise
KW - mixed noisy labels
KW - noise transition matrix
KW - robustness
UR - http://www.scopus.com/inward/record.url?scp=85131762775&partnerID=8YFLogxK
U2 - 10.1109/TPAMI.2022.3180545
DO - 10.1109/TPAMI.2022.3180545
M3 - Journal article
AN - SCOPUS:85131762775
SN - 0162-8828
VL - 45
SP - 3047
EP - 3058
JO - IEEE Transactions on Pattern Analysis and Machine Intelligence
JF - IEEE Transactions on Pattern Analysis and Machine Intelligence
IS - 3
ER -