Class-Dependent Label-Noise Learning with Cycle-Consistency Regularization

De Cheng, Yixiong Ning, Nannan Wang*, Xinbo Gao, Heng Yang, Yuxuan Du, Bo Han, Tongliang Liu

*Corresponding author for this work

Research output: Chapter in book/report/conference proceeding › Conference proceeding › peer-review

19 Citations (Scopus)

Abstract

In label-noise learning, estimating the transition matrix plays an important role in building statistically consistent classifiers. The current state-of-the-art consistent estimator for the transition matrix has been developed under the recently proposed sufficiently scattered assumption, by incorporating a minimum-volume constraint on the transition matrix T into label-noise learning. Computing the volume of T relies heavily on the estimated noisy class posterior. However, the estimation error of the noisy class posterior is often large, since deep learning methods tend to overfit the noisy labels. Directly minimizing the volume of a T obtained in this way can therefore lead to a poorly estimated transition matrix. How to reduce the side effects of the inaccurate noisy class posterior remains unsolved. In this paper, we propose to estimate the transition matrix under a forward-backward cycle-consistency regularization, which greatly reduces the dependency of the transition-matrix estimation on the noisy class posterior. Extensive experimental results consistently demonstrate the effectiveness of the proposed method in reducing the estimation error of the transition matrix and substantially boosting classification performance.
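The following is a minimal, hypothetical PyTorch sketch of how a forward-backward cycle-consistency regularizer for transition-matrix estimation could be combined with a forward-correction loss. It is not the authors' implementation; the class name `CycleConsistencyLoss`, the parameter names `T_fwd`/`T_bwd`, and the weight `lam` are illustrative assumptions.

```python
# Illustrative sketch only: the names and the exact loss form are assumptions,
# not the method released with the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F


class CycleConsistencyLoss(nn.Module):
    """Forward-corrected cross-entropy plus a cycle term that encourages the
    backward transition matrix to invert the forward one, so the forward
    matrix is constrained without relying solely on the noisy class posterior."""

    def __init__(self, num_classes: int, lam: float = 0.1):
        super().__init__()
        # Unconstrained parameters; a row-wise softmax keeps each matrix
        # row-stochastic. Initialization is biased toward the identity.
        self.T_fwd = nn.Parameter(torch.eye(num_classes) * 4.0)  # clean -> noisy
        self.T_bwd = nn.Parameter(torch.eye(num_classes) * 4.0)  # noisy -> clean
        self.lam = lam
        self.num_classes = num_classes

    def forward(self, logits: torch.Tensor, noisy_labels: torch.Tensor) -> torch.Tensor:
        T_fwd = F.softmax(self.T_fwd, dim=1)  # estimated P(noisy | clean)
        T_bwd = F.softmax(self.T_bwd, dim=1)  # estimated P(clean | noisy)

        clean_post = F.softmax(logits, dim=1)  # estimated clean class posterior
        noisy_post = clean_post @ T_fwd        # forward correction: clean -> noisy
        ce = F.nll_loss(torch.log(noisy_post + 1e-8), noisy_labels)

        # Cycle term: mapping clean -> noisy -> clean should act as the identity.
        eye = torch.eye(self.num_classes, device=T_fwd.device)
        cycle = torch.norm(T_fwd @ T_bwd - eye, p="fro")
        return ce + self.lam * cycle


# Usage (illustrative):
# criterion = CycleConsistencyLoss(num_classes=10)
# loss = criterion(model(images), noisy_labels)
# loss.backward()
```

In this sketch the cycle term pushes the product of the forward and backward transition matrices toward the identity, which constrains the forward matrix directly rather than through the volume of a T derived from a possibly overfitted noisy class posterior.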

Original language: English
Title of host publication: NIPS '22: Proceedings of the 36th International Conference on Neural Information Processing Systems
Editors: S. Koyejo, S. Mohamed, A. Agarwal, D. Belgrave, K. Cho, A. Oh
Publisher: Neural Information Processing Systems Foundation
Pages: 11104-11116
Number of pages: 13
ISBN (Print): 9781713871088
Publication status: Published - 28 Nov 2022
Event: 36th Conference on Neural Information Processing Systems, NeurIPS 2022 - New Orleans Convention Center, New Orleans, United States
Duration: 28 Nov 2022 - 9 Dec 2022
https://neurips.cc/Conferences/2022
https://openreview.net/group?id=NeurIPS.cc/2022/Conference
https://proceedings.neurips.cc/paper_files/paper/2022

Publication series

Name: Advances in Neural Information Processing Systems
Volume: 35
ISSN (Print): 1049-5258

Conference

Conference: 36th Conference on Neural Information Processing Systems, NeurIPS 2022
Country/Territory: United States
City: New Orleans
Period: 28/11/22 - 9/12/22

Scopus Subject Areas

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing
