Searching to Exploit Memorization Effect in Deep Learning with Noisy Labels

Hansi Yang, Quanming Yao*, Bo Han, James T. Kwok

*Corresponding author for this work

Research output: Contribution to journal › Journal article › peer-review


Sample selection approaches are popular in robust learning from noisy labels. However, properly controlling the selection process so that deep networks can benefit from the memorization effect is a hard problem. In this paper, motivated by the success of automated machine learning (AutoML), we propose to control the selection process by bi-level optimization. Specifically, we parameterize the selection process by exploiting the general patterns of the memorization effect in the upper level, and then update these parameters using the prediction accuracy obtained from model training in the lower level. We further introduce semi-supervised learning algorithms to utilize noisy-labeled data as unlabeled data. To solve the bi-level optimization problem efficiently, we exploit more information from the validation curvature via the Newton method and the cubic regularization method. We provide convergence analysis for both optimization methods. Results show that while both methods can converge to an (approximately) stationary point, the cubic regularization method can find a better local optimum than the Newton method in less time. Experiments on both benchmark and real-world data sets demonstrate that the proposed searching method leads to significant improvements over existing methods. Compared with existing AutoML approaches, our method is much more efficient at finding a good selection schedule.
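The bi-level structure described in the abstract can be illustrated with a toy sketch: a lower level that trains on the small-loss fraction of noisy samples, and an upper level that scores candidate selection schedules by validation accuracy. This is only an illustrative sketch under simplifying assumptions (a single scalar `selection_rate` instead of the paper's schedule parameterization, grid search instead of Newton or cubic regularization, and a plain logistic-regression learner); the names here are hypothetical, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic binary classification data with 30% label noise.
n, d = 400, 5
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y_clean = (X @ w_true > 0).astype(float)
flip = rng.random(n) < 0.3
y_noisy = np.where(flip, 1.0 - y_clean, y_clean)

# Clean validation set: the upper level judges candidates here.
Xv = rng.normal(size=(100, d))
yv = (Xv @ w_true > 0).astype(float)

def train_with_selection(rate, epochs=50, lr=0.5):
    """Lower level: logistic regression that, each epoch, keeps only
    the `rate` fraction of samples with the smallest loss
    (the small-loss trick motivated by the memorization effect)."""
    w = np.zeros(d)
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        loss = -(y_noisy * np.log(p + 1e-12)
                 + (1.0 - y_noisy) * np.log(1.0 - p + 1e-12))
        k = max(1, int(rate * n))
        keep = np.argsort(loss)[:k]  # indices of small-loss samples
        grad = X[keep].T @ (p[keep] - y_noisy[keep]) / k
        w -= lr * grad
    return w

def val_acc(w):
    return float(np.mean(((Xv @ w) > 0) == (yv > 0.5)))

# Upper level: search over candidate selection rates and keep the one
# with the best validation accuracy (rate 1.0 = no selection at all).
candidates = [0.3, 0.5, 0.7, 0.9, 1.0]
scores = {r: val_acc(train_with_selection(r)) for r in candidates}
best_rate = max(scores, key=scores.get)
```

In the paper itself, the upper-level update uses curvature information (Newton or cubic regularization steps) rather than this exhaustive grid, which is what makes the search efficient.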
Original language: English
Pages (from-to): 1-17
Number of pages: 17
Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence
Publication status: E-pub ahead of print - 29 Apr 2024

Scopus Subject Areas

  • Software
  • Artificial Intelligence
  • Applied Mathematics
  • Computer Vision and Pattern Recognition
  • Computational Theory and Mathematics

User-Defined Keywords

  • Deep learning
  • Label-noise learning
  • Automated machine learning (AutoML)
  • Nonconvex optimization


