Abstract
Sample selection approaches are popular in robust learning from noisy labels. However, how to control the selection process properly so that deep networks can benefit from the memorization effect is a hard problem. In this paper, motivated by the success of automated machine learning (AutoML), we propose to control the selection process by bi-level optimization. Specifically, we parameterize the selection process by exploiting the general patterns of the memorization effect in the upper level, and then update these parameters using the prediction accuracy obtained from model training in the lower level. We further introduce semi-supervised learning algorithms to utilize noisy-labeled data as unlabeled data. To solve the bi-level optimization problem efficiently, we exploit additional information from the validation curvature via the Newton method and the cubic regularization method. We provide convergence analysis for both optimization methods. Results show that while both methods can converge to an (approximately) stationary point, the cubic regularization method finds a better local optimum than the Newton method in less time. Experiments on both benchmark and real-world data sets demonstrate that the proposed search method leads to significant improvements upon existing methods. Compared with existing AutoML approaches, our method is much more efficient at finding a good selection schedule.
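To make the bi-level structure concrete, the following is a minimal illustrative sketch (not the paper's actual algorithm): the lower level trains a model on only the small-loss fraction of noisy-labeled samples (the selection criterion associated with the memorization effect), while the upper level chooses the selection parameter by maximizing accuracy on a clean validation set. A grid search stands in for the Newton / cubic-regularization steps used in the paper; all names and data here are synthetic assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: linearly separable features with 30% symmetric label noise.
n, d = 400, 5
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y_clean = (X @ w_true > 0).astype(int)
flip = rng.random(n) < 0.3
y_noisy = np.where(flip, 1 - y_clean, y_clean)

# Small clean validation set, used only by the upper level.
Xv = rng.normal(size=(80, d))
yv = (Xv @ w_true > 0).astype(int)

def train_lower(keep_ratio, epochs=50, lr=0.5):
    """Lower level: logistic regression trained on the keep_ratio
    fraction of samples with the smallest current loss
    (the 'small-loss' selection criterion)."""
    w = np.zeros(d)
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w)))
        losses = -(y_noisy * np.log(p + 1e-12)
                   + (1 - y_noisy) * np.log(1 - p + 1e-12))
        k = max(1, int(keep_ratio * n))
        idx = np.argsort(losses)[:k]  # keep only small-loss samples
        grad = X[idx].T @ (p[idx] - y_noisy[idx]) / k
        w -= lr * grad
    return w

def val_acc(w):
    """Upper-level objective: accuracy on the clean validation set."""
    return np.mean(((Xv @ w) > 0).astype(int) == yv)

# Upper level: choose the selection parameter by validation performance.
ratios = np.linspace(0.2, 1.0, 9)
best_ratio = max(ratios, key=lambda r: val_acc(train_lower(r)))
```

In the paper the selection schedule is richer than a single keep ratio and the upper-level update uses curvature information rather than exhaustive search, but the nesting of the two objectives is the same.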
| Original language | English |
|---|---|
| Pages (from-to) | 7833-7849 |
| Number of pages | 17 |
| Journal | IEEE Transactions on Pattern Analysis and Machine Intelligence |
| Volume | 46 |
| Issue number | 12 |
| Early online date | 29 Apr 2024 |
| DOIs | |
| Publication status | Published - Dec 2024 |
Scopus Subject Areas
- Software
- Artificial Intelligence
- Applied Mathematics
- Computer Vision and Pattern Recognition
- Computational Theory and Mathematics
User-Defined Keywords
- Automated machine learning (AutoML)
- Deep learning
- Label-noise learning
- Nonconvex optimization