TY - GEN
T1 - Part-dependent Label Noise: Towards Instance-dependent Label Noise
AU - Xia, Xiaobo
AU - Liu, Tongliang
AU - Han, Bo
AU - Wang, Nannan
AU - Gong, Mingming
AU - Liu, Haifeng
AU - Niu, Gang
AU - Tao, Dacheng
AU - Sugiyama, Masashi
N1 - Funding Information:
TLL was supported by Australian Research Council Project DE-190101473 and DP-180103424. BH was supported by the RGC Early Career Scheme No. 22200720, NSFC Young Scientists Fund No. 62006202, HKBU Tier-1 Start-up Grant, and HKBU CSD Start-up Grant. NNW was supported by the National Natural Science Foundation of China under Grant 61922066 and Grant 61876142. DCT was supported by Project FL-170100117, DP-180103424, and IH-180100002. GN and MS were supported by JST AIP Acceleration Research Grant Number JPMJCR20U3, Japan. The authors would like to give special thanks to Pengqian Lu for helpful discussions and comments. The authors thank the reviewers and the meta-reviewer for their helpful and constructive comments on this work.
Publisher copyright:
© (2020) by individual authors and Neural Information Processing Systems Foundation Inc. All rights reserved.
PY - 2020/12/6
Y1 - 2020/12/6
AB - Learning with \textit{instance-dependent} label noise is challenging, because it is hard to model such real-world noise. Note that there is psychological and physiological evidence showing that we humans perceive instances by decomposing them into parts. Annotators are therefore more likely to annotate instances based on parts rather than whole instances, where a wrong mapping from parts to classes may cause instance-dependent label noise. Motivated by this human cognition, in this paper, we approximate the instance-dependent label noise by exploiting \textit{part-dependent} label noise. Specifically, since instances can be approximately reconstructed by a combination of parts, we approximate the instance-dependent \textit{transition matrix} for an instance by a combination of the transition matrices for the parts of the instance. The transition matrices for parts can be learned by exploiting anchor points (i.e., data points that belong to a specific class almost surely). Empirical evaluations on synthetic and real-world datasets demonstrate that our method is superior to state-of-the-art approaches for learning with instance-dependent label noise.
UR - https://www.proceedings.com/59066.html
UR - https://www.scopus.com/inward/record.uri?eid=2-s2.0-85108453165&partnerID=40&md5=4a858a193736bb45e997a4239c7df0
U2 - 10.48550/arXiv.2006.07836
DO - 10.48550/arXiv.2006.07836
M3 - Conference proceeding
SN - 9781713829546
VL - 10
T3 - Advances in Neural Information Processing Systems
SP - 7597
EP - 7610
BT - 34th Conference on Neural Information Processing Systems (NeurIPS 2020)
A2 - Larochelle, H.
A2 - Ranzato, M.
A2 - Hadsell, R.
A2 - Balcan, M.F.
A2 - Lin, H.
PB - Neural Information Processing Systems Foundation
T2 - 34th Conference on Neural Information Processing Systems, NeurIPS 2020
Y2 - 6 December 2020 through 12 December 2020
ER -