TY - JOUR
T1 - Robust Object Tracking via Key Patch Sparse Representation
AU - He, Zhenyu
AU - Yi, Shuangyan
AU - Cheung, Yiu Ming
AU - You, Xinge
AU - Tang, Yuan Yan
N1 - Funding Information:
This work was supported in part by the Shenzhen Research Council under Grant JSGG20150331152017052 and Grant JCYJ20140819154343378, in part by the Faculty Research Grant of Hong Kong Baptist University (HKBU) under Project FRG2/12-13/082, Project FRG1/14-15/041, and Project FRG2/14-15/075, in part by the Knowledge Transfer Office of HKBU under Grant MPCF-005-2014/2015, in part by the National Natural Science Foundation of China under Grant 61272366 and Grant 61272203, in part by the National Science and Technology Research and Development Program under Grant 2015BAK36B00, and in part by the Hubei Province Science and Technology Support Program under Grant 2013BAA120.
PY - 2017/2
Y1 - 2017/2
N2 - Many conventional computer vision object tracking methods are sensitive to partial occlusion and background clutter. This is because partial occlusion or a small amount of background information may be present in the bounding box, which tends to cause drift. To this end, in this paper, we propose a robust tracker based on key patch sparse representation (KPSR) to reduce the disturbance caused by partial occlusion or unavoidable background information. Specifically, KPSR first uses patch sparse representations to obtain a score for each patch. Second, KPSR proposes a key patch selection criterion that judges the patches within the bounding box and selects the key patches according to their locations and occlusion status. Third, KPSR designs a corresponding contribution factor for each sampled patch to emphasize the contribution of the selected key patches. Experimental comparisons with eight other contemporary tracking methods on 13 benchmark video data sets show that the KPSR tracker outperforms classical and state-of-the-art tracking methods in the presence of partial occlusion, background clutter, and illumination change.
AB - Many conventional computer vision object tracking methods are sensitive to partial occlusion and background clutter. This is because partial occlusion or a small amount of background information may be present in the bounding box, which tends to cause drift. To this end, in this paper, we propose a robust tracker based on key patch sparse representation (KPSR) to reduce the disturbance caused by partial occlusion or unavoidable background information. Specifically, KPSR first uses patch sparse representations to obtain a score for each patch. Second, KPSR proposes a key patch selection criterion that judges the patches within the bounding box and selects the key patches according to their locations and occlusion status. Third, KPSR designs a corresponding contribution factor for each sampled patch to emphasize the contribution of the selected key patches. Experimental comparisons with eight other contemporary tracking methods on 13 benchmark video data sets show that the KPSR tracker outperforms classical and state-of-the-art tracking methods in the presence of partial occlusion, background clutter, and illumination change.
KW - occlusion prediction scheme
KW - particle filter
KW - patch sparse representation
KW - template update
KW - visual object tracking
UR - http://www.scopus.com/inward/record.url?scp=84960539269&partnerID=8YFLogxK
U2 - 10.1109/TCYB.2016.2514714
DO - 10.1109/TCYB.2016.2514714
M3 - Journal article
C2 - 26978838
AN - SCOPUS:84960539269
SN - 2168-2267
VL - 47
SP - 354
EP - 364
JO - IEEE Transactions on Cybernetics
JF - IEEE Transactions on Cybernetics
IS - 2
M1 - 7432008
ER -