Robust Object Tracking via Key Patch Sparse Representation

Zhenyu He, Shuangyan Yi, Yiu Ming Cheung, Xinge You, Yuan Yan Tang

Research output: Contribution to journal › Journal article › peer-review

204 Citations (Scopus)


Many conventional computer vision object tracking methods are sensitive to partial occlusion and background clutter, because the bounding box may contain occluded regions or a small amount of background, which tends to cause drift. To address this, we propose a robust tracker based on key patch sparse representation (KPSR) that reduces the disturbance caused by partial occlusion and unavoidable background information. Specifically, KPSR first uses patch sparse representation to compute a score for each patch. Second, KPSR applies a key patch selection criterion that judges the patches within the bounding box and selects the key patches according to their location and occlusion state. Third, KPSR assigns a contribution factor to each sampled patch to emphasize the contribution of the selected key patches. Comparing KPSR with eight other contemporary tracking methods on 13 benchmark video data sets, the experimental results show that the KPSR tracker outperforms classical and state-of-the-art tracking methods in the presence of partial occlusion, background clutter, and illumination change.
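The pipeline described in the abstract — sparse-coding each patch against a template dictionary, scoring patches by reconstruction error, and down-weighting occluded patches via contribution factors — can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the function names, the ISTA solver, the occlusion threshold, and the binary contribution factor are all illustrative choices.

```python
import numpy as np

def sparse_code(D, y, lam=0.1, n_iter=200):
    """Solve min_a ||y - D a||^2 / 2 + lam * ||a||_1 with ISTA.

    D: (d, k) dictionary of template patches (columns), y: (d,) candidate patch.
    """
    L = np.linalg.norm(D, 2) ** 2  # Lipschitz constant of the gradient
    a = np.zeros(D.shape[1])
    for _ in range(n_iter):
        z = a - D.T @ (D @ a - y) / L          # gradient step
        a = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return a

def patch_scores(D, patches, lam=0.1):
    """Reconstruction error per patch; a lower error means a better template match."""
    errs = []
    for y in patches:
        a = sparse_code(D, y, lam)
        errs.append(float(np.linalg.norm(y - D @ a) ** 2))
    return np.array(errs)

def candidate_likelihood(errs, occ_thresh=0.5):
    """Combine patch scores with contribution factors (hypothetical binary weights):
    patches whose error exceeds the threshold are treated as occluded and excluded."""
    w = np.where(errs < occ_thresh, 1.0, 0.0)
    if w.sum() == 0:
        return 0.0  # every patch judged occluded
    return float(np.exp(-(w * errs).sum() / w.sum()))
```

In a particle-filter tracker (one of the paper's keywords), a likelihood of this form would weight each sampled candidate bounding box; the key-patch selection step would further restrict which patches enter the weighted sum.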

Original language: English
Article number: 7432008
Pages (from-to): 354-364
Number of pages: 11
Journal: IEEE Transactions on Cybernetics
Issue number: 2
Publication status: Published - Feb 2017

Scopus Subject Areas

  • Software
  • Control and Systems Engineering
  • Information Systems
  • Human-Computer Interaction
  • Computer Science Applications
  • Electrical and Electronic Engineering

User-Defined Keywords

  • Occlusion prediction scheme
  • particle filter
  • patch sparse representation
  • template update
  • visual object tracking


