Robust Visual Tracking Via Consistent Low-Rank Sparse Learning

Tianzhu Zhang, Si Liu, Narendra Ahuja, Ming-Hsuan Yang, Bernard Ghanem
"Robust Visual Tracking Via Consistent Low-Rank Sparse Learning"
International Journal of Computer Vision (IJCV 2014)

Visual tracking, Temporal consistency, Sparse representation, Low-rank representation
2014
Object tracking is the process of determining the states of a target in consecutive video frames based on motion and appearance consistency. In this paper, we propose a consistent low-rank sparse tracker (CLRST) that builds upon the particle filter framework for tracking. By exploiting temporal consistency, the proposed CLRST algorithm adaptively prunes and selects candidate particles. The proposed method learns sparse representations of the image regions corresponding to candidate particles jointly, as linear combinations of dictionary templates, while exploiting the underlying low-rank constraints. In addition, the proposed CLRST algorithm is computationally attractive since the temporal consistency property helps prune particles, and the low-rank minimization problem for learning joint sparse representations can be solved efficiently by a sequence of closed-form update operations. We evaluate the proposed CLRST algorithm against 14 state-of-the-art tracking methods on a set of 25 challenging image sequences. Experimental results show that the CLRST algorithm performs favorably against state-of-the-art tracking methods in terms of accuracy and execution time.
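To illustrate the kind of joint low-rank and sparse coding step the abstract refers to, the sketch below codes a matrix of particle observations X over a template dictionary D under a nuclear-norm plus L1 objective. This is only a minimal illustration under assumptions: the variable names, the regularization weights lam and gamma, and the generic ADMM splitting used here are not taken from the paper, which solves its own formulation with its own sequence of closed-form updates.

```python
import numpy as np

def svt(M, tau):
    """Singular value thresholding: proximal operator of tau * nuclear norm."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def soft(M, tau):
    """Soft thresholding: proximal operator of tau * L1 norm."""
    return np.sign(M) * np.maximum(np.abs(M) - tau, 0.0)

def low_rank_sparse_coding(X, D, lam=0.1, gamma=10.0, mu=1.0, n_iter=100):
    """Jointly code candidate particles X (d x n) over a template dictionary
    D (d x k) with a representation Z (k x n) that is encouraged to be both
    low-rank (shared structure across particles) and sparse (few templates
    per particle):

        min_Z  ||Z||_* + lam * ||Z||_1 + (gamma / 2) * ||X - D Z||_F^2

    Solved here by a generic ADMM splitting Z = J = S (illustrative only).
    """
    k = D.shape[1]
    n = X.shape[1]
    Z = np.zeros((k, n))
    J = Z.copy()
    S = Z.copy()
    U1 = np.zeros_like(Z)  # scaled dual variable for Z = J
    U2 = np.zeros_like(Z)  # scaled dual variable for Z = S
    A = gamma * D.T @ D + 2.0 * mu * np.eye(k)
    for _ in range(n_iter):
        rhs = gamma * D.T @ X + mu * (J - U1) + mu * (S - U2)
        Z = np.linalg.solve(A, rhs)   # quadratic subproblem in Z
        J = svt(Z + U1, 1.0 / mu)     # low-rank (nuclear-norm) proximal step
        S = soft(Z + U2, lam / mu)    # sparse (L1) proximal step
        U1 += Z - J                   # dual ascent updates
        U2 += Z - S
    return Z
```

In a particle-filter tracker of this kind, the learned coefficients would typically be used to score each candidate particle, e.g. by its reconstruction error over the target templates, and particles with negligible coefficients relative to the previous frame could be pruned; the exact pruning rule and observation likelihood in CLRST are defined in the paper itself.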