
Efficient joint model learning, segmentation and model updating for visual tracking

© 2021 Elsevier Ltd. The tracking-by-segmentation framework is widely used in visual tracking to handle severe appearance changes such as deformation and occlusion. Tracking-by-segmentation methods first segment the target object from the background, then use the segmentation result to estimate the target state. In existing methods, target segmentation is formulated as a superpixel labeling problem subject to a target likelihood constraint, a spatial smoothness constraint and a temporal consistency constraint. The target likelihood is calculated by a discriminative part model trained independently of the superpixel labeling framework and updated online using historical tracking results as pseudo-labels. Due to the lack of spatial and temporal constraints and inaccurate pseudo-labels, the discriminative model is unreliable and may lead to tracking failure. This paper addresses these problems by integrating the objective function of model training into the target segmentation optimization framework. During optimization, the discriminative model is therefore constrained by the spatial and temporal constraints and provides more accurate target likelihoods for part labeling, while the labeling results in turn produce more reliable pseudo-labels for model learning. Moreover, we propose a supervision switch mechanism that detects erroneous pseudo-labels caused by a severe change in data distribution and, in such cases, switches the classifier to a semi-supervised setting. Evaluation results on the OTB2013, OTB2015 and TC-128 benchmarks demonstrate the effectiveness of the proposed tracking algorithm.
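As a rough illustration of the joint formulation the abstract describes, the superpixel labels and the discriminative model can be optimised under a single objective; the specific terms, weights and notation below are illustrative assumptions, not the paper's published energy function:

\begin{equation*}
\min_{\mathbf{y},\,\boldsymbol{\theta}}\;
\underbrace{\sum_{i} -\log p\bigl(y_i \mid \mathbf{x}_i;\boldsymbol{\theta}\bigr)}_{\text{target likelihood}}
\;+\; \lambda_s \underbrace{\sum_{(i,j)\in\mathcal{E}} w_{ij}\,\mathbb{1}[y_i \neq y_j]}_{\text{spatial smoothness}}
\;+\; \lambda_t \underbrace{\sum_{i} \bigl|y_i - \hat{y}_i^{\,t-1}\bigr|}_{\text{temporal consistency}}
\;+\; \lambda_m \underbrace{L\bigl(\boldsymbol{\theta};\mathbf{X},\mathbf{y}\bigr)}_{\text{model training loss}}
\end{equation*}

Here $y_i$ is the label of superpixel $i$, $\mathbf{x}_i$ its feature vector, $\hat{y}_i^{\,t-1}$ the label propagated from the previous frame, $\mathcal{E}$ the set of neighbouring superpixel pairs, and $L$ the training loss of the discriminative model (e.g. a classifier based on an extreme learning machine). Coupling $L$ to the same labels $\mathbf{y}$ is what ties model learning to the spatial and temporal constraints, which is the joint-optimisation idea the abstract emphasises.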

Keywords: Extreme learning machine; Semi-supervised learning; Tracking-by-segmentation; Visual tracking

Han W., Lekamalage C.K.L., Huang G.-B.


School of Electrical and Electronic Engineering, Nanyang Technological University

2022

Neural Networks

Indexed in: EI, SCI
ISSN: 0893-6080
Year, Volume: 2022, Vol. 147