Related Researcher

Sim, Jae-Young (심재영), Visual Information Processing Lab.

Detailed Information

Visual tracking using pertinent patch selection and masking

Author(s)
Lee, Dae-Youn; Sim, Jae-Young; Kim, Chang-Su
Issued Date
2014-06-27
DOI
10.1109/CVPR.2014.446
URI
https://scholarworks.unist.ac.kr/handle/201301/46609
Fulltext
https://ieeexplore.ieee.org/document/6909841?arnumber=6909841
Citation
IEEE Conference on Computer Vision and Pattern Recognition, pp.3486 - 3493
Abstract
A novel visual tracking algorithm using patch-based appearance models is proposed in this paper. We first divide the bounding box of a target object into multiple patches and then select only pertinent patches, which occur repeatedly near the center of the bounding box, to construct the foreground appearance model. We also divide the input image into non-overlapping blocks, construct a background model at each block location, and integrate these background models for tracking. Using the appearance models, we obtain an accurate foreground probability map. Finally, we estimate the optimal object position by maximizing the likelihood, which is obtained by convolving the foreground probability map with the pertinence mask. Experimental results demonstrate that the proposed algorithm outperforms state-of-the-art tracking algorithms significantly in terms of center position errors and success rates.
Publisher
IEEE Computer Society
ISSN
1063-6919
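
The localization step described at the end of the abstract, estimating the object position by maximizing the likelihood obtained by convolving the foreground probability map with the pertinence mask, can be illustrated with a minimal sketch. The function name localize_target, the uniform placeholder mask, and the synthetic probability map below are assumptions for illustration only; they do not reproduce the paper's patch-based appearance models or its pertinent-patch selection.

```python
import numpy as np
from scipy.signal import fftconvolve

def localize_target(foreground_prob, pertinence_mask):
    """Estimate the object position from a foreground probability map.

    foreground_prob : 2-D array of per-pixel foreground probabilities
        (the paper builds this from patch-based foreground and block-wise
        background appearance models; here it is given directly).
    pertinence_mask : 2-D array weighting the pertinent patches inside
        the bounding box (a uniform placeholder for the paper's mask).
    """
    # Likelihood of each candidate center: convolution of the probability
    # map with the pertinence mask (mode="same" keeps image coordinates).
    likelihood = fftconvolve(foreground_prob, pertinence_mask, mode="same")

    # The estimated object position maximizes the likelihood.
    y, x = np.unravel_index(np.argmax(likelihood), likelihood.shape)
    return (y, x), likelihood

# Toy usage: a synthetic 200x200 probability map with a bright 40x40 region.
prob = np.zeros((200, 200))
prob[80:120, 60:100] = 0.9
mask = np.ones((40, 40))          # uniform mask stands in for the real one
(center_y, center_x), _ = localize_target(prob, mask)
print(center_y, center_x)         # near (100, 80), the bright region's center
```

FFT-based convolution evaluates the likelihood at every candidate center in a single array operation, avoiding an explicit sliding-window loop; any equivalent correlation routine would serve the same illustrative purpose.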
