Accurate scale estimation of a target is a challenging research problem in visual object tracking. Most state-of-the-art methods employ an exhaustive scale search to estimate the target size. The exhaustive search strategy is computationally expensive and struggles when encountered with large scale variations. This paper investigates the problem of accurate and robust scale estimation in a tracking-by-detection framework. We propose a novel scale adaptive tracking approach by learning separate discriminative correlation filters for translation and scale estimation. The explicit scale filter is learned online using the target appearance sampled at a set of different scales. Contrary to standard approaches, our method directly learns the appearance change induced by variations in the target scale. Additionally, we investigate strategies to reduce the computational cost of our approach. Extensive experiments are performed on the OTB and the VOT2014 datasets. Compared to the standard exhaustive scale search, our approach achieves a gain of 2.5% in average overlap precision on the OTB dataset. Additionally, our method is computationally efficient, operating at a 50% higher frame rate compared to the exhaustive scale search. Our method obtains the top rank in performance by outperforming 19 state-of-the-art trackers on OTB and 37 state-of-the-art trackers on VOT2014.
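The core idea of the separate scale filter can be illustrated with a minimal sketch: extract one feature vector per scale sample around the target, then learn a one-dimensional discriminative correlation filter over the scale dimension in the Fourier domain. This is only a hedged illustration of the general technique; the function names (`train_scale_filter`, `scale_response`) and parameters (`sigma`, `lam`) are assumptions for the example, not the authors' implementation.

```python
import numpy as np

def train_scale_filter(feats, sigma=1.0, lam=1e-2):
    """feats: (d, S) array with one d-dimensional feature vector per
    scale sample. Returns the Fourier-domain numerator (per channel)
    and denominator of a 1-D correlation filter over scales."""
    d, S = feats.shape
    # Desired correlation output: a Gaussian peaked at the current
    # (middle) scale index.
    g = np.exp(-0.5 * ((np.arange(S) - S // 2) / sigma) ** 2)
    G = np.fft.fft(g)
    F = np.fft.fft(feats, axis=1)          # FFT along the scale dimension
    num = np.conj(G)[None, :] * F          # per-channel numerator
    den = (F * np.conj(F)).sum(axis=0).real + lam  # shared denominator
    return num, den

def scale_response(num, den, feats):
    """Correlate the learned filter with multi-scale features of a new
    frame; the argmax of the response is the estimated scale index."""
    Z = np.fft.fft(feats, axis=1)
    resp = np.fft.ifft((np.conj(num) * Z).sum(axis=0) / den)
    return resp.real

# Toy usage: 32-dimensional features at 17 scales. On the training
# sample itself, the response peaks at the middle scale index (8).
rng = np.random.default_rng(0)
feats = rng.standard_normal((32, 17))
num, den = train_scale_filter(feats)
resp = scale_response(num, den, feats)
print(int(np.argmax(resp)))  # → 8
```

In a tracker loop, translation would be estimated first with a 2-D filter, after which this 1-D filter is evaluated on a small pyramid of scale samples, making the scale search far cheaper than exhaustively re-running the full detector at every scale.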