Pyramid Correlation based Deep Hough Voting for Visual Object Tracking

Ying Wang (Beijing Institute of Technology)*; Tingfa Xu (Beijing Institute of Technology); Jianan Li (Beijing Institute of Technology); Shenwang Jiang (Beijing Institute of Technology); Junjie Chen (Beijing Institute of Technology)

Abstract

Most existing Siamese-based trackers treat the tracking problem as a combination of parallel classification and regression tasks. However, some studies show that this sibling-head structure can lead to suboptimal solutions during network training. Through experiments, we find that, without regression, performance can be equally promising as long as the network is carefully designed to suit the training objective. We introduce a novel voting-based, classification-only tracking algorithm named Pyramid Correlation based Deep Hough Voting (PCDHV for short), which jointly locates the top-left and bottom-right corners of the target. We construct a Pyramid Correlation module to equip the embedded feature with fine-grained local structures and global spatial context; the elaborately designed Deep Hough Voting module then takes over, integrating long-range pixel dependencies to perceive corners. In addition, the prevalent discretization gap is alleviated in a simple yet effective manner by increasing the spatial resolution of the feature maps while exploiting channel-space relationships. The algorithm is general, robust and simple. We demonstrate the effectiveness of each module through a series of ablation experiments. Without bells and whistles, our tracker achieves better or comparable performance to state-of-the-art (SOTA) algorithms on three challenging benchmarks (TrackingNet, GOT-10k and LaSOT) while running at over 80 FPS. Code and models will be released.
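To make the architectural description above concrete, the following is a minimal, hypothetical sketch (not the authors' implementation) of the two key ideas: a pyramid of depth-wise cross-correlations between template and search embeddings, and a classification-only head that predicts heatmaps for the top-left and bottom-right corners. Module names, pyramid scales, feature sizes and channel counts are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


def depthwise_xcorr(search, template):
    """Depth-wise cross-correlation of a search feature map with a template kernel."""
    b, c, h, w = search.shape
    kernel = template.reshape(b * c, 1, template.size(2), template.size(3))
    out = F.conv2d(search.reshape(1, b * c, h, w), kernel, groups=b * c)
    return out.reshape(b, c, out.size(2), out.size(3))


class PyramidCorrelation(nn.Module):
    """Correlate the search feature with the template pooled to several sizes,
    so the fused embedding carries both fine local structure and global context.
    (Scales and fusion scheme are assumptions for illustration.)"""

    def __init__(self, channels=256, scales=(2, 4, 6)):
        super().__init__()
        self.scales = scales
        self.fuse = nn.Conv2d(channels * len(scales), channels, kernel_size=1)

    def forward(self, search, template):
        maps = []
        for s in self.scales:
            k = F.adaptive_avg_pool2d(template, s)           # template kernel at scale s
            corr = depthwise_xcorr(search, k)                # (B, C, H-s+1, W-s+1)
            corr = F.interpolate(corr, size=search.shape[-2:],
                                 mode='bilinear', align_corners=False)
            maps.append(corr)
        return self.fuse(torch.cat(maps, dim=1))


class CornerHead(nn.Module):
    """Classification-only head: upsample the fused feature (reducing the
    discretization gap) and predict one heatmap per corner; the box is read
    off the two argmax locations."""

    def __init__(self, channels=256):
        super().__init__()
        self.up = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(inplace=True),
            nn.Upsample(scale_factor=2, mode='bilinear', align_corners=False),
        )
        self.heatmaps = nn.Conv2d(channels, 2, kernel_size=1)  # top-left, bottom-right

    def forward(self, x):
        return self.heatmaps(self.up(x))


if __name__ == "__main__":
    search = torch.randn(1, 256, 31, 31)     # search-region embedding (assumed size)
    template = torch.randn(1, 256, 15, 15)   # template embedding (assumed size)
    fused = PyramidCorrelation()(search, template)
    corners = CornerHead()(fused)            # (1, 2, 62, 62) corner heatmaps
    print(corners.shape)
```

In this sketch, corner voting is approximated by plain heatmap prediction; the paper's Deep Hough Voting module instead aggregates long-range pixel evidence before localizing each corner.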