Detection and Tracking of Moving Objects from Overlapping EO and IR Sensors

Jinman Kang


Abstract

We present an approach for tracking moving objects observed by EO and IR sensors mounted on a moving platform. Our approach detects and tracks the moving objects after accurately recovering the geometric relationship between the different sensors. We address the tracking problem by separately modeling the appearance and motion of the moving regions using stochastic models. The appearance of the detected blobs is described by multiple spatial distribution models of the blobs' colors and edges from the different sensors. This representation is invariant to 2D rigid and scale transformations. It provides a rich description of the object being tracked and produces an accurate blob similarity measure for tracking, especially when one of the sensors fails to provide reliable information. The motion model is obtained using a Kalman Filter (KF), which predicts the position of the moving objects while taking the camera motion into account. Tracking is performed by maximizing a joint probability model reflecting both appearance and motion. The novelty of our approach lies in defining a Joint Probability Data Association Filter (JPDAF) for integrating multiple cues from multiple sensors; it provides a unified framework for fusing information from different types of sensors. The proposed method tracks multiple moving objects through partial and total occlusions under various illumination conditions. We demonstrate the performance of the system on several real video surveillance sequences.
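The motion and fusion steps described above can be sketched as follows. This is a minimal illustrative implementation, not the paper's exact formulation: the constant-velocity state model, the `camera_shift` compensation of platform motion, and the `joint_score` combination of appearance and motion likelihoods are all simplifying assumptions made for the example.

```python
import numpy as np

class BlobKalmanFilter:
    """Constant-velocity KF over state x = [px, py, vx, vy] (assumed model)."""

    def __init__(self, x0, p0=1.0, q=0.01, r=0.5):
        self.x = np.asarray(x0, dtype=float)            # state estimate
        self.P = np.eye(4) * p0                         # state covariance
        self.F = np.array([[1, 0, 1, 0],                # state transition
                           [0, 1, 0, 1],
                           [0, 0, 1, 0],
                           [0, 0, 0, 1]], dtype=float)
        self.H = np.array([[1, 0, 0, 0],                # observe position only
                           [0, 1, 0, 0]], dtype=float)
        self.Q = np.eye(4) * q                          # process noise
        self.R = np.eye(2) * r                          # measurement noise

    def predict(self, camera_shift=(0.0, 0.0)):
        """Predict the blob position; camera_shift is the estimated
        image-plane translation induced by platform motion (an assumed
        simplification of full ego-motion compensation)."""
        self.x = self.F @ self.x
        self.x[:2] += np.asarray(camera_shift, dtype=float)
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:2].copy()

    def update(self, z):
        """Correct the state with a measured blob position z = (px, py)."""
        y = np.asarray(z, dtype=float) - self.H @ self.x   # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)           # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2].copy()

def joint_score(appearance_sim, predicted, measured, sigma=2.0):
    """Illustrative joint appearance-motion score for data association:
    appearance similarity times a Gaussian likelihood of the motion
    residual (a stand-in for the full JPDAF probability)."""
    d2 = float(np.sum((np.asarray(predicted) - np.asarray(measured)) ** 2))
    return appearance_sim * np.exp(-d2 / (2.0 * sigma ** 2))
```

For a blob at (10, 10) moving one pixel per frame along each axis while the camera shifts 0.5 pixels horizontally, `predict(camera_shift=(0.5, 0.0))` returns the compensated position (11.5, 11.0); candidate detections would then be ranked by `joint_score` and the best match fed back through `update`.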


Maintained by Philippos Mordohai