Object Tracking Across Uncalibrated Moving Cameras
The use of multiple cameras provides extended monitoring capabilities, and mobile cameras in particular increase the flexibility of object tracking in surveillance scenarios. However, in a multi-camera setting the cameras have different physical properties and different views of the objects, which makes tracking a challenging task. In this paper, we address these problems and propose a novel approach to tracking across multiple moving cameras. The proposed method relaxes the constraints imposed by many other approaches: it assumes neither calibrated cameras nor planar scenes. Our method is based on the multi-view geometry between cameras with overlapping fields of view. However, the well-known epipolar geometry of static scenes with stationary cameras, captured by the fundamental matrix, is not suitable for our task. We therefore extend standard epipolar geometry to the geometry of dynamic scenes in which the cameras move; in this new setting, the fundamental matrix becomes a matrix function. Tracking is then achieved using the properties of this fundamental matrix function, without direct computation of the camera geometry. In a set of experiments, the proposed tracking method shows promising performance.
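As a concrete illustration of the underlying constraint (a minimal sketch, not the authors' implementation), corresponding object positions in two views must lie on each other's epipolar lines. Assuming the time-varying fundamental matrix F(t) has already been estimated for the current frame pair, cross-camera association can be scored by symmetric epipolar distance; the function and point names below are hypothetical:

```python
import numpy as np

def epipolar_distance(F, x1, x2):
    """Symmetric epipolar distance between a point x1 in camera 1 and
    a point x2 in camera 2, given fundamental matrix F (3x3).
    Points are 2D pixel coordinates; homogenized internally."""
    p1 = np.append(x1, 1.0)
    p2 = np.append(x2, 1.0)
    l2 = F @ p1         # epipolar line of x1 in image 2
    l1 = F.T @ p2       # epipolar line of x2 in image 1
    d2 = abs(p2 @ l2) / np.hypot(l2[0], l2[1])  # point-to-line distance
    d1 = abs(p1 @ l1) / np.hypot(l1[0], l1[1])
    return 0.5 * (d1 + d2)

def match_tracks(F_t, pts1, pts2):
    """Greedy association at one time instant: for each tracked point in
    view 1, pick the view-2 point with the smallest epipolar distance
    under the fundamental matrix F_t valid for this frame pair."""
    return [int(np.argmin([epipolar_distance(F_t, p, q) for q in pts2]))
            for p in pts1]
```

For example, with the fundamental matrix of a purely horizontally translated camera pair, `F = [[0,0,0],[0,0,-1],[0,1,0]]`, corresponding points share the same image row, and `epipolar_distance` is zero exactly for same-row pairs. In the paper's setting F changes per frame, so `match_tracks` would be re-evaluated with a fresh F(t) at every time step.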