Source Code
Background Modeling
Bayesian Object Detection in Dynamic Scenes
This code performs background modeling and foreground estimation in dynamic scenes captured by static cameras. The implemented algorithm has three innovations over existing approaches. First, the correlation in intensities of spatially proximal pixels is exploited by using a nonparametric density estimation method over a joint domain-range representation of image pixels; multimodal spatial uncertainties and complex dependencies between the domain (location) and range (color) are modeled directly. The background is modeled as a single probability density, as opposed to individual, independent, pixel-wise distributions.
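As a concrete illustration of the single-density idea, the following is a minimal sketch (Python, not the released MATLAB code) of evaluating a joint domain-range kernel density over (x, y, r, g, b) samples pooled from the whole image. The Gaussian kernels, diagonal bandwidths, and toy data are illustrative assumptions, not the paper's exact settings.

import numpy as np

def background_likelihood(pixel, samples, bandwidths):
    """Estimate p(x, y, r, g, b | background) with a Gaussian kernel density.

    pixel      : (5,) array   -- query feature (x, y, r, g, b)
    samples    : (N, 5) array -- background samples pooled over the whole image
    bandwidths : (5,) array   -- kernel bandwidth per dimension (diagonal H)
    """
    diffs = (samples - pixel) / bandwidths               # (N, 5) standardized offsets
    log_norm = -0.5 * np.sum(np.log(2.0 * np.pi * bandwidths ** 2))
    kernel_vals = np.exp(-0.5 * np.sum(diffs ** 2, axis=1) + log_norm)
    return kernel_vals.mean()                             # one density shared by all pixels

# Toy usage: samples clustered around one location/color, then a nearby query.
rng = np.random.default_rng(0)
samples = rng.normal(loc=[1, 1, 120, 110, 100], scale=[0.5, 0.5, 5, 5, 5], size=(500, 5))
bandwidths = np.array([1.0, 1.0, 8.0, 8.0, 8.0])
print(background_likelihood(np.array([1, 1, 118, 112, 101]), samples, bandwidths))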
Second, temporal persistence is used as a detection criterion. Unlike previous approaches to object detection, which detect objects by building adaptive models of the background only, the foreground is also modeled to augment the detection of objects (without explicit tracking), since objects detected in the preceding frame contain substantial evidence for detection in the current frame.
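A hedged sketch of how temporal persistence could enter the decision: a foreground density built from pixels labeled foreground in preceding frames, mixed with a uniform component so newly appearing objects are not suppressed. The mixing weight alpha, the uniform value, and the reuse of the background_likelihood helper from the previous sketch are illustrative choices, not the paper's constants.

import numpy as np

def foreground_likelihood(pixel, fg_samples, bandwidths, alpha=0.5, uniform_val=1e-8):
    if len(fg_samples) == 0:
        return uniform_val                                # no prior evidence: uniform only
    # Same kernel density estimate as before, evaluated on foreground samples.
    kde_val = background_likelihood(pixel, fg_samples, bandwidths)
    return alpha * uniform_val + (1.0 - alpha) * kde_val

def log_likelihood_ratio(pixel, bg_samples, fg_samples, bandwidths):
    """Positive values favor foreground; negative values favor background."""
    lf = foreground_likelihood(pixel, fg_samples, bandwidths)
    lb = background_likelihood(pixel, bg_samples, bandwidths)
    return np.log(lf) - np.log(lb)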
Finally, the background and foreground models are used competitively in a MAP-MRF decision framework that stresses spatial context as a condition for detecting interesting objects; the posterior function is maximized efficiently by finding the minimum cut of a capacitated graph.
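The sketch below illustrates the min-cut step on a toy per-pixel log-likelihood-ratio map, using a generic Ising-style graph construction with networkx; the smoothness weight lam and the input ratios are made-up values, and the released code performs this step in MATLAB.

import networkx as nx
import numpy as np

def segment_map_mrf(log_ratio, lam=1.0):
    """log_ratio: (H, W) array of log p(fg)/p(bg) per pixel.
    Returns a boolean (H, W) foreground mask from the minimum cut."""
    H, W = log_ratio.shape
    G = nx.DiGraph()
    src, snk = 's', 't'
    for i in range(H):
        for j in range(W):
            p = (i, j)
            # Data terms: s->p is the cost of labeling p background,
            # p->t is the cost of labeling p foreground.
            G.add_edge(src, p, capacity=max(log_ratio[i, j], 0.0))
            G.add_edge(p, snk, capacity=max(-log_ratio[i, j], 0.0))
            # Smoothness terms on 4-connected neighbors (symmetric Ising prior).
            for q in [(i + 1, j), (i, j + 1)]:
                if q[0] < H and q[1] < W:
                    G.add_edge(p, q, capacity=lam)
                    G.add_edge(q, p, capacity=lam)
    _, (src_side, _) = nx.minimum_cut(G, src, snk)
    mask = np.zeros((H, W), dtype=bool)
    for node in src_side:
        if node != src:
            mask[node] = True                    # source side = foreground
    return mask

# Toy example: a 2x2 block of foreground evidence inside a 6x6 frame.
ratios = -1.0 * np.ones((6, 6))
ratios[2:4, 2:4] = 2.0
print(segment_map_mrf(ratios, lam=0.5))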
This method is useful for moving object detection in scenes containing dynamic backgrounds, such as fountains, fans, and moving trees. The entry point for background modeling is Main.m.
Yaser Sheikh and Mubarak Shah, "Bayesian Modeling of Dynamic Scenes for Object Detection," IEEE Transactions on Pattern Analysis and Machine Intelligence (PAMI), Vol. 27, No. 11, pp. 1778-1792, November 2005.