Monitoring Head/Eye Motion for Driver Alertness with One Camera
We are developing a system for analyzing human driver alertness in mobile and stationary vehicles. It relies on optical flow combined with color predicates to robustly track a person's head and facial features. We then determine what kind of activity the driver is performing: we currently classify head rotation in all viewing directions, detect eye/mouth occlusion, detect eye blinking, and recover the 3D gaze direction of the eyes. We show results and discuss how this system can be used for monitoring driver alertness.
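To illustrate the color-predicate idea, here is a minimal sketch of one common formulation: a boolean lookup table over quantized RGB space, trained from example skin pixels. The function names, the bin count, and the `min_count` threshold are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

BINS = 32  # quantization levels per color channel (a hypothetical choice)

def train_color_predicate(skin_pixels, min_count=1):
    """Build a lookup table marking each quantized RGB cell that was
    observed at least min_count times among the training skin pixels."""
    table = np.zeros((BINS, BINS, BINS), dtype=bool)
    idx = (np.asarray(skin_pixels) * BINS // 256).astype(int)
    cells, counts = np.unique(idx, axis=0, return_counts=True)
    for (r, g, b), c in zip(cells, counts):
        if c >= min_count:
            table[r, g, b] = True
    return table

def classify(table, pixels):
    """Return a boolean mask: True where a pixel falls in a skin cell."""
    idx = (np.asarray(pixels) * BINS // 256).astype(int)
    return table[idx[:, 0], idx[:, 1], idx[:, 2]]
```

In a tracker, a mask like this would restrict the optical-flow feature search to skin-colored regions; the actual training and thresholding details of the published system may differ.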
Here is the most recent output for some of the data sets. The reason it differs from the results linked above is that those results are stable, while the results below reflect our most recent ideas for improving tracking.
Richard Russo, Mubarak Shah, and Niels Lobo, "Monitoring Head/Eye Motion for Driver Alertness with One Camera," Fifteenth IEEE International Conference on Pattern Recognition (ICPR), Barcelona, Spain, September 3-8, 2000.
A PowerPoint presentation is available here.