Open Access

Robust, Real-Time 3D Face Tracking from a Monocular View

EURASIP Journal on Image and Video Processing 2010, 2010:183605

DOI: 10.1155/2010/183605

Received: 15 March 2010

Accepted: 4 October 2010

Published: 14 October 2010


This paper addresses the problem of 3D face tracking from a monocular view. Dominant tracking algorithms in the current literature can be classified as intensity-based or feature-based methods. Intensity-based methods track 3D faces using the brightness constraint, assuming constant intensity of the face across adjacent frames. Feature-based trackers use local 2D features to determine sparse pairs of corresponding points between two frames and estimate the 3D pose from these correspondences. We argue that using either approach alone neglects valuable visual information exploited by the other method. We therefore propose a novel hybrid tracking approach that integrates multiple visual cues. The hybrid tracker uses a nonlinear optimization framework to incorporate both feature correspondence and brightness constraints, and achieves reliable 3D face tracking in real time. We conduct a series of experiments to analyze our approach and compare its performance with other state-of-the-art trackers. The experiments consist of synthetic sequences with simulated environmental factors and real-world sequences with estimated ground truth. Results show that the hybrid tracker is superior in both accuracy and robustness, particularly under challenging conditions such as occlusion and extreme lighting. We close with a description of a real-world human-computer interaction application based on our hybrid tracker.
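To illustrate the idea of combining the two cues in one nonlinear least-squares problem, the following is a minimal sketch, not the authors' implementation: it estimates a toy 2D similarity warp (stand-in for the 3D pose) by jointly minimizing sparse feature-correspondence residuals and dense brightness-constancy residuals. All names (`pose`, `LAMBDA_IB`, the motion model, the synthetic template) are assumptions made for this illustration.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)

# Toy "face" template: a smooth intensity surface on a 32x32 grid.
H = W = 32
ys, xs = np.mgrid[0:H, 0:W].astype(float)
template = np.sin(xs / 4.0) + np.cos(ys / 5.0)

def warp(points, pose):
    """Apply a 2D rigid warp (tx, ty, theta) to Nx2 (x, y) points."""
    tx, ty, th = pose
    c, s = np.cos(th), np.sin(th)
    R = np.array([[c, -s], [s, c]])
    return points @ R.T + np.array([tx, ty])

def sample(img, pts):
    """Bilinear sampling of img at Nx2 (x, y) points."""
    x, y = pts[:, 0], pts[:, 1]
    x0 = np.clip(np.floor(x).astype(int), 0, W - 2)
    y0 = np.clip(np.floor(y).astype(int), 0, H - 2)
    fx, fy = x - x0, y - y0
    return ((1 - fx) * (1 - fy) * img[y0, x0] + fx * (1 - fy) * img[y0, x0 + 1]
            + (1 - fx) * fy * img[y0 + 1, x0] + fx * fy * img[y0 + 1, x0 + 1])

# Ground-truth motion between the two frames.
true_pose = np.array([1.5, -0.8, 0.05])

# Render the "current frame" by inverse-warping the template.
tx, ty, th = true_pose
c, s = np.cos(th), np.sin(th)
R_true = np.array([[c, -s], [s, c]])
pix = np.stack([xs.ravel(), ys.ravel()], axis=1)
frame = sample(template, (pix - np.array([tx, ty])) @ R_true).reshape(H, W)

# Sparse cue: noisy feature correspondences (template points -> frame points).
model_pts = rng.uniform(8.0, 24.0, size=(15, 2))
obs_pts = warp(model_pts, true_pose) + 0.05 * rng.standard_normal((15, 2))

# Dense cue: brightness constancy at interior sample points.
dense_pts = rng.uniform(8.0, 24.0, size=(80, 2))
template_int = sample(template, dense_pts)
LAMBDA_IB = 0.1  # relative weight of the brightness term (assumed)

def residuals(pose):
    # Feature-correspondence residuals (2 per point).
    r_feat = (warp(model_pts, pose) - obs_pts).ravel()
    # Brightness-constancy residuals (1 per sample point).
    r_ib = LAMBDA_IB * (sample(frame, warp(dense_pts, pose)) - template_int)
    return np.concatenate([r_feat, r_ib])

est_pose = least_squares(residuals, np.zeros(3)).x
print("estimated pose:", est_pose)  # close to true_pose
```

In the paper's setting the unknowns are the full 3D head pose and the residuals come from reprojected 3D model points and per-pixel intensity differences, but the structure of the joint cost is the same: stacking both residual types lets a single solver balance the sparse and dense cues.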


Authors’ Affiliations

Department of Computer Science, University of Southern California


© Wei-Kai Liao et al. 2010

This article is published under license to BioMed Central Ltd. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.