Handling Occlusions for Robust Augmented Reality Systems
© Madjid Maidi et al. 2010
- Received: 31 July 2009
- Accepted: 31 March 2010
- Published: 10 May 2010
In Augmented Reality applications, human perception is enhanced with computer-generated graphics. These graphics must be exactly registered to real objects in the scene, which requires an effective Augmented Reality system to track the user's viewpoint. In this paper, a robust tracking algorithm based on coded fiducials is presented. Square targets are identified and pose parameters are computed using a hybrid approach that combines a direct method with a Kalman filter. An important factor in providing a robust Augmented Reality system is the correct handling of target occlusions by real scene elements. To overcome tracking failures due to occlusions, we extend our method with an optical flow approach that tracks visible points and maintains the virtual graphics overlay when targets are not identified. Our proposed real-time algorithm is tested from different camera viewpoints under various image conditions and proves to be accurate and robust.
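The hybrid pose estimation described above combines a direct method with a Kalman filter, which smooths the pose parameters between detections and can bridge frames in which the fiducial is occluded. As a rough illustration only (not the authors' implementation; the constant-velocity model, noise variances, and class name `ScalarKalman` are all assumptions), a minimal filter for a single pose parameter might look like this:

```python
# Hypothetical sketch: a constant-velocity Kalman filter smoothing ONE
# pose parameter (e.g. a translation component). Illustrative only;
# noise values q and r are arbitrary assumptions, not from the paper.

class ScalarKalman:
    """Tracks position x and velocity v of a single pose parameter."""

    def __init__(self, q=1e-3, r=1e-1):
        self.x, self.v = 0.0, 0.0          # state: position, velocity
        self.P = [[1.0, 0.0], [0.0, 1.0]]  # 2x2 state covariance
        self.q = q                          # process noise variance
        self.r = r                          # measurement noise variance

    def predict(self, dt):
        """Propagate state with F = [[1, dt], [0, 1]] (constant velocity)."""
        self.x += self.v * dt
        p00, p01 = self.P[0]
        p10, p11 = self.P[1]
        # P' = F P F^T + Q (Q simplified to q on the diagonal)
        self.P = [
            [p00 + dt * (p10 + p01) + dt * dt * p11 + self.q,
             p01 + dt * p11],
            [p10 + dt * p11, p11 + self.q],
        ]

    def update(self, z):
        """Fuse a position measurement z (H = [1, 0])."""
        s = self.P[0][0] + self.r          # innovation covariance
        k0 = self.P[0][0] / s              # Kalman gain, position row
        k1 = self.P[1][0] / s              # Kalman gain, velocity row
        y = z - self.x                     # innovation
        self.x += k0 * y
        self.v += k1 * y
        p00, p01 = self.P[0]
        p10, p11 = self.P[1]
        # P' = (I - K H) P
        self.P = [
            [(1 - k0) * p00, (1 - k0) * p01],
            [p10 - k1 * p00, p11 - k1 * p01],
        ]
```

During occlusion, one would call `predict` every frame without `update`, letting the filtered state carry the overlay until the target is re-identified; the actual system additionally tracks visible points with optical flow.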
- Kalman Filter
- Optical Flow
- Augmented Reality
- Real Scene
- Robust Tracking
This article is published under license to BioMed Central Ltd. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.