Open Access

Models for Gaze Tracking Systems

EURASIP Journal on Image and Video Processing 2007, 2007:023570

https://doi.org/10.1155/2007/23570

Received: 2 January 2007

Accepted: 23 August 2007

Published: 30 October 2007

Abstract

One of the most confusing aspects facing a newcomer to gaze tracking technology is the wide variety of available systems that, with very different hardware, all solve the same problem: determining the point the subject is looking at. The calibration process generally allows nonintrusive trackers based on quite different hardware and image features to be adjusted to the subject. The drawback of this simple procedure is that, although it makes the system work properly, it does so at the expense of any control over the intrinsic behavior of the tracker. The objective of this article is to overcome this obstacle by exploring more deeply the elements of a video-oculographic system (eye, camera, lighting, and so forth) from a purely mathematical and geometrical point of view. The main contribution is to determine the minimum number of hardware elements and image features needed to compute the point the subject is looking at. A model based on pupil contour and multiple lighting has been constructed and successfully tested with real subjects. In addition, the theoretical aspects of video-oculographic systems have been thoroughly reviewed in order to build a theoretical basis for further studies.
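The calibration procedure discussed in the abstract is, in many video-oculographic trackers, a regression from an image feature (commonly the pupil-glint vector) to screen coordinates. The sketch below is purely illustrative and is not the model proposed in this article: it assumes a generic second-order polynomial mapping fitted by least squares, with hypothetical function names.

```python
import numpy as np

def fit_gaze_mapping(pg_vectors, screen_points):
    """Fit a second-order polynomial mapping from pupil-glint vectors
    (vx, vy) to screen coordinates (sx, sy) by linear least squares.
    This is a generic calibration sketch, not the authors' model."""
    v = np.asarray(pg_vectors, dtype=float)
    s = np.asarray(screen_points, dtype=float)
    vx, vy = v[:, 0], v[:, 1]
    # Design matrix with quadratic terms: 1, vx, vy, vx*vy, vx^2, vy^2
    A = np.column_stack([np.ones_like(vx), vx, vy, vx * vy, vx**2, vy**2])
    coeffs, *_ = np.linalg.lstsq(A, s, rcond=None)
    return coeffs  # shape (6, 2): one coefficient column per screen axis

def estimate_gaze(coeffs, pg_vector):
    """Map a single pupil-glint vector to an estimated screen point."""
    vx, vy = pg_vector
    a = np.array([1.0, vx, vy, vx * vy, vx**2, vy**2])
    return a @ coeffs
```

Such a mapping works after calibration but, as the abstract notes, gives no insight into the geometry relating eye, camera, and lighting; the article's goal is precisely to replace this black-box adjustment with an explicit geometric model.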


Authors’ Affiliations

(1)
Electronic and Electrical Engineering Department, Public University of Navarra

References

  1. Jacob R: The use of eye movements in human-computer interaction techniques: what you look at is what you get. ACM Transactions on Information Systems 1991, 9(2): 152-169. doi:10.1145/123078.128728
  2. Kohzuki K, Nishiki T, Tsubokura A, Ueno M, Harima S, Tsushima K: Man-machine interaction using eye movement. Proceedings of the 8th International Conference on Human-Computer Interaction: Ergonomics and User Interfaces (HCI '99), August 1999, Munich, Germany, 1: 407-411.
  3. Teiwes W, Bachofer M, Edwards GW, Marshall S, Schmidt E, Teiwes W: The use of eye tracking for human-computer interaction research and usability testing. Proceedings of the 8th International Conference on Human-Computer Interaction: Ergonomics and User Interfaces (HCI '99), August 1999, Munich, Germany, 1: 1119-1122.
  4. Merchant J, Morrissette R, Porterfield JL: Remote measurement of eye direction allowing subject motion over one cubic foot of space. IEEE Transactions on Biomedical Engineering 1974, 21(4): 309-317. doi:10.1109/TBME.1974.324318
  5. LC Technologies: Eyegaze Systems. McLean, Va, USA. http://www.eyegaze.com/
  6. Applied Science Laboratories: Bedford, Mass, USA. http://www.a-s-l.com/
  7. Eyetech Digital Systems: Mesa, Ariz, USA. http://www.eyetechds.com/
  8. Tobii Technology: Stockholm, Sweden. http://www.tobii.se/
  9. Tomono A, Iida M, Kobayashi Y: A TV camera system which extracts feature points for noncontact eye-movements detection. Optics, Illumination and Image Sensing for Machine Vision IV, November 1989, Philadelphia, Pa, USA. Proceedings of SPIE 1194: 2-12.
  10. Yoo DH, Chung MJ: Non-intrusive eye gaze estimation without knowledge of eye pose. Proceedings of the 6th IEEE International Conference on Automatic Face and Gesture Recognition (FGR '04), May 2004, Seoul, Korea, 785-790.
  11. Shih S-W, Liu J: A novel approach to 3-D gaze tracking using stereo cameras. IEEE Transactions on Systems, Man, and Cybernetics, Part B 2004, 34(1): 234-245. doi:10.1109/TSMCB.2003.811128
  12. Zhu Z, Ji Q: Eye gaze tracking under natural head movements. Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR '05), June 2005, San Diego, Calif, USA, 1: 918-923.
  13. Beymer D, Flickner M: Eye gaze tracking using an active stereo head. Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR '03), June 2003, Madison, Wis, USA, 2: 451-458.
  14. Brolly XLC, Mulligan JB: Implicit calibration of a remote gaze tracker. Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops (CVPR '04), June-July 2004, Washington, DC, USA, 134.
  15. Ohno T, Mukawa N: A free-head, simple calibration, gaze tracking system that enables gaze-based interaction. Proceedings of the Symposium on Eye Tracking Research & Applications (ETRA '04), March 2004, San Antonio, Tex, USA, 115-122.
  16. Hansen DW, Pece AEC: Eye tracking in the wild. Computer Vision and Image Understanding 2005, 98(1): 155-181. doi:10.1016/j.cviu.2004.07.013
  17. Wang J-G, Sung E, Venkateswarlu R: Eye gaze estimation from a single image of one eye. Proceedings of the 9th IEEE International Conference on Computer Vision (ICCV '03), October 2003, Nice, France, 1: 136-143.
  18. Villanueva A: Mathematical models for video oculography. Ph.D. thesis, Public University of Navarra, Pamplona, Spain; 2005.
  19. Hennessey C, Noureddin B, Lawrence P: A single camera eye-gaze tracking system with free head motion. Proceedings of the Symposium on Eye Tracking Research & Applications (ETRA '05), March 2005, San Diego, Calif, USA, 87-94.
  20. Guestrin ED, Eizenman M: General theory of remote gaze estimation using the pupil center and corneal reflections. IEEE Transactions on Biomedical Engineering 2006, 53(6): 1124-1133. doi:10.1109/TBME.2005.863952
  21. Carpenter RHS: Movements of the Eyes. Pion, London, UK; 1988.
  22. Fry G, Treleaven C, Walsh R, Higgins E, Radde C: Definition and measurement of torsion. American Journal of Optometry and Archives of American Academy of Optometry 1947, 24: 329-334.
  23. Mimica MRM, Morimoto CH: A computer vision framework for eye gaze tracking. Proceedings of the XVI Brazilian Symposium on Computer Graphics and Image Processing (SIBGRAPI '03), October 2003, São Carlos, Brazil, 406-412.
  24. Morimoto CH, Mimica MRM: Eye gaze tracking techniques for interactive applications. Computer Vision and Image Understanding 2005, 98(1): 4-24. doi:10.1016/j.cviu.2004.07.010
  25. Hartley RI, Zisserman A: Multiple View Geometry in Computer Vision. 2nd edition. Cambridge University Press, Cambridge, UK; 2004.
  26. Montesdeoca A: Geometría Proyectiva, Cónicas y Cuádricas [Projective Geometry, Conics and Quadrics]. Dirección General de Universidades e Investigación, Consejería de Educación, Cultura y Deportes, Gobierno de Canarias, Spain; 2001.
  27. Ohno T, Mukawa N, Yoshikawa A: FreeGaze: a gaze tracking system for everyday gaze interaction. Proceedings of the Symposium on Eye Tracking Research & Applications (ETRA '02), March 2002, New Orleans, La, USA, 125-132.
  28. Gonzalez RC, Woods RE: Digital Image Processing. 2nd edition. Prentice-Hall, Upper Saddle River, NJ, USA; 2002.
  29. Bouguet J-Y: Camera Calibration Toolbox for Matlab. 2004. http://www.vision.caltech.edu/bouguetj/calib_doc/
  30. Goñi S, Echeto J, Villanueva A, Cabeza R: Robust algorithm for pupil-glint vector detection in a video-oculography eyetracking system. Proceedings of the 17th International Conference on Pattern Recognition (ICPR '04), August 2004, Cambridge, UK, 4: 941-944.
  31. Boehm W, Prautzsch H: Geometric Concepts for Geometric Design. A K Peters, Wellesley, Mass, USA; 1994.

Copyright

© A. Villanueva and R. Cabeza 2007

This article is published under license to BioMed Central Ltd. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.