Models for Gaze Tracking Systems

Abstract

One of the most confusing aspects encountered when first approaching gaze tracking technology is the wide variety of available systems, in terms of hardware, that solve the same problem: determining the point the subject is looking at. The calibration process generally allows nonintrusive trackers, based on quite different hardware and image features, to be adjusted to the subject. The drawback of this simple procedure is that, although it lets the system work properly, it does so at the expense of control over the intrinsic behavior of the tracker. The objective of this article is to overcome this obstacle and explore more deeply the elements of a video-oculographic system, that is, eye, camera, lighting, and so forth, from a purely mathematical and geometrical point of view. The main contribution is to determine the minimum number of hardware elements and image features needed to estimate the point the subject is looking at. A model based on pupil contour and multiple lighting has been constructed and successfully tested with real subjects. In addition, the theoretical aspects of video-oculographic systems have been thoroughly reviewed in order to build a theoretical basis for further studies.
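The calibration the abstract refers to is, in many video-oculographic trackers, a regression from an image feature (commonly the pupil-center-to-glint vector) to screen coordinates. As a hedged illustration only, assuming a simple affine pupil-glint mapping fitted to synthetic data, and not the pupil-contour model the article actually derives, such a calibration fit can be sketched in Python:

```python
def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]  # augmented matrix
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit_mapping(vectors, screen_pts):
    """Least-squares fit of sx = a0 + a1*dx + a2*dy (and likewise for sy)
    from calibration pairs of pupil-glint vectors and known screen points."""
    X = [[1.0, dx, dy] for (dx, dy) in vectors]
    coeffs = []
    for axis in (0, 1):
        # Normal equations: (X^T X) a = X^T y
        XtX = [[sum(row[i] * row[j] for row in X) for j in range(3)]
               for i in range(3)]
        Xty = [sum(X[k][i] * screen_pts[k][axis] for k in range(len(X)))
               for i in range(3)]
        coeffs.append(solve(XtX, Xty))
    return coeffs

def map_gaze(coeffs, vector):
    """Map a new pupil-glint vector to an estimated screen point."""
    feats = [1.0, vector[0], vector[1]]
    return tuple(sum(a * f for a, f in zip(c, feats)) for c in coeffs)

# Synthetic 9-point calibration grid (hypothetical ground truth:
# sx = 100 + 30*dx + 5*dy,  sy = 80 + 4*dx + 28*dy).
vectors = [(dx, dy) for dx in (-2.0, 0.0, 2.0) for dy in (-2.0, 0.0, 2.0)]
screen = [(100 + 30 * dx + 5 * dy, 80 + 4 * dx + 28 * dy)
          for dx, dy in vectors]
coeffs = fit_mapping(vectors, screen)
```

In practice higher-order polynomial terms are usually added and the coefficients are fitted from real calibration fixations; the affine form above is only the simplest instance, and it is precisely this black-box regression, which hides the tracker's intrinsic geometry, that the article's explicit model seeks to replace.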

Author information

Correspondence to Arantxa Villanueva.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 2.0 Generic License (https://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

About this article

Keywords

  • Pattern Recognition
  • Expense
  • Computer Vision
  • Image Feature
  • Theoretical Basis