
Eye gaze in virtual environments: evaluating the need and initial work on implementation

Murray, N., Roberts, D., Steed, A., Sharkey, P., Dickerson, P., Rae, J. and Wolff, R. (2009) Eye gaze in virtual environments: evaluating the need and initial work on implementation. Concurrency and Computation: Practice & Experience, 21 (11). pp. 1437-1449. ISSN 1532-0626

Full text not archived in this repository.


DOI: 10.1002/cpe.1396


Eye gaze is seen as critical for efficient interaction between collaborating participants. Video conferencing either does not attempt to support eye gaze (e.g. AccessGrid) or only approximates it in round-table conditions (e.g. life-size telepresence). Immersive collaborative virtual environments represent remote participants through avatars that follow their tracked movements. By additionally tracking people's eyes and representing their movement on their avatars, the line of gaze can be faithfully reproduced rather than approximated. This paper presents the results of initial work that tested whether the focus of gaze could be more accurately gauged if tracked eye movement was added to that of the head of an avatar observed in an immersive VE. An experiment was conducted to assess the difference between users' abilities to judge which objects an avatar is looking at when only head movements are displayed, with the eyes remaining static, and when both eye gaze and head movement information are displayed. The results show that eye gaze is of vital importance to subjects correctly identifying what a person is looking at in an immersive virtual environment. This is followed by a description of the work now being undertaken following the positive results of the experiment. We discuss the integration of an eye tracker more suitable for immersive mobile use, and the software and techniques developed to integrate the user's real-world eye movements into calibrated eye gaze in an immersive virtual world. This is to be used in the creation of an immersive collaborative virtual environment supporting eye gaze and in its ongoing experiments. Copyright (C) 2009 John Wiley & Sons, Ltd.
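The abstract describes combining tracked head movement with eye-in-head movement to reproduce a line of gaze on an avatar. The paper does not give its implementation; the sketch below is a hypothetical, simplified illustration of the idea, assuming gaze can be approximated by additively composing head and eye yaw/pitch angles (reasonable for the small eye rotations typical of tracked gaze) and an assumed coordinate convention of +z forward, +x right, +y up. The function name and signature are illustrative, not from the paper.

```python
import math

def gaze_direction(head_yaw, head_pitch, eye_yaw, eye_pitch):
    """Combine head orientation with eye-in-head angles (all in
    radians) into a world-space gaze direction as a unit vector.

    Simplified sketch: yaw and pitch are composed additively,
    which is a fair approximation for small eye rotations.
    Convention (assumed): +z forward, +x right, +y up.
    """
    yaw = head_yaw + eye_yaw
    pitch = head_pitch + eye_pitch
    x = math.sin(yaw) * math.cos(pitch)
    y = math.sin(pitch)
    z = math.cos(yaw) * math.cos(pitch)
    return (x, y, z)

# With head and eyes straight ahead, gaze points along +z;
# adding eye yaw rotates the gaze vector beyond head orientation alone.
forward = gaze_direction(0.0, 0.0, 0.0, 0.0)
offset = gaze_direction(0.0, 0.0, math.radians(15), 0.0)
```

A full system would instead calibrate the tracker's raw output to each user (mapping pupil positions to angles) and drive the avatar's eye joints with the resulting rotations, but the additive composition above captures why eye data changes the rendered line of gaze relative to head tracking alone.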

Item Type: Article
ID Code: 15343
Uncontrolled Keywords: virtual environment, eye tracking, evaluation
Additional Information: 11th IEEE International Symposium on Distributed Simulation and Real-Time Applications, 22-24 October 2007, Chania, Greece

