
Communicating eye gaze across a distance without rooting participants to the spot

Wolff, R., Roberts, D.J., Murgia, A., Murray, N., Rae, J., Steptoe, W., Steed, A. and Sharkey, P.M. (2008) Communicating eye gaze across a distance without rooting participants to the spot. In: 12th IEEE/ACM International Symposium on Distributed Simulation and Real-Time Applications (DS-RT 2008), Vancouver, Canada.

Full text not archived in this repository.

It is advisable to refer to the publisher's version if you intend to cite from this work.

DOI: 10.1109/DS-RT.2008.28


Eye gaze is an important conversational resource that until now could only be supported across a distance if people were rooted to the spot. We introduce EyeCVE, the world's first tele-presence system that allows people in different physical locations not only to see what each other are doing but to follow each other's eyes, even when walking about. Projected into each space are avatar representations of remote participants that reproduce not only body, head and hand movements, but also those of the eyes. Spatial and temporal alignment of remote spaces allows the focus of gaze, as well as activity and gesture, to be used as a resource for non-verbal communication. The temporal challenge met was to reproduce eye movements quickly and often enough for their focus to be interpreted during a multi-way interaction, alongside other verbal and non-verbal communication. The spatial challenge met was to maintain communicational eye gaze while allowing free movement of participants within a virtually shared common frame of reference. This paper reports on the technical, and especially temporal, characteristics of the system.
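The paper does not publish source code, but the two challenges the abstract names can be illustrated with a small hypothetical sketch: gaze samples must be mapped from each participant's tracker-local coordinates into the shared frame of reference (the spatial challenge), and captured at a rate high enough for remote observers to interpret where the eyes are looking (the temporal challenge). All names, the rotation convention, and the sample rate below are illustrative assumptions, not details from the system itself.

```python
import math
from dataclasses import dataclass

@dataclass
class GazeSample:
    """One gaze update streamed to remote sites (hypothetical format)."""
    t: float                              # capture timestamp, seconds
    origin: tuple[float, float, float]    # eye position, shared frame
    direction: tuple[float, float, float] # unit gaze vector, shared frame

def to_shared_frame(local_dir, yaw_rad):
    """Rotate a tracker-local gaze vector about the vertical (y) axis by the
    participant's heading, giving a direction in the shared frame of
    reference. A real system would apply a full 6-DoF pose, not just yaw."""
    x, y, z = local_dir
    c, s = math.cos(yaw_rad), math.sin(yaw_rad)
    return (c * x + s * z, y, -s * x + c * z)

def sample_times(rate_hz, duration_s):
    """Capture timestamps for a stream at the given rate. The temporal
    requirement is that updates arrive often enough for gaze focus to be
    interpreted; the 60 Hz used here is an assumption for illustration."""
    n = int(rate_hz * duration_s)
    return [i / rate_hz for i in range(n)]
```

For example, a participant facing 90 degrees to the right who looks straight ahead in tracker coordinates, `to_shared_frame((1.0, 0.0, 0.0), math.pi / 2)`, yields a gaze direction along the shared frame's negative z axis, so remote avatars can render the eyes pointing at the correct shared-space target.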

Item Type: Conference or Workshop Item (Paper)
ID Code: 14929
Uncontrolled Keywords: biology computing, eye, virtual reality, EyeCVE, avatar representations, eye gaze, eye movements, nonverbal communication, physical locations, tele-presence system
Additional Information: Nominated for best paper award

