Automated identification of areas of interest in dynamic head-mounted display videos for virtual reality forensic psychology applications
Högväg, J., Andersson, S., Nyman, T. J.
Abstract

This paper introduces a novel software solution to address the challenges of automated identification of Areas of Interest (AOIs) in dynamic, head-mounted display virtual reality (VR) environments, with a focus on applications in forensic psychology. Traditional eye-tracking tools often require manual annotation of AOIs when analyzing moving objects, such as faces, in dynamic 360-degree VR scenarios, a process that is time-intensive. The presented software uses RetinaFace to generate consistent AOIs by dynamically tracking facial coordinates across video frames, accounting for variability in head movements. Outputs are formatted for analysis in iMotions, enabling robust synchronization and visualization of gaze data. In a pilot study with 16 participants, validation against manual annotation confirmed the system's high accuracy (92.2%). By automating a previously manual process, the software provides researchers with an efficient and scalable tool for analyzing complex visual attention data, significantly enhancing the feasibility of large-scale VR studies in forensic and psychological research. An empirical example illustrates how the software can be used.
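The core idea of turning per-frame face detections into a consistent AOI track can be illustrated with a minimal, pure-Python sketch. This is not the authors' implementation: the box format `(x1, y1, x2, y2)`, the interpolation of missed detections, and the jitter-smoothing step are all illustrative assumptions about how per-frame RetinaFace bounding boxes might be stabilized across a video before export.

```python
# Illustrative sketch only (assumed data format): stabilizes per-frame face
# bounding boxes into a continuous AOI track. Frames where detection failed
# are represented as None and filled by linear interpolation; a moving
# average then reduces frame-to-frame jitter. The published software instead
# consumes real RetinaFace output and writes iMotions-compatible files.

def interpolate_track(boxes):
    """Fill None entries (missed detections) by linear interpolation
    between the nearest detected frames; copy endpoints at the edges."""
    filled = list(boxes)
    n = len(filled)
    for i in range(n):
        if filled[i] is not None:
            continue
        prev = next((j for j in range(i - 1, -1, -1) if filled[j] is not None), None)
        nxt = next((j for j in range(i + 1, n) if filled[j] is not None), None)
        if prev is None and nxt is None:
            continue  # no detections at all; leave as None
        if prev is None:
            filled[i] = filled[nxt]
        elif nxt is None:
            filled[i] = filled[prev]
        else:
            t = (i - prev) / (nxt - prev)
            filled[i] = tuple(a + t * (b - a)
                              for a, b in zip(filled[prev], filled[nxt]))
    return filled

def smooth_track(boxes, window=3):
    """Moving-average smoothing over a small window of neighboring frames,
    so the AOI does not jitter with small detection noise."""
    out = []
    for i in range(len(boxes)):
        lo, hi = max(0, i - window // 2), min(len(boxes), i + window // 2 + 1)
        neighborhood = [b for b in boxes[lo:hi] if b is not None]
        out.append(tuple(sum(c) / len(neighborhood) for c in zip(*neighborhood)))
    return out

# Hypothetical three-frame track with one missed detection in the middle.
track = interpolate_track([(10, 10, 50, 50), None, (14, 12, 54, 52)])
# → the missing middle frame becomes (12.0, 11.0, 52.0, 51.0)
stable = smooth_track(track)
```

In a real pipeline, the interpolated and smoothed boxes would then be written out per frame in the coordinate format the analysis tool (here, iMotions) expects for dynamic AOIs.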