AUTHOR=Rolff Tim, Steinicke Frank, Frintrop Simone TITLE=Gaze Mapping for Immersive Virtual Environments Based on Image Retrieval JOURNAL=Frontiers in Virtual Reality VOLUME=3 YEAR=2022 URL=https://www.frontiersin.org/journals/virtual-reality/articles/10.3389/frvir.2022.802318 DOI=10.3389/frvir.2022.802318 ISSN=2673-4192 ABSTRACT=
In this paper, we introduce a novel gaze mapping approach for free viewing conditions in dynamic immersive virtual environments (VEs), which projects recorded eye fixation data of users who viewed the VE from different perspectives onto the current view. This generates eye fixation maps, which can serve as ground truth for training machine learning (ML) models to predict saliency and the user’s gaze in immersive virtual reality (VR) environments. We use a flexible image retrieval approach based on SIFT features, which can map the gaze even under strong viewpoint changes and dynamic scene changes. A vocabulary tree enables scaling to the large amounts of data, typically several hundred thousand frames, and a homography transform re-projects the fixations to the current view. To evaluate our approach, we measure the predictive quality of our eye fixation maps for modeling the gaze of the current user and compare our maps to computer-generated saliency maps on the
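Since the abstract only outlines the pipeline, the sketch below illustrates one plausible reading of the re-projection step: SIFT correspondences between a retrieved frame and the current view are used to estimate a RANSAC homography, which then maps a recorded fixation point into the current view. It uses standard OpenCV calls; all function names, variables, and thresholds are illustrative assumptions, not the authors' implementation, and the vocabulary-tree retrieval stage that selects the retrieved frame is not shown.

    # Sketch (assumed, not the paper's code): map a fixation from a retrieved
    # frame into the current view via SIFT matching and a RANSAC homography.
    import cv2
    import numpy as np

    def map_fixation(retrieved_frame, current_frame, fixation_xy):
        """Project a fixation point (x, y) from a retrieved frame into the current view."""
        sift = cv2.SIFT_create()
        kp1, des1 = sift.detectAndCompute(retrieved_frame, None)
        kp2, des2 = sift.detectAndCompute(current_frame, None)

        # Lowe ratio-test matching between the two frames.
        matcher = cv2.BFMatcher(cv2.NORM_L2)
        matches = matcher.knnMatch(des1, des2, k=2)
        good = [m for m, n in matches if m.distance < 0.75 * n.distance]
        if len(good) < 4:
            return None  # too few correspondences to estimate a homography

        src = np.float32([kp1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
        dst = np.float32([kp2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
        H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
        if H is None:
            return None

        # Re-project the recorded fixation coordinates into the current view.
        pt = np.float32([[fixation_xy]])  # shape (1, 1, 2)
        return cv2.perspectiveTransform(pt, H)[0, 0]

Accumulating such re-projected fixations from many users and frames would yield the per-view fixation maps described above.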