The goal of this research topic is to bring together different perspectives that address research questions related to human spatial perception, cognition, and behaviour in extended reality (XR). In recent years, XR technologies such as virtual and mixed reality have been widely used as tools to enable human interaction with environments and to study human spatial perception, cognition, representation, and behaviour. These topics also intersect in important ways: the interfaces that enable the exploration of XR environments require an appropriate understanding of how humans perceive, represent, and move through space. XR environments offer significant benefits in experimental control, ecological validity, and options to track behaviour and biometric responses. However, experiences enabled by XR technologies, especially VR, are limited by which perceptual cues can be provided and may differ from the rich sensory experiences available to individuals in their daily life. This raises concerns that XR visualization and interaction techniques may affect, or even interfere with, the perception and processing of spatial information. On the other hand, XR technologies are highly flexible and programmable, enabling experimental manipulations that would be difficult, if not impossible, to realize in real-world settings. People in XR environments are not bound by the constraints of physical reality, such as Euclidean geometry, object size, or gravity. It is critical to investigate the design choices that make the experience of being in and interacting with XR environments uniquely different from real-world experiences, and to understand how such novel methods may change traditional notions of spatial cognition.
The ever-increasing quality and affordability of XR technologies enable us to mimic real-world situations more closely, or to go beyond them as desired, opening up unprecedented opportunities for both research and development. Consequently, this research topic seeks to highlight both work that advances theoretical questions of spatial cognition (in real and XR environments) and work that addresses more applied aspects of spatial perception, cognition, and behaviour in XR.
We welcome submissions of a wide range of article types, including original research, review, theory, method, and perspective papers, on topics including but not limited to:
• Wayfinding and navigation using XR interfaces
• Psychological and environmental factors that influence mental representations of the mediated space
• R&D on using/designing XR systems/hardware/software/interfaces for spatial cognition research
• Designing and evaluating 3DUI and locomotion/navigation interfaces
• Comparison of XR spatial cognitive effects to those in the real world to address transferability
• Spatial perception and representation in real and XR environments
• Spatial orientation and updating in XR
• Spatial learning and memory in XR
• Neural representations of mediated space
• Evaluating spatial cognition and perception in XR using physiological measures (e.g., eye-tracking, neuroimaging)
• Interactive and/or immersive geographic visualizations