
REVIEW article

Front. Syst. Neurosci., 17 November 2022
This article is part of the Research Topic “Naturalistic Neuroscience – Towards a Full Cycle from Lab to Field”

Naturalistic neuroscience and virtual reality

  • 1Faculty of Biology, Ludwig-Maximilians-Universität München, Munich, Germany
  • 2Bernstein Center for Computational Neuroscience Munich, Munich, Germany

Virtual reality (VR) is one of the techniques that has become particularly popular in neuroscience over the past few decades. VR experiments feature a closed loop between sensory stimulation and behavior: participants interact with the stimuli rather than just passively perceiving them. Several senses can be stimulated at once, and large-scale environments as well as social interactions can be simulated. All of this makes VR experiences more natural than those in traditional lab paradigms. Compared to the situation in field research, a VR simulation is highly controllable and reproducible, as required of a laboratory technique used in the search for neural correlates of perception and behavior. VR is therefore considered a middle ground between ecological validity and experimental control. In this review, I explore the potential of VR for eliciting naturalistic perception and behavior in humans and non-human animals. In this context, I give an overview of recent virtual reality approaches used in neuroscientific research.

1. Introduction

Arguably the most important feature of natural behavior is active exploration and interrogation of the environment (Gottlieb and Oudeyer, 2018). External stimuli are not passively perceived: what is paid attention to is selected and specifically probed, reflecting the animal's motivations and needs. Moreover, natural environmental features and sensory cues are dynamic, multimodal, and complex (Sonkusare et al., 2019). This is in stark contrast to laboratory settings, which are characterized by numerous repetitions of the same imposed stimuli. These stimuli are often directed at only a single sense under simplified, artificial conditions and are disconnected from the animal's responses. Repetitions are important for behavioral modeling and the search for neural correlates and mechanisms, both of which rely on trial-based averaging. It is nevertheless not surprising that results from laboratory experiments are of limited ecological validity and may not reveal the neural mechanisms underlying natural behavior (Krakauer et al., 2017; Dennis et al., 2021). Virtual reality (VR) may be part of a solution to this problem.

With VR, an artificial environment is simulated in which the user's actions determine the sensory stimulation, closing the loop between stimulation, perception, and action. A major motivation for the application of VR in neurophysiology is the desire to test behavior while recording with apparatuses that cannot be easily carried by the test subject or that require stability that cannot be achieved during free movement. The potential of VR, however, lies beyond this simple wish to fixate a behaving subject in place.

In connection with scientific VR use, terms like ecological and ethological validity as well as naturalistic conditions are frequently invoked. In this regard, VR is considered superior to traditional laboratory methods while maintaining a similar level of experimental control (e.g., Bohil et al., 2011; Parsons, 2015; Minderer et al., 2016; Krakauer et al., 2017; Lenormand and Piolino, 2022).

In the present article, I explore the potential of VR for evoking naturalistic perception and behavior and for promoting the understanding of underlying brain function. I will give an overview of current VR technologies applied in neuroscience and use cases across different species, motivate why they are used, and evaluate them in view of naturalistic neuroscience. To begin, let us briefly address the question: what is VR?

2. What is VR?

In his book, LaValle (2020) defines VR as: “Inducing targeted behavior in an organism by using artificial sensory stimulation, while the organism has little or no awareness of the interference.” This definition is quite broad, but that makes it flexible enough to embrace a variety of approaches, including those relevant to the present article. In a neuroscientific VR experiment, the participant experiences the stimulation of one or more senses to create the illusion of a “reality” intended by the researcher. Limited awareness seems less crucial for neuroscientific VR applications. However, one can argue that limited awareness is important for the feeling of presence in the artificial world, which as a result is treated as being natural—a basis for ecological validity.

What is missing from the above definition is that the virtual world is updated based on the user's behavior, providing an interactive experience (Bohil et al., 2011; Dombeck and Reiser, 2011; Naik et al., 2020). In terms of VR application in neuroscience, this narrowing of the definition is important because it distinguishes VR from simple sensory stimulation. The update is done in real time so that a closed loop is achieved between stimulation and behavior. For a real-time experience the update cycle needs to be sufficiently fast; how fast depends on the perceptual capabilities of the animal species and the sensory-motor system under investigation. Update delays can be increased parametrically depending on the research question. The most extreme case is the open loop, where the stimulation and the participant's actions are independent. Open loop corresponds to conventional stimulus conditions typically used in neuroscience studies.
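The relationship between closed loop, delayed feedback, and open loop can be made concrete with a small sketch. The following Python fragment is purely illustrative and not a description of any particular setup: the one-dimensional trial loop and its frame-based delay buffer are my own assumptions. Each frame, the participant's action updates the virtual position, optionally after a parametric delay; in open loop the action is discarded entirely.

```python
from collections import deque

def run_trial(actions, delay_frames=0, open_loop=False):
    """Simulate a 1-D VR trial: each frame, the participant's action
    (e.g., treadmill displacement) updates the virtual position.
    A FIFO buffer delays the update by `delay_frames` frames; in open
    loop the stimulus ignores behavior entirely."""
    position = 0.0
    pending = deque([0.0] * delay_frames)  # actions not yet applied
    trajectory = []
    for a in actions:
        pending.append(0.0 if open_loop else a)
        position += pending.popleft()      # apply the (possibly delayed) action
        trajectory.append(position)        # this is what would be rendered
    return trajectory
```

With `delay_frames=0` the rendered position tracks the cumulative actions immediately (closed loop); increasing the delay shifts the stimulus update relative to behavior, and `open_loop=True` decouples the two completely, mirroring the parametric manipulations described above.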

3. Why use VR? And why for naturalistic neuroscience?

Typical motivations for using VR revolve around three aspects: (1) multimodal stimulation with flexible and precise control, (2) interactivity instead of purely passive perception, and (3) the application of neural recording techniques that require particular mechanical stability. For naturalistic approaches, the first two points are the most important, but in a neuroscience context, the last is also relevant. I therefore discuss these three motivations next in view of their utility for naturalistic paradigms. Stimulus control and closed-loop methods have been steadily refined throughout the history of VR. For an account of animal VRs, and specifically of the rodent VRs used in research, the interested reader is referred to Thurley and Ayaz (2017) and Naik et al. (2020). A general history of VR can be found in LaValle (2020). Finally, in this section I address the issue of immersion, i.e., the ability of a VR to draw users in so that they feel present in it, which is closely related to achieving naturalistic conditions with VR.

3.1. VR provides flexible stimulus control

As a laboratory technique, VR benefits from the ability to perform experiments under precise control. This is what lets VR induce targeted behavior. Confounding and unintended influences, although not completely excluded, can be substantially reduced. VR is inherently flexible. It provides control over the complexity of the environment, such as its size or the positioning of landmarks. Space restrictions, which can be a problem in the laboratory, do not exist in VR. Features can be easily and quickly altered without the participant noticing. One may, e.g., add or remove certain cues and test their contribution to neural activity or behavior. The manipulations can be done systematically and without influencing other components of the environment (Powell and Rosenthal, 2017). Also, stimuli may be provided that are unavailable in nature, although this speaks against the naturalistic use focused on in the present article. All of the above is hard to achieve in the field, where it is often less obvious which cues are attended to and which information is leveraged and which is not; think, for instance, of investigating spatial navigation of primates in their natural habitat (De Lillo et al., 2014; Dolins et al., 2014, 2017; Allritz et al., 2022).

In the following, I will discuss paradigms of VR stimulus control utilized in specific areas of neuroscience research.

3.1.1. Spatial cognition and navigation

The most obvious neuroscience use of VR is in the study of spatial perception and navigation (Bohil et al., 2011; Thurley and Ayaz, 2017). However, the advantage of VR for this purpose has been questioned, since locomotion in traditional real-world laboratory paradigms, like foraging for food on linear tracks and in open-field boxes, is more natural than on a treadmill or similar apparatus (Minderer et al., 2016). In real arenas, information from the external world, e.g., from visual cues, and internally generated information, e.g., from moving body parts, are coordinated. In VR, these sources may not be aligned because of shortcomings in integrating simulation and tracking.

Such conflicts are likely the reason for the altered responses of space-encoding neurons in the rodent brain found in VR compared to real-world experiments. Head-fixed or body-fixed rodents do not receive normal vestibular input, resulting in mismatches between vestibular and visual information. Place cells in the hippocampus show altered position coding under such conditions, as confirmed by direct comparisons between virtual paradigms and their real-world counterparts (Chen et al., 2013; Ravassard et al., 2013; Aghajan et al., 2014). In VR setups that do not restrict body rotations and in which vestibular information about rotational movements is available to the animal, normal place-selective firing has been reported (Aronov and Tank, 2014; Chen et al., 2018; Haas et al., 2019). Freely-moving VRs may even better solve this problem (Del Grosso et al., 2017; Kaupert et al., 2017; Stowers et al., 2017; Madhav et al., 2022).

Thus, the design of a VR setup and the quality of a VR simulation may elicit atypical neural responses. These issues do not devalue certain VR systems—each setup may provide informative insights—but they are important indications that the suitability for understanding natural behavior and associated neural activity may be limited for some VR applications.

However, VR has advantages over traditional laboratory paradigms, which are themselves far from the situation animals face in the wild. These advantages depend on the research question. The possibility of simulating environments much larger than the available laboratory space can increase ecological validity (Dolins et al., 2017), e.g., for the study of spatial learning in macaques (Taillade et al., 2019) and chimpanzees (Allritz et al., 2022), and of fly search behavior (Kaushik et al., 2020). Specialized VR systems and paradigms can provide insights into specific topics (e.g., path integration; Petzschner and Glasauer, 2011; Lakshminarasimhan et al., 2018, 2020; Thurley and Schild, 2018; Jayakumar et al., 2019; Robinson and Wiener, 2021; Madhav et al., 2022). VR also enables task standardization for cross-species comparison. For instance, spatial behavior can be tested with humans in typical rodent laboratory mazes like the Morris water maze (Laczó et al., 2010). Moreover, VR helps to overcome the difficulties of testing spatial behavior and cognition in the wild, as pointed out above.

3.1.2. (Multi-)sensory processing

In VR, several senses can be stimulated at once and in concert. Such multimodal stimulation increases immersion and engagement. The experience will be more ecological and, if natural stimuli are used, more naturalistic. Already the first applications of VR, e.g., for studying sensory-motor control of flying in insects, combined visual, mechanosensory (wind source), and olfactory cues (Gray et al., 2002). In general, any VR method that connects locomotion with some type of sensory stimulation provides a multimodal experience, because it inevitably encompasses sensory feedback about self-motion. In this sense, even the most typical VR, combining visual stimulation with walking on a treadmill or tethered flight, is always multimodal.

In principle, unnatural stimuli may be given, or the stimulation of different senses may be mismatched, allowing for experiments that are not possible in the real world. Of course, this speaks against the naturalistic principle, but let me nevertheless give a few examples for illustration. VR makes it possible to decouple stimuli that are inextricably linked in the real world. In rodent experiments, visual inputs have been dissociated from non-visual self-motion inputs to probe their differential influences on spatial responses in the hippocampal formation (Chen et al., 2013; Tennant et al., 2018; Haas et al., 2019; Jayakumar et al., 2019) or on running-speed responses in visual cortex (Saleem et al., 2013). In flies, the feedback from eyes and halteres has been decoupled in simulated flight setups (Sherman and Dickinson, 2003). The lag between an action and the subsequent update of the virtual stimulation can also be changed. For instance, positional changes can be delayed or made jump- or teleportation-like (see Domnisoru et al., 2013; Kaupert et al., 2017; Stowers et al., 2017; Tennant et al., 2018, for examples from experiments in fish and rodents). Thus, in general, sensory and motor variables can be separated in VR.

3.1.3. Social interactions

The issues of laboratory vs. field work also apply to the study of social interactions. VR can alleviate some of them. Virtual stimuli can be designed to appear more similar to real-life counterparts than stimuli used in classical ethological experiments (Naik et al., 2020). A particular advantage is that stimuli can be animated. Even under open-loop conditions, moving prey can be simulated to study prey-capture (Ioannou et al., 2012) or conspecifics to probe mate-choice (Gierszewski et al., 2017). Further examples can be found in Naik et al. (2020).

An important point for experiments on social interactions is consistency (Powell and Rosenthal, 2017). Whether in the wild or in the laboratory, the behavior of real subjects depends on their motivation and will change through their interaction with others. Simulated subjects do not change their behavior in this manner (Chouinard-Thuly et al., 2017). Like other VR stimuli, socially relevant ones can be precisely controlled, held constant or adapted, and presented several times and to different subjects. Importantly, not just the static appearance is under close control but also the simulated movement patterns. In general, interaction with moving objects may sometimes be easier to simulate in VR than to provide in the real world; cf., e.g., the prey-capture study mentioned above (Ioannou et al., 2012).

3.2. VR goes beyond mere stimulus delivery

With VR the loop between perception and action can be closed. The participant in a VR experiment not only passively perceives the stimulation but behaves and interacts with it, which in turn changes the stimulus environment. This active engagement makes the VR experience much more reminiscent of real life and natural conditions than traditional, passive approaches.

The importance of motor actions for perception has been demonstrated, for instance, by experiments with mice moving on a treadmill while perceiving visual stimuli. Even when the treadmill is not coupled to the stimulation in such experiments, i.e., in open loop, neuronal responses in the visual cortex are substantially modulated (Niell and Stryker, 2010; Ayaz et al., 2013). More recently, impacts of movement have been described not only for vision (Dadarlat and Stryker, 2017; Clancy et al., 2019) but also for audition and somatosensation (Fu et al., 2014; Schneider and Mooney, 2018). A similar dependence of sensory processing on behavioral state has also been reported in insects (Maimon et al., 2010). In zebrafish, the interaction between motor responses and visual feedback (Portugues and Engert, 2011) and the related neural processing (Ahrens et al., 2012) have been investigated with closed-loop experiments, as have the visually driven swim patterns underlying natural prey capture (Trivedi and Bollmann, 2013).

Closed-loop designs are also helpful for decision-making studies in species such as mice (Harvey et al., 2012), gerbils (Kautzky and Thurley, 2016), and zebrafish (Bahl and Engert, 2020; Dragomir et al., 2020). These studies use rather abstract visual stimuli, like random dots and stripe patterns, which are admittedly not very naturalistic. However, enabling natural motor responses, like walking and swimming, is a key improvement over conventional designs that rely on nose-poking or lever-pressing.

3.3. VR enables recording of brain activity with bulky devices

One of the early motivations for using VR in neuroscience was that it permits neural recording techniques that require a high degree of mechanical stability or are too bulky and heavy to be carried by the animal (Dombeck et al., 2010; Harvey et al., 2012; Ahrens et al., 2013; Domnisoru et al., 2013; Schmidt-Hieber and Häusser, 2013; Leinweber et al., 2014). Similar reasons apply to the use of VR with fMRI or other methods for recording human brain activity (Lenormand and Piolino, 2022). The application of VR with a focus on recording brain activity in the behaving animal has, e.g., been reviewed by Dombeck and Reiser (2011). Technological progress will increasingly weaken this motivation for using VR in the future. Miniature head-mounted systems for imaging (Yu et al., 2015) and single-cell recordings (Valero and English, 2019) are under constant development and can be applied in freely moving animals.

3.4. Achieving immersion and presence in VR

Important concepts that are frequently invoked, especially in the context of human VR, are those of immersion and presence. How immersive a VR is, i.e., how strongly it draws the user in, is determined by the degree of sensory stimulation and the sensitivity to motor actions of the VR system in use. Deeper immersion leads to increased presence, i.e., the feeling of being in the virtual world (Bohil et al., 2011). For recreational or therapeutic applications with humans, high levels of immersion are surely desired and necessary. But what about scientific use?

Immersion does not seem to be the most important factor for investigating certain research questions. A VR setup could in principle serve merely as a tool to provide some sensory stimulation and to connect it to behavior. However, the ultimate goal of neuroscience is to investigate behavioral and brain responses that occur under natural conditions (Krakauer et al., 2017). A VR approach can only contribute to this goal if it elicits such responses. Yet a VR that evokes responses as in real life implies deeper immersion. Thus, ecological validity and immersion are linked.

But how can presence and immersion be determined? Humans can be questioned (Hofer et al., 2020), but what about animals? To determine and quantify the degree of immersion, two types of responses are at hand: neural and behavioral. However, not every type of neural activity that can be evoked necessarily occurs during natural behavior, so only behavioral responses are suitable for determining proximity to natural conditions (Krakauer et al., 2017). This requires a sufficient understanding, under real-world conditions, of the behavior to be elicited in VR.

A number of studies have compared physiological and psychological reactions between real-life situations and their virtual counterparts in humans (examples are reviewed by Lenormand and Piolino, 2022). Behavioral conformity between the virtual and real world is less regularly assessed with animals (Powell and Rosenthal, 2017). While such comparisons are more common in insects (see Dahmen et al., 2017, for an elegant example with ant spatial navigation) and spiders (Peckmezian and Taylor, 2015), work with rodents on this topic is scarce (Hölscher et al., 2005). Comparisons have instead been made of neural activity between real-world and virtual conditions (e.g., for space responses in rodent hippocampus: Chen et al., 2013; Ravassard et al., 2013; Safaryan and Mehta, 2021).

Immersion is often considered in the context of the quality of the visual stimulation, as vision is the dominant sensory modality in primates and flying insects. For other species, however, vision is not as dominant. In these animals, immersion and ecological validity will depend more strongly on other types of perception, e.g., sound, touch, or smell. For an example that may not seem like VR at first glance, but is consistent with the broad definition of VR favored in this article and is ecologically valid, see Faumont et al. (2011). In this study, osmo-sensitive neurons of the nematode Caenorhabditis elegans were optogenetically activated to simulate an aversive location in the animal's environment.

4. Technical components for naturalistic VR

Key technical components of VR are devices that provide sensory stimulation to create the virtual experience and those that keep track of the behavioral responses. How these components may promote naturalistic stimulation and behavior is discussed next.

4.1. Tracking movements and actions

In VR setups, participants are often restrained so that they can sense the stimuli appropriately while having enough freedom to move in the virtual environment. For instance, a specific position may need to be maintained in relation to a screen for visual stimulation or speakers for auditory stimulation. Other reasons are requirements on mechanical stability of neural recording devices as was already discussed above.

The type of fixation depends on the tested species. Flying insects may be tethered by the body, leaving the wings free to beat (Gray et al., 2002; Sherman and Dickinson, 2003; Dombeck and Reiser, 2011). Wing motion is monitored with an optical sensor, and the difference between the amplitudes of left and right wing beats serves as an indicator of attempted body rotations (Reiser and Dickinson, 2008). In legged animals, fixation on a treadmill is the standard technique (Carrel, 1972; Dahmen, 1980; Seelig et al., 2010; Takalo et al., 2012; Peckmezian and Taylor, 2015; Thurley and Ayaz, 2017; Haberkern et al., 2019; Naik et al., 2020). Such treadmills are typically styrofoam balls on an air cushion, cylindrical treadmills, or linear belts. The animals move the treadmill with their legs; this movement is captured and used to update the position in the virtual world. For animals like rodents, which have a natural need for walking (Meijer and Robbers, 2014), a treadmill offers a more natural way of responding, even in non-spatial tasks (Garbers et al., 2015; Kautzky and Thurley, 2016; Henke et al., 2021, 2022). To provide a realistic, natural feeling of motion, the physical properties of the treadmill, such as its moment of inertia, must be taken into account and adapted to the animal species. For instance, treadmills for ants have particularly low friction and weight (Dahmen et al., 2017).
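How such tracked motion might be integrated into a virtual pose can be sketched in a few lines. This is a minimal illustration under my own assumptions, not any published setup's code: `d_forward` and `d_yaw` stand in for calibrated readouts per frame, e.g., ball displacement from optical sensors or, for a tethered fly, a yaw increment proportional to the left-minus-right wing-beat amplitude difference.

```python
import math

def update_pose(x, y, heading, d_forward, d_yaw):
    """Integrate one frame of tracked motion into the virtual pose.
    `d_forward` is the displacement along the animal's body axis (e.g.,
    from ball rotation); `d_yaw` is the turn in radians (e.g., scaled
    from the left-minus-right wing-beat amplitude for a tethered fly).
    Returns the updated position and heading in world coordinates."""
    heading = (heading + d_yaw) % (2 * math.pi)
    x += d_forward * math.cos(heading)
    y += d_forward * math.sin(heading)
    return x, y, heading
```

Running this every frame turns the stream of treadmill or wing-beat readouts into the trajectory used to render the virtual scene; the calibration gains mapping raw sensor values to `d_forward` and `d_yaw` are setup-specific.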

Any type of fixation imposes unnatural movements and disrupts sensory feedback about motor behavior. Tethered insects do not receive normal input from their balance organs (Fry et al., 2008). Head-fixed rodents do not receive natural input about rotations and linear acceleration from their vestibular organs. They also have to make unnatural shearing movements with their legs on the treadmill to turn in the virtual environment (Thurley and Ayaz, 2017). A similar lack of vestibular input is found in zebrafish VRs in which the animals' heads are immobilized (e.g., Portugues and Engert, 2011). A solution to this problem is offered by VR setups for freely flying, walking, and swimming animals (Fry et al., 2008; Del Grosso et al., 2017; Stowers et al., 2017; Ferreiro et al., 2020; Madhav et al., 2022). These setups use cameras to track the position of the animal (or only its head) and update a perspective-correct visual scenery. Alternatively, tracking information can be used to drive a motorized treadmill that compensates for the animal's movements to hold it in place with respect to the VR hardware (Kaupert et al., 2017).

Several technical considerations apply to ensure proper tracking, especially to meet the needs of the experimental animal (see Naik et al., 2020). A number of different tracking methods exist based on deep learning and other machine learning techniques (e.g., Hedrick, 2008; Robie et al., 2017; Graving et al., 2019; Mathis and Mathis, 2020; Vagvolgyi et al., 2022).

Body fixation is also not required when the stimulus display is directly attached to the sense organ and can be carried, as with head-mounted displays. In VR headsets, head-mounted displays are combined with head-tracking hardware (Bohil et al., 2011; LaValle, 2020). Headsets prevail in human VR nowadays, but other tracking methods exist as well, such as treadmills for humans (examples are found in LaValle, 2020). With humans and other primates, joysticks, gamepads, or keyboards are often used to track motion and other responses (Washburn and Astur, 2003; Sato et al., 2004), for instance, when strict fixation is necessary, as in fMRI (Lenormand and Piolino, 2022).

4.2. Displaying visual stimuli

Visual virtual worlds are the predominant type of VR. They are almost exclusively provided in first-person view, i.e., from the point of view of the participant. Compared to a third-person perspective behind a visible avatar—which might be possible with humans but is hard to imagine with animals—the first-person view enhances the experience (Dolins et al., 2017). For presentation, different types of displays are used, such as simple monitors, panoramic projection screens, and head-mounted displays. Projections onto the floor below the animals are also leveraged, e.g., with zebrafish (Ahrens et al., 2012; Bahl and Engert, 2020; Dragomir et al., 2020). In insects, with their lower visual acuity but fast reaction times, LED displays are used (Dombeck and Reiser, 2011). For animals with frontally placed eyes, like primates and carnivorans, flat monitors may be sufficient. For animals with laterally positioned eyes, like rodents, only wide displays cover a sufficient part of the visual field (Dolins et al., 2017; Thurley and Ayaz, 2017). For ecological validity, the projection needs to have correct perspective and be undistorted (Dolins et al., 2017; Naik et al., 2020).
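Why flat monitors fall short for laterally eyed species follows from elementary trigonometry: the visual angle a flat screen subtends shrinks rapidly relative to a panoramic field of view. A small illustrative helper (function name and example numbers are my own, not from the literature):

```python
import math

def monitor_coverage_deg(width_cm, distance_cm):
    """Horizontal visual angle (in degrees) subtended by a flat screen
    of width `width_cm`, viewed centrally from `distance_cm` away."""
    return math.degrees(2 * math.atan(width_cm / (2 * distance_cm)))
```

For example, a 50 cm wide monitor viewed from 20 cm covers only about 103° horizontally, far less than a rodent's near-panoramic visual field, which is why wide, curved, or panoramic projection surfaces are preferred for such species.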

In general, it has to be kept in mind that images shown on displays are perceived differently by different animals (Chouinard-Thuly et al., 2017; Naik et al., 2020). Photoreceptor sensitivities differ across species (e.g., Osorio and Vorobyev, 2005), and the color display has to be adapted to the species' specifics to enable naturalistic stimulation. Behavioral methods can also be used to read out animals' sensitivities (Knorr et al., 2018). Other visual capabilities, like integration times and acuity, also vary between species and need to be accommodated. For a detailed discussion with a focus on technical challenges, see Naik et al. (2020). Similar considerations obviously apply to other sensory systems and have to be taken into account, especially when a naturalistic perceptual experience is intended.
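Adapting a display's color output to a species' photoreceptors can be framed as a linear problem: if the response of each receptor type to each display primary is known, primary drive levels that reproduce a target set of receptor excitations can be found by least squares. The sketch below uses an invented two-receptor, three-primary sensitivity matrix; all numbers are illustrative assumptions, not measured data.

```python
import numpy as np

# Hypothetical sensitivity matrix: response of each of a species' two
# photoreceptor types (rows) to the display's three primaries (columns)
# at unit drive. Purely illustrative values.
S = np.array([[0.1, 0.6, 0.3],
              [0.7, 0.2, 0.0]])

def primaries_for_excitation(S, target):
    """Least-squares drive levels of the display primaries that best
    reproduce the target photoreceptor excitations."""
    p, *_ = np.linalg.lstsq(S, target, rcond=None)
    return p
```

In practice, the sensitivity matrix would come from spectrophotometric measurements of the display primaries weighted by the species' receptor spectral sensitivities, and drive levels would additionally be clipped to the display's physical range.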

Images on a screen remain 2D, and natural vision is only partially achieved (Dolins et al., 2017). For instance, stereopsis is not possible with single images. Head-mounted displays in humans solve this by presenting offset images to each eye (LaValle, 2020). For an approach with insects, see Nityananda et al. (2016). Currently, there are no VR headsets for animals, although they may be in development, since, apart from species-specific needs, they mainly pose a miniaturization problem. Technology in this direction includes head-mounted camera systems to track eye movements (Meyer et al., 2018) and inertial sensors for head tracking in rodents (Venkatraman et al., 2010; Fayat et al., 2021).

4.3. Sound stimulation

To simulate 3D spatial sound scenes that mimic real-life situations, virtual acoustic approaches have been developed. Human VR headsets often include headphones to provide sound stimuli in conjunction with the visual display. Alternatively, in free-field auralization, arrays of loudspeakers are placed around the user such that sound sources can be precisely positioned in virtual space (Seeber et al., 2010). In contrast to headphones, the user listens with their own ears, so the acoustic characteristics of the individual ears are preserved; experiments can therefore also be done with hearing-aid wearers. A disadvantage is that the setup has to be placed in an anechoic chamber, which is demanding and expensive to construct. For correct delivery of sound cues, the user has to be placed at a specific location with respect to the array. With such auditory VR setups, for example, auditory motion parallax could be demonstrated in humans (Genzel et al., 2018). In rodents, virtual acoustics is implemented with loudspeakers placed around the treadmill (Cushman et al., 2013; Funamizu et al., 2016). Other approaches use an augmentation of a real arena rather than virtual acoustics to probe spatial localization of objects with the help of acoustic stimulation (Ferreiro et al., 2020; Amaro et al., 2021).
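The basic idea of positioning a virtual sound source between loudspeakers can be illustrated with constant-power amplitude panning between just two speakers. Real free-field auralization systems use denser arrays and more sophisticated rendering, so this is only a conceptual sketch; the `span` parameterization is my own assumption.

```python
import math

def pan_gains(azimuth_deg, span_deg=45.0):
    """Constant-power gains for a virtual source between two loudspeakers
    at ±`span_deg`; `azimuth_deg` ranges over [-span_deg, span_deg],
    with 0 placing the source at the center. Returns (left, right)."""
    theta = (azimuth_deg / span_deg + 1) * math.pi / 4  # map to [0, pi/2]
    return math.cos(theta), math.sin(theta)
```

The squared gains always sum to one, so the perceived loudness stays constant while the source position shifts; moving the azimuth over time would simulate a moving source, as in the motion-parallax experiments mentioned above.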

4.4. Tactile and haptic stimulation

In VR, tactile and haptic stimulation can also be provided, simulating surfaces with different textures or the feel of forces (Bohil et al., 2011). Haptic systems for humans consist, e.g., of robotic arms with which force or pressure can be applied, or of pin arrays that simulate surfaces (Culbertson et al., 2018; Wang et al., 2019). In tactile VR systems for rodents, the animals move through corridors simulated by movable plates (Sofroniew et al., 2014) or rotating cylinders with different textures (Ayaz et al., 2019). These “walls” are touched by the animals with their whiskers and are adjusted in closed loop according to the animal's movements. Similar setups exist in which the animals are freely moving; these are not actually VR but still allow for simulating different tactile textures (Kerekes et al., 2017). Belt treadmills can also be equipped with tactile cues (Geiller et al., 2017).

4.5. Odors

Recently, devices have been developed to deliver odorants quickly and precisely, with diffusion and clearance times sufficient for simulating spatially confined olfactory cues. Examples for use with humans are Salminen et al. (2018) and Micaroni et al. (2019). In animal studies, olfactory VR has been used with tethered rodents (Radvansky and Dombeck, 2018; Fischler-Ruiz et al., 2021; Radvansky et al., 2021) and insects (Gray et al., 2002). Precise odor delivery poses a problem for freely moving VRs: either a distribution system has to be carried on the body or odors have to be delivered at room scale (Fry et al., 2008). The latter, however, is hard to control in terms of odor concentration and distribution, preventing proper localization. Systems for humans that simulate taste are under development (Narumi et al., 2011; Vi et al., 2017; Kerruish, 2019) but, to my knowledge, have not yet been used in neuroscience.

4.6. Rotation and gravity

VR setups that require fixation of the animal typically provide only inadequate information about rotational and linear acceleration. To overcome this problem, motion platforms with multiple degrees of freedom or rotating chairs providing horizontal rotations have been used for vestibular stimulation (Gu et al., 2010; Dokka et al., 2011; Genzel et al., 2016; Garzorz and MacNeilage, 2017). Similarly, rotational gimbals have been used with flies (Sherman and Dickinson, 2003). VR setups that allow for free movement do not suffer from these problems.

5. Limitations and potentials for naturalistic VR

Technical considerations for VRs with respect to species specifics and naturalistic experiments have already been discussed above. Here I address some more general issues.

5.1. Not everything can be tested in VR in terms of naturalistic experiments

There is, in principle, no limitation on what can be simulated with VR. For the purpose of the present article, the simulation just needs to be naturalistic. We can intuitively judge how a VR simulation affects a human participant—or often simply take it for granted that we can—but this is impossible with animals. Thus, as I have argued above, naturalistic approaches must ensure that a VR simulation elicits the same behaviors that would occur in its real-world counterpart (Krakauer et al., 2017; Powell and Rosenthal, 2017). This strongly constrains what can and cannot be done with VR in terms of naturalistic experiments. When a strict comparison between the real world and VR is not possible, such as with the teleportation-like position changes mentioned above, the experiment is not suitable for a naturalistic VR study. Other questions may be better investigated directly in the real world rather than by investing in a VR with all its limitations.

5.2. How natural can VR become and how natural or real does it have to be?

As pointed out by LaValle (2020), it is tempting to try to match the physical world in VR as closely as possible (the universal simulation principle). Such a goal is misguided, since a simulation will never be perfect and will always contain unanticipated confounding variables. The design of a VR should instead be guided by the research objective. A sensible design can at times mean reduction and simplification, without losing ecological validity (Bucci-Mansilla et al., 2021). Related to this is the uncanny valley phenomenon, in which the high realism of an artificial stimulus makes observers feel uneasy (Chouinard-Thuly et al., 2017; LaValle, 2020). In non-human animals, this problem has been described in macaques (Steckenfinger and Ghazanfar, 2009).

5.3. VR sickness and fatigue

A regularly encountered problem with human VR applications is cyber, simulator, or VR sickness (Bohil et al., 2011; LaValle, 2020). Some participants experience discomfort and nausea due to latencies in the synchronization of the VR components, which result in incongruent sensory inputs. Of particular importance here is vestibular feedback from self-motion that does not match the visual input. This problem may arise from improper tracking, but also from the VR designer misunderstanding or disregarding the user's perspective (LaValle, 2020). Relatedly, fatigue can arise. Whereas fatigue can certainly be accounted for in animal studies—consider, for example, a treadmill that is too heavy or creates too much friction (Dahmen et al., 2017)—analogs of VR sickness in animals may be difficult to determine. Animal VRs can suffer from unnatural feedback to different senses (Dombeck and Reiser, 2011; Thurley and Ayaz, 2017), as exemplified by the issues of head fixation with regard to hippocampal space-related activity in rodents discussed above.
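The link between loop latency and incongruent sensory input can be quantified with a back-of-the-envelope calculation. The following toy sketch (not from the cited works) assumes a head rotation at constant angular velocity and a fixed motion-to-photon latency; the displayed scene then lags the vestibular signal by the product of the two.

```python
# Toy illustration of why latency causes visual-vestibular conflict:
# during a head turn, the rendered scene lags the vestibular signal by
# latency * angular velocity. Parameter values are assumptions.

def visual_vestibular_mismatch_deg(angular_velocity_dps, latency_s):
    """Angular discrepancy (degrees) between the vestibular estimate of
    heading and the visually displayed heading caused by loop latency."""
    return angular_velocity_dps * latency_s

# A brisk 100 deg/s head turn with 50 ms motion-to-photon latency
# produces a 5-degree visual lag; at 20 ms it shrinks to 2 degrees.
print(visual_vestibular_mismatch_deg(100.0, 0.050))
print(visual_vestibular_mismatch_deg(100.0, 0.020))
```

Small reductions in latency therefore shrink the sensory conflict proportionally, which is why tracking and rendering latency budgets are central to avoiding VR sickness.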

6. Conclusions

In this article, I have tried to show that VR has a multitude of applications in neuroscience that can help advance from traditional laboratory-based to naturalistic research. VR can mediate between the opposing poles of ecological validity and experimental control, facilitating the generalization of laboratory results to the situation in the wild. As with any scientific approach, the means have to be adapted to the research question; a specific and perhaps novel technology or method does not accomplish this by itself (Minderer et al., 2016; Thurley and Ayaz, 2017). When designing a VR, it is important to consider the specifics of the model species. Only then can immersion be achieved, which results in a naturalistic experience and ecological validity. How immersive a VR experience is can only be determined through behavioral readout, compared against real-world behavior. Otherwise, VR experiments will likely elicit unnatural behaviors and neural responses that are unrelated to the intended research questions (Krakauer et al., 2017; Powell and Rosenthal, 2017).

Developers of VR for humans, especially for consumer applications or therapy, have realized that VR cannot be built without knowledge of our senses, our perception, and ultimately our brains (LaValle, 2020). This insight closes the cycle for the present article—and presents a somewhat circular argument for the use of VR in naturalistic neuroscience: VR is used in neuroscience to gain insights into perception, behavior, and brain function. Yet good VR experiments that are also naturalistic and ecologically valid can only be conducted if the subjects' perception and behavior, and knowledge of their physiological basis, are sensibly taken into account.

Author contributions

The author confirms being the sole contributor of this work and has approved it for publication.

Acknowledgments

I wish to thank the two reviewers for their detailed comments and valuable suggestions, which contributed substantially to the improvement of the present article. I am also grateful for ongoing support from the Bernstein Center Munich.

Conflict of interest

The author declares that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Publisher's note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

References

Aghajan, Z. M., Acharya, L., Moore, J. J., Cushman, J. D., Vuong, C., and Mehta, M. R. (2014). Impaired spatial selectivity and intact phase precession in two-dimensional virtual reality. Nat Neurosci. 18, 121–128. doi: 10.1038/nn.3884

Ahrens, M. B., Huang, K. H., Narayan, S., Mensh, B. D., and Engert, F. (2013). Two-photon calcium imaging during fictive navigation in virtual environments. Front. Neural Circuits 7:104. doi: 10.3389/fncir.2013.00104

Ahrens, M. B., Li, J. M., Orger, M. B., Robson, D. N., Schier, A. F., Engert, F., et al. (2012). Brain-wide neuronal dynamics during motor adaptation in zebrafish. Nature 485, 471–477. doi: 10.1038/nature11057

Allritz, M., Call, J., Schweller, K., McEwen, E. S., de Guinea, M., Janmaat, K. R. L., et al. (2022). Chimpanzees (Pan troglodytes) navigate to find hidden fruit in a virtual environment. Sci. Adv. 8:eabm4754. doi: 10.1126/sciadv.abm4754

Amaro, D., Ferreiro, D. N., Grothe, B., and Pecka, M. (2021). Source identity shapes spatial preference in primary auditory cortex during active navigation. Curr. Biol. 31, 3875–3883.e5. doi: 10.1016/j.cub.2021.06.025

Aronov, D., and Tank, D. W. (2014). Engagement of neural circuits underlying 2D spatial navigation in a rodent virtual reality system. Neuron 84, 442–456. doi: 10.1016/j.neuron.2014.08.042

Ayaz, A., Saleem, A. B., Schölvinck, M. L., and Carandini, M. (2013). Locomotion controls spatial integration in mouse visual cortex. Curr. Biol. 23, 890–894. doi: 10.1016/j.cub.2013.04.012

Ayaz, A., Stäuble, A., Hamada, M., Wulf, M.-A., Saleem, A. B., and Helmchen, F. (2019). Layer-specific integration of locomotion and sensory information in mouse barrel cortex. Nat. Commun. 10:2585. doi: 10.1038/s41467-019-10564-8

Bahl, A., and Engert, F. (2020). Neural circuits for evidence accumulation and decision making in larval zebrafish. Nat. Neurosci. 23, 94–102. doi: 10.1038/s41593-019-0534-9

Bohil, C. J., Alicea, B., and Biocca, F. A. (2011). Virtual reality in neuroscience research and therapy. Nat. Rev. Neurosci. 12, 752–762. doi: 10.1038/nrn3122

Bucci-Mansilla, G., Vicencio-Jimenez, S., Concha-Miranda, M., and Loyola-Navarro, R. (2021). Challenging paradigms through ecological neuroscience: Lessons from visual models. Front. Neurosci. 15:758388. doi: 10.3389/fnins.2021.758388

Carrel, J. S. (1972). An improved treading device for tethered insects. Science 175:1279. doi: 10.1126/science.175.4027.1279.b

Chen, G., King, J. A., Burgess, N., and O'Keefe, J. (2013). How vision and movement combine in the hippocampal place code. Proc. Natl. Acad. Sci. U.S.A. 110, 378–383. doi: 10.1073/pnas.1215834110

Chen, G., King, J. A., Lu, Y., Cacucci, F., and Burgess, N. (2018). Spatial cell firing during virtual navigation of open arenas by head-restrained mice. eLife 7:e26. doi: 10.7554/eLife.34789.026

Chouinard-Thuly, L., Gierszewski, S., Rosenthal, G. G., Reader, S. M., Rieucau, G., Woo, K. L., et al. (2017). Technical and conceptual considerations for using animated stimuli in studies of animal behavior. Curr. Zool. 63, 5–19. doi: 10.1093/cz/zow104

Clancy, K. B., Orsolic, I., and Mrsic-Flogel, T. D. (2019). Locomotion-dependent remapping of distributed cortical networks. Nat. Neurosci. 22, 778–786. doi: 10.1038/s41593-019-0357-8

Culbertson, H., Schorr, S. B., and Okamura, A. M. (2018). Haptics: the present and future of artificial touch sensation. Annu. Rev. Control Robot. Auton. Syst. 1, 385–409. doi: 10.1146/annurev-control-060117-105043

Cushman, J. D., Aharoni, D. B., Willers, B., Ravassard, P., Kees, A., Vuong, C., et al. (2013). Multisensory control of multimodal behavior: do the legs know what the tongue is doing? PLoS ONE 8:e80465. doi: 10.1371/journal.pone.0080465

Dadarlat, M. C., and Stryker, M. P. (2017). Locomotion enhances neural encoding of visual stimuli in mouse v1. J. Neurosci. 37, 3764–3775. doi: 10.1523/JNEUROSCI.2728-16.2017

Dahmen, H. (1980). A simple apparatus to investigate the orientation of walking insects. Cell. Mol. Life Sci. 36, 685–687. doi: 10.1007/BF01970140

Dahmen, H., Wahl, V. L., Pfeffer, S. E., Mallot, H. A., and Wittlinger, M. (2017). Naturalistic path integration of Cataglyphis desert ants on an air-cushioned lightweight spherical treadmill. J. Exp. Biol. 220, 634–644. doi: 10.1242/jeb.148213

De Lillo, C., Kirby, M., and James, F. C. (2014). Spatial working memory in immersive virtual reality foraging: path organization, traveling distance and search efficiency in humans (Homo sapiens). Am. J. Primatol. 76, 436–446. doi: 10.1002/ajp.22195

Del Grosso, N. A., Graboski, J. J., Chen, W., Blanco-Hernández, E., and Sirota, A. (2017). Virtual reality system for freely-moving rodents. bioRxiv [preprint]. doi: 10.1101/161232

Dennis, E. J., El Hady, A., Michaiel, A., Clemens, A., Tervo, D. R. G., Voigts, J., and Datta, S. R. (2021). Systems neuroscience of natural behaviors in rodents. J. Neurosci. 41, 911–919. doi: 10.1523/JNEUROSCI.1877-20.2020

Dokka, K., MacNeilage, P. R., DeAngelis, G. C., and Angelaki, D. E. (2011). Estimating distance during self-motion: a role for visual-vestibular interactions. J. Vis. 11, 1–16. doi: 10.1167/11.13.2

Dolins, F. L., Klimowicz, C., Kelley, J., and Menzel, C. R. (2014). Using virtual reality to investigate comparative spatial cognitive abilities in chimpanzees and humans. Am. J. Primatol. 76, 496–513. doi: 10.1002/ajp.22252

Dolins, F. L., Schweller, K., and Milne, S. (2017). Technology advancing the study of animal cognition: using virtual reality to present virtually simulated environments to investigate nonhuman primate spatial cognition. Curr. Zool. 63, 97–108. doi: 10.1093/cz/zow121

Dombeck, D. A., Harvey, C. D., Tian, L., Looger, L. L., and Tank, D. W. (2010). Functional imaging of hippocampal place cells at cellular resolution during virtual navigation. Nat. Neurosci. 13, 1433–1440. doi: 10.1038/nn.2648

Dombeck, D. A., and Reiser, M. B. (2011). Real neuroscience in virtual worlds. Curr. Opin. Neurobiol. 22, 3–10. doi: 10.1016/j.conb.2011.10.015

Domnisoru, C., Kinkhabwala, A. A., and Tank, D. W. (2013). Membrane potential dynamics of grid cells. Nature 495, 199–204. doi: 10.1038/nature11973

Dragomir, E. I., Štih, V., and Portugues, R. (2020). Evidence accumulation during a sensorimotor decision task revealed by whole-brain imaging. Nat. Neurosci. 23, 85–93. doi: 10.1038/s41593-019-0535-8

Faumont, S., Rondeau, G., Thiele, T. R., Lawton, K. J., McCormick, K. E., Sottile, M., et al. (2011). An image-free opto-mechanical system for creating virtual environments and imaging neuronal activity in freely moving Caenorhabditis elegans. PLOS ONE 6:e24666. doi: 10.1371/journal.pone.0024666

Fayat, R., Delgado Betancourt, V., Goyallon, T., Petremann, M., Liaudet, P., Descossy, V., et al. (2021). Inertial measurement of head tilt in rodents: principles and applications to vestibular research. Sensors 21:6318. doi: 10.3390/s21186318

Ferreiro, D. N., Amaro, D., Schmidtke, D., Sobolev, A., Gundi, P., Belliveau, L., et al. (2020). Sensory island task (sit): a new behavioral paradigm to study sensory perception and neural processing in freely moving animals. Front. Behav. Neurosci. 14:576154. doi: 10.3389/fnbeh.2020.576154

Fischler-Ruiz, W., Clark, D. G., Joshi, N. R., Devi-Chou, V., Kitch, L., Schnitzer, M., et al. (2021). Olfactory landmarks and path integration converge to form a cognitive spatial map. Neuron, 109, 4036–4049.e5. doi: 10.1016/j.neuron.2021.09.055

Fry, S. N., Rohrseitz, N., Straw, A. D., and Dickinson, M. H. (2008). Trackfly: virtual reality for a behavioral system analysis in free-flying fruit flies. J. Neurosci. Methods 171, 110–117. doi: 10.1016/j.jneumeth.2008.02.016

Fu, Y., Tucciarone, J. M., Espinosa, J. S., Sheng, N., Darcy, D. P., Nicoll, R. A., et al. (2014). A cortical circuit for gain control by behavioral state. Cell 156, 1139–1152. doi: 10.1016/j.cell.2014.01.050

Funamizu, A., Kuhn, B., and Doya, K. (2016). Neural substrate of dynamic bayesian inference in the cerebral cortex. Nat. Neurosci. 19, 1682–1689. doi: 10.1038/nn.4390

Garbers, C., Henke, J., Leibold, C., Wachtler, T., and Thurley, K. (2015). Contextual processing of brightness and color in Mongolian gerbils. J. Vis. 15:13. doi: 10.1167/15.1.13

Garzorz, I. T., and MacNeilage, P. R. (2017). Visual-vestibular conflict detection depends on fixation. Curr. Biol. 27, 2856–2861.e4. doi: 10.1016/j.cub.2017.08.011

Geiller, T., Fattahi, M., Choi, J.-S., and Royer, S. (2017). Place cells are more strongly tied to landmarks in deep than in superficial ca1. Nat. Commun. 8:14531. doi: 10.1038/ncomms14531

Genzel, D., Firzlaff, U., Wiegrebe, L., and MacNeilage, P. R. (2016). Dependence of auditory spatial updating on vestibular, proprioceptive, and efference copy signals. J. Neurophysiol. 116, 765–775. doi: 10.1152/jn.00052.2016

Genzel, D., Schutte, M., Brimijoin, W. O., MacNeilage, P. R., and Wiegrebe, L. (2018). Psychophysical evidence for auditory motion parallax. Proc. Natl. Acad. Sci. U.S.A. 115, 4264–4269. doi: 10.1073/pnas.1712058115

Gierszewski, S., Müller, K., Smielik, I., Hütwohl, J.-M., Kuhnert, K.-D., and Witte, K. (2017). The virtual lover: variable and easily guided 3d fish animations as an innovative tool in mate-choice experiments with Sailfin mollies-II. Validation. Curr. Zool. 63, 65–74. doi: 10.1093/cz/zow108

Gottlieb, J., and Oudeyer, P.-Y. (2018). Towards a neuroscience of active sampling and curiosity. Nat. Rev. Neurosci. 19, 758–770. doi: 10.1038/s41583-018-0078-0

Graving, J. M., Chae, D., Naik, H., Li, L., Koger, B., Costelloe, B. R., et al. (2019). Deepposekit, a software toolkit for fast and robust animal pose estimation using deep learning. eLife 8:e47994. doi: 10.7554/eLife.47994.sa2

Gray, J. R., Pawlowski, V., and Willis, M. A. (2002). A method for recording behavior and multineuronal CNS activity from tethered insects flying in virtual space. J. Neurosci. Methods 120, 211–223. doi: 10.1016/S0165-0270(02)00223-6

Gu, Y., Fetsch, C. R., Adeyemo, B., Deangelis, G. C., and Angelaki, D. E. (2010). Decoding of MSTD population activity accounts for variations in the precision of heading perception. Neuron 66, 596–609. doi: 10.1016/j.neuron.2010.04.026

Haas, O. V., Henke, J., Leibold, C., and Thurley, K. (2019). Modality-specific subpopulations of place fields coexist in the hippocampus. Cereb. Cortex 29, 1109–1120. doi: 10.1093/cercor/bhy017

Haberkern, H., Basnak, M. A., Ahanonu, B., Schauder, D., Cohen, J. D., Bolstad, M., et al. (2019). Visually guided behavior and optogenetically induced learning in head-fixed flies exploring a virtual landscape. Curr. Biol. 29, 1647–1659.e8. doi: 10.1016/j.cub.2019.04.033

Harvey, C. D., Coen, P., and Tank, D. W. (2012). Choice-specific sequences in parietal cortex during a virtual-navigation decision task. Nature 484, 62–68. doi: 10.1038/nature10918

Hedrick, T. L. (2008). Software techniques for two- and three-dimensional kinematic measurements of biological and biomimetic systems. Bioinspir. Biomimet. 3:034001. doi: 10.1088/1748-3182/3/3/034001

Henke, J., Bunk, D., von Werder, D., Häusler, S., Flanagin, V. L., and Thurley, K. (2021). Distributed coding of duration in rodent prefrontal cortex during time reproduction. eLife 10:e71612. doi: 10.7554/eLife.71612

Henke, J., Flanagin, V. L., and Thurley, K. (2022). A virtual reality time reproduction task for rodents. Front. Behav. Neurosci. 16:957804. doi: 10.3389/fnbeh.2022.957804

Hofer, M., Hartmann, T., Eden, A., Ratan, R., and Hahn, L. (2020). The role of plausibility in the experience of spatial presence in virtual environments. Front. Virt. Real. 1:2. doi: 10.3389/frvir.2020.00002

Hölscher, C., Schnee, A., Dahmen, H., Setia, L., and Mallot, H. A. (2005). Rats are able to navigate in virtual environments. J. Exp. Biol. 208(Pt 3), 561–569. doi: 10.1242/jeb.01371

Ioannou, C. C., Guttal, V., and Couzin, I. D. (2012). Predatory fish select for coordinated collective motion in virtual prey. Science 337, 1212–1215. doi: 10.1126/science.1218919

Jayakumar, R. P., Madhav, M. S., Savelli, F., Blair, H. T., Cowan, N. J., and Knierim, J. J. (2019). Recalibration of path integration in hippocampal place cells. Nature 566, 533–537. doi: 10.1038/s41586-019-0939-3

Kaupert, U., Thurley, K., Frei, K., Bagorda, F., Schatz, A., Tocker, G., et al. (2017). Spatial cognition in a virtual reality home-cage extension for freely moving rodents. J. Neurophysiol. 117, 1736–1748. doi: 10.1152/jn.00630.2016

Kaushik, P. K., Renz, M., and Olsson, S. B. (2020). Characterizing long-range search behavior in diptera using complex 3d virtual environments. Proc. Natl. Acad. Sci. U.S.A. 117, 12201–12207. doi: 10.1073/pnas.1912124117

Kautzky, M., and Thurley, K. (2016). Estimation of self-motion duration and distance in rodents. R. Soc. Open Sci. 3:160118. doi: 10.1098/rsos.160118

Kerekes, P., Daret, A., Shulz, D. E., and Ego-Stengel, V. (2017). Bilateral discrimination of tactile patterns without whisking in freely running rats. J. Neurosci. 37, 7567–7579. doi: 10.1523/JNEUROSCI.0528-17.2017

Kerruish, E. (2019). Arranging sensations: smell and taste in augmented and virtual reality. Senses Soc. 14, 31–45. doi: 10.1080/17458927.2018.1556952

Knorr, A. G., Gravot, C. M., Gordy, C., Glasauer, S., and Straka, H. (2018). I spy with my little eye: a simple behavioral assay to test color sensitivity on digital displays. Biol. Open 7:bio035725. doi: 10.1242/bio.035725

Krakauer, J. W., Ghazanfar, A. A., Gomez-Marin, A., MacIver, M. A., and Poeppel, D. (2017). Neuroscience needs behavior: correcting a reductionist bias. Neuron 93, 480–490. doi: 10.1016/j.neuron.2016.12.041

Laczó, J., Andel, R., Vyhnalek, M., Vlcek, K., Magerova, H., Varjassyova, A., et al. (2010). Human analogue of the Morris water maze for testing subjects at risk of Alzheimer's disease. Neuro-degener. Dis. 7, 148–152. doi: 10.1159/000289226

Lakshminarasimhan, K. J., Avila, E., Neyhart, E., DeAngelis, G. C., Pitkow, X., and Angelaki, D. E. (2020). Tracking the mind's eye: primate gaze behavior during virtual visuomotor navigation reflects belief dynamics. Neuron 106, 662–674.e5. doi: 10.1016/j.neuron.2020.02.023

Lakshminarasimhan, K. J., Petsalis, M., Park, H., DeAngelis, G. C., Pitkow, X., and Angelaki, D. E. (2018). A dynamic Bayesian observer model reveals origins of bias in visual path integration. Neuron 99, 194–206.e5. doi: 10.1016/j.neuron.2018.05.040

LaValle, S. (2020). Virtual Reality. Available online at: http://lavalle.pl/vr/

Leinweber, M., Zmarz, P., Buchmann, P., Argast, P., Hübener, M., Bonhoeffer, T., et al. (2014). Two-photon calcium imaging in mice navigating a virtual reality environment. J. Vis. Exp. 2014:e50885. doi: 10.3791/50885

Lenormand, D., and Piolino, P. (2022). In search of a naturalistic neuroimaging approach: exploration of general feasibility through the case of VR-fMRI and application in the domain of episodic memory. Neurosci. Biobehav. Rev. 133:104499. doi: 10.1016/j.neubiorev.2021.12.022

Madhav, M. S., Jayakumar, R. P., Lashkari, S. G., Savelli, F., Blair, H. T., Knierim, J. J., et al. (2022). The Dome: a virtual reality apparatus for freely locomoting rodents. J. Neurosci. Methods 368:109336. doi: 10.1016/j.jneumeth.2021.109336

Maimon, G., Straw, A. D., and Dickinson, M. H. (2010). Active flight increases the gain of visual motion processing in drosophila. Nat. Neurosci. 13, 393–399. doi: 10.1038/nn.2492

Mathis, M. W., and Mathis, A. (2020). Deep learning tools for the measurement of animal behavior in neuroscience. Curr. Opin. Neurobiol. 60, 1–11. doi: 10.1016/j.conb.2019.10.008

Meijer, J. H., and Robbers, Y. (2014). Wheel running in the wild. Proc. R. Soc. B 281:20140210. doi: 10.1098/rspb.2014.0210

Meyer, A. F., Poort, J., O'Keefe, J., Sahani, M., and Linden, J. F. (2018). A head-mounted camera system integrates detailed behavioral monitoring with multichannel electrophysiology in freely moving mice. Neuron 100, 46–60.e7. doi: 10.1016/j.neuron.2018.09.020

Micaroni, L., Carulli, M., Ferrise, F., Gallace, A., and Bordegoni, M. (2019). An olfactory display to study the integration of vision and olfaction in a virtual reality environment. J. Comput. Inform. Sci. Eng. 19:031015. doi: 10.1115/1.4043068

Minderer, M., Harvey, C. D., Donato, F., and Moser, E. I. (2016). Neuroscience: virtual reality explored. Nature 533, 324–325. doi: 10.1038/nature17899

Naik, H., Bastien, R., Navab, N., and Couzin, I. D. (2020). Animals in virtual environments. IEEE Trans. Visual. Comput. Graph. 26, 2073–2083. doi: 10.1109/TVCG.2020.2973063

Narumi, T., Nishizaka, S., Kajinami, T., Tanikawa, T., and Hirose, M. (2011). “Augmented reality flavors: gustatory display based on edible marker and cross-modal interaction,” in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI '11 (New York, NY: Association for Computing Machinery), 93–102. doi: 10.1145/1978942.1978957

Niell, C. M., and Stryker, M. P. (2010). Modulation of visual responses by behavioral state in mouse visual cortex. Neuron 65, 472–479. doi: 10.1016/j.neuron.2010.01.033

Nityananda, V., Tarawneh, G., Rosner, R., Nicolas, J., Crichton, S., and Read, J. (2016). Insect stereopsis demonstrated using a 3D insect cinema. Sci. Rep. 6:18718. doi: 10.1038/srep18718

Osorio, D., and Vorobyev, M. (2005). Photoreceptor spectral sensitivities in terrestrial animals: adaptations for luminance and colour vision. Proc. R. Soc. B Biol. Sci. 272, 1745–1752. doi: 10.1098/rspb.2005.3156

Parsons, T. D. (2015). Virtual reality for enhanced ecological validity and experimental control in the clinical, affective and social neurosciences. Front. Hum. Neurosci. 9:660. doi: 10.3389/fnhum.2015.00660

Peckmezian, T., and Taylor, P. W. (2015). A virtual reality paradigm for the study of visually mediated behaviour and cognition in spiders. Anim. Behav. 107, 87–95. doi: 10.1016/j.anbehav.2015.06.018

Petzschner, F. H., and Glasauer, S. (2011). Iterative Bayesian estimation as an explanation for range and regression effects: a study on human path integration. J. Neurosci. 31, 17220–17229. doi: 10.1523/JNEUROSCI.2028-11.2011

Portugues, R., and Engert, F. (2011). Adaptive locomotor behavior in larval Zebrafish. Front. Syst. Neurosci. 5:72. doi: 10.3389/fnsys.2011.00072

Powell, D. L., and Rosenthal, G. G. (2017). What artifice can and cannot tell us about animal behavior. Curr. Zool. 63, 21–26. doi: 10.1093/cz/zow091

Radvansky, B. A., and Dombeck, D. A. (2018). An olfactory virtual reality system for mice. Nat. Commun. 9:839. doi: 10.1038/s41467-018-03262-4

Radvansky, B. A., Oh, J. Y., Climer, J. R., and Dombeck, D. A. (2021). Behavior determines the hippocampal spatial mapping of a multisensory environment. Cell Rep. 36:109444. doi: 10.1016/j.celrep.2021.109444

Ravassard, P., Kees, A., Willers, B., Ho, D., Aharoni, D., Cushman, J., et al. (2013). Multisensory control of hippocampal spatiotemporal selectivity. Science 340, 1342–1346. doi: 10.1126/science.1232655

Reiser, M. B., and Dickinson, M. H. (2008). A modular display system for insect behavioral neuroscience. J. Neurosci. Methods 167, 127–139. doi: 10.1016/j.jneumeth.2007.07.019

Robie, A. A., Seagraves, K. M., Egnor, S. E. R., Branson, K., Levine, J. D., Kronauer, D. J. C., et al. (2017). Machine vision methods for analyzing social interactions. J. Exp. Biol. 220, 25–34. doi: 10.1242/jeb.142281

Robinson, E. M., and Wiener, M. (2021). Dissociable neural indices for time and space estimates during virtual distance reproduction. NeuroImage 226:117607. doi: 10.1016/j.neuroimage.2020.117607

Safaryan, K., and Mehta, M. R. (2021). Enhanced hippocampal theta rhythmicity and emergence of eta oscillation in virtual reality. Nat. Neurosci. 24, 1065–1070. doi: 10.1038/s41593-021-00871-z

Saleem, A. B., Ayaz, A., Jeffery, K. J., Harris, K. D., and Carandini, M. (2013). Integration of visual motion and locomotion in mouse visual cortex. Nat. Neurosci. 16, 1864–1869. doi: 10.1038/nn.3567

Salminen, K., Rantala, J., Isokoski, P., Lehtonen, M., Müller, P., Karjalainen, M., et al. (2018). “Olfactory display prototype for presenting and sensing authentic and synthetic odors,” in Proceedings of the 20th ACM International Conference on Multimodal Interaction, ICMI '18 (New York, NY: Association for Computing Machinery), 73–77. doi: 10.1145/3242969.3242999

Sato, N., Sakata, H., Tanaka, Y., and Taira, M. (2004). Navigation in virtual environment by the macaque monkey. Behav. Brain Res. 153, 287–291. doi: 10.1016/j.bbr.2003.10.026

Schmidt-Hieber, C., and Häusser, M. (2013). Cellular mechanisms of spatial navigation in the medial entorhinal cortex. Nat. Neurosci. 16, 325–331. doi: 10.1038/nn.3340

Schneider, D. M., and Mooney, R. (2018). How movement modulates hearing. Annu. Rev. Neurosci. 41, 553–572. doi: 10.1146/annurev-neuro-072116-031215

Seeber, B. U., Kerber, S., and Hafter, E. R. (2010). A system to simulate and reproduce audio-visual environments for spatial hearing research. Hear. Res. 260, 1–10. doi: 10.1016/j.heares.2009.11.004

Seelig, J. D., Chiappe, M. E., Lott, G. K., Dutta, A., Osborne, J. E., Reiser, M. B., et al. (2010). Two-photon calcium imaging from head-fixed Drosophila during optomotor walking behavior. Nat. Methods 7, 535–540. doi: 10.1038/nmeth.1468

Sherman, A., and Dickinson, M. H. (2003). A comparison of visual and haltere-mediated equilibrium reflexes in the fruit fly Drosophila melanogaster. J. Exp. Biol. 206, 295–302. doi: 10.1242/jeb.00075

Sofroniew, N. J., Cohen, J. D., Lee, A. K., and Svoboda, K. (2014). Natural whisker-guided behavior by head-fixed mice in tactile virtual reality. J. Neurosci. 34, 9537–9550. doi: 10.1523/JNEUROSCI.0712-14.2014

Sonkusare, S., Breakspear, M., and Guo, C. (2019). Naturalistic stimuli in neuroscience: critically acclaimed. Trends Cogn. Sci. 23, 699–714. doi: 10.1016/j.tics.2019.05.004

Steckenfinger, S. A., and Ghazanfar, A. A. (2009). Monkey visual behavior falls into the uncanny valley. Proc. Natl. Acad. Sci. U.S.A. 106, 18362–18366. doi: 10.1073/pnas.0910063106

Stowers, J. R., Hofbauer, M., Bastien, R., Griessner, J., Higgins, P., Farooqui, S., et al. (2017). Virtual reality for freely moving animals. Nat. Methods 14, 995–1002. doi: 10.1038/nmeth.4399

Taillade, M., N'Kaoua, B., and Gross, C. (2019). Navigation strategy in macaque monkeys: an exploratory experiment in virtual reality. J. Neurosci. Methods 326:108336. doi: 10.1016/j.jneumeth.2019.108336

Takalo, J., Piironen, A., Honkanen, A., Lempeä, M., Aikio, M., Tuukkanen, T., et al. (2012). A fast and flexible panoramic virtual reality system for behavioural and electrophysiological experiments. Sci. Rep. 2:324. doi: 10.1038/srep00324

Tennant, S. A., Fischer, L., Garden, D. L. F., Gerlei, K. Z., Martinez-Gonzalez, C., McClure, C., et al. (2018). Stellate cells in the medial entorhinal cortex are required for spatial learning. Cell Rep. 22, 1313–1324. doi: 10.1016/j.celrep.2018.01.005

Thurley, K., and Ayaz, A. (2017). Virtual reality systems for rodents. Curr. Zool. 63, 109–119. doi: 10.1093/cz/zow070

Thurley, K., and Schild, U. (2018). Time and distance estimation in children using an egocentric navigation task. Sci. Rep. 8:18001. doi: 10.1038/s41598-018-36234-1

Trivedi, C. A., and Bollmann, J. H. (2013). Visually driven chaining of elementary swim patterns into a goal-directed motor sequence: a virtual reality study of zebrafish prey capture. Front. Neural Circuits 7:86. doi: 10.3389/fncir.2013.00086

Vagvolgyi, B. P., Jayakumar, R. P., Madhav, M. S., Knierim, J. J., and Cowan, N. J. (2022). Wide-angle, monocular head tracking using passive markers. J. Neurosci. Methods 368:109453. doi: 10.1016/j.jneumeth.2021.109453

Valero, M., and English, D. F. (2019). Head-mounted approaches for targeting single-cells in freely moving animals. J. Neurosci. Methods 326:108397. doi: 10.1016/j.jneumeth.2019.108397

Venkatraman, S., Jin, X., Costa, R. M., and Carmena, J. M. (2010). Investigating neural correlates of behavior in freely behaving rodents using inertial sensors. J. Neurophysiol. 104, 569–575. doi: 10.1152/jn.00121.2010

Vi, C. T., Ablart, D., Arthur, D., and Obrist, M. (2017). "Gustatory interface: the challenges of 'how' to stimulate the sense of taste," in Proceedings of the 2nd ACM SIGCHI International Workshop on Multisensory Approaches to Human-Food Interaction, MHFI 2017 (New York, NY: Association for Computing Machinery), 29–33. doi: 10.1145/3141788.3141794

Wang, D., Guo, Y., Liu, S., Zhang, Y., Xu, W., and Xiao, J. (2019). Haptic display for virtual reality: progress and challenges. Virt. Real. Intell. Hardw. 1, 136–162. doi: 10.3724/SP.J.2096-5796.2019.0008

Washburn, D. A., and Astur, R. S. (2003). Exploration of virtual mazes by rhesus monkeys (Macaca mulatta). Anim. Cogn. 6, 161–168. doi: 10.1007/s10071-003-0173-z

Yu, H., Senarathna, J., Tyler, B. M., Thakor, N. V., and Pathak, A. P. (2015). Miniaturized optical neuroimaging in unrestrained animals. NeuroImage 113, 397–406. doi: 10.1016/j.neuroimage.2015.02.070

Keywords: virtual reality, naturalistic behavior, naturalistic neuroscience, ecological validity, animal behavior, behavioral neuroscience

Citation: Thurley K (2022) Naturalistic neuroscience and virtual reality. Front. Syst. Neurosci. 16:896251. doi: 10.3389/fnsys.2022.896251

Received: 14 March 2022; Accepted: 31 October 2022;
Published: 17 November 2022.

Edited by:

Susanne Hoffmann, Max Planck Institute for Ornithology, Germany

Reviewed by:

Ronen Segev, Ben-Gurion University of the Negev, Israel
Manu Madhav, University of British Columbia, Canada

Copyright © 2022 Thurley. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Kay Thurley, thurley@bio.lmu.de

Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.