- 1Department of Psychology, Uppsala University, Uppsala, Sweden
- 2Goodbye Kansas Studios, Uppsala, Sweden
There is a need for large-scale remote data collection in a controlled environment, and the in-home availability of virtual reality (VR) and the commercial availability of eye tracking for VR present unique and exciting opportunities for researchers. We propose and provide a proof-of-concept assessment of a robust system for large-scale in-home testing using consumer products that combines psychophysiological measures and VR, here referred to as a Virtual Lab. For the first time, this method is validated by correlating two autonomic responses, skin conductance response (SCR) and pupil dilation, to a spider, a beetle, and a ball presented in commercially available VR. Participants showed greater SCR and pupillary responses to the spider, and the effect depended on the proximity of the stimulus to the participant, with a stronger response when the spider was close to the virtual self. We replicated these effects across two experiments and in separate physical room contexts to mimic variability in home environments. Together, these findings demonstrate the utility of pupil dilation as a marker of autonomic arousal, show the feasibility of assessing it with commercially available VR hardware, and support a robust Virtual Lab tool for massive remote testing.
Introduction
Virtual Reality (VR) is defined as “an advanced form of human–computer interface that allows the user to interact with and become immersed in a computer-generated environment in a naturalistic fashion” (Schultheis and Rizzo, 2001). Unlike lab scenarios, video stimuli, or even augmented reality, VR is unique in that it sits at the furthest end of the reality continuum (Milgram and Kishino, 1994), replacing real-world environments with virtual contexts. This allows for levels of stimulus control that surpass lab testing, including absolute control of colors, textures, and luminance (Riva et al., 2016). The addition of integrated eye tracking, which is currently available to control interfaces and guide avatars in games (e.g., FOVE Eye Tracking VR Headset), opens up the possibility of measuring psychophysiological responses remotely on a large scale. In their extensive review of the VR literature, Lindner et al. (2017) concluded that eye tracking is becoming an important new technology in commercially available VR. The widespread use of VR by the public, in research, and in therapy is creating a need for more high-quality empirical studies examining VR and its capability for naturalistic “Big Data.”
In this paper, we propose and provide a proof-of-concept assessment of a robust system for large-scale in-home testing using consumer products that combine psychophysiological measures and VR, here referred to as a Virtual Lab. Specifically, our first aim is to simultaneously test and correlate two autonomic measures: skin conductance response (SCR), a well-established autonomic measure that has been reliably used in previous VR studies, and pupil dilation, a measure that has been shown to reliably index autonomic activity but has yet to be tested and validated in VR. Our second aim is to demonstrate that these measures can be reliably recorded independent of physical location, demonstrating possibilities for remote testing. For a Virtual Lab to be feasible in scientific research, it is important to establish that: (a) there is a demand for remote data collection on a large scale, (b) there is a wide availability of VR equipment in homes, and (c) there is a way to measure autonomic responses in a reliable and robust manner through the VR device. Each of these is discussed briefly in the following sections.
Remote Data Collection
Researchers across scientific fields have, to a large degree, relied on lab testing, which offers good experimental control but makes it difficult to reach remote demographics and to collect large samples. There is a demand for remote testing (eye tracking on tablet devices, Holland and Komogortsev, 2012; mobile eye tracking, Bulling and Gellersen, 2010; mobile phone testing, Tomlinson et al., 2009), but these efforts have been unable to fully bring a controlled testing environment to subjects remotely. As part of an effort to collect more data easily, many researchers have adopted massive home testing in the form of online testing and crowdsourcing systems, such as survey websites and Amazon MTurk (Buhrmester et al., 2011), which take advantage of online infrastructures to reach as many users as possible (the average laboratory using MTurk samples from a population of approximately 7,300 workers; Stewart et al., 2015) at very low cost. Remote online platforms like MTurk, however, have clear weaknesses when it comes to the generalizability and reliability of responses (Goodman et al., 2013; Landers and Behrend, 2015) and lack experimental control and the richness of physiological measures. It has been proposed that VR offers the potential for self-help applications and remote therapy for patients who suffer from anxiety disorders; this collection of naturalistic data may further inform learning theory and behavioral therapy (Lindner et al., 2017).
In-Home Availability and Viability
The immersion and accessibility of VR have already been applied in many experiments and labs around the world. VR has been used to experimentally examine autonomic responses to social, proximal, and conditioned threat (Rosén et al., 2017), and a recent review of 27 meta-analyses and systematic reviews (Riva et al., 2016) demonstrates that VR has been used in a large array of clinical settings (including anxiety disorders, Gorini and Riva, 2008; stress-related disorders, Botella et al., 2015; phobias, Parsons and Rizzo, 2008; panic disorders, Opriş et al., 2012; addiction, Hone-Blanchet et al., 2014; body-image disorders, Ferrer-García and Gutiérrez-Maldonado, 2012; autistic spectrum disorder, Aresti-Bartolome and Garcia-Zapirain, 2014). To date, the dependent variables assessed in these studies have not been derived from the VR device itself but have been measured outside of VR by external devices (e.g., Slater et al., 2010), clinical assessment (e.g., Carlin et al., 1997), or self-report surveys (e.g., Botella et al., 1998), requiring participants to complete these studies in a lab with trained experimenters. During the last decade, however, VR has expanded outside scientific environments and quickly grown into a mainstream tool in the average consumer's home (according to CCS Insight, a projected 24 million devices by 2018; Lamkin, 2016); further, it is a viable consumer-ready device that allows users to become fully immersed in a realistic, interactive 3D world (e.g., Oculus Rift, HTC Vive, Sony PlayStation VR; Aczél, 2016).
Reliable Autonomic Measures
Psychophysiological measures can be used to assess the perceptual, cognitive, and emotional processes elicited by physical stimuli. One standard tool is SCR, which measures autonomic activity in the nervous system and can be used to quantify response levels to stimuli. The reliability of SCR has been demonstrated in a plethora of previous studies (see, for example, Teghtsoonian and Frost, 1982; Lang et al., 1993; Löw et al., 2008). This method is, however, not available in people's homes and would in practice be very difficult to measure remotely. Major hardware developers are moving toward an integration of eye-tracking technology in commercially available VR headsets (e.g., FOVE Eye Tracking VR Headset, Tobii's VR4 for Vive Kit, SMI's HMD Eye Tracking for VR). For the last 10 years, eye tracking has been used increasingly to assess psychological mechanisms with notable accuracy (see Duchowski, 2007). Eye tracking (for review, see Trillenberg et al., 2004; Karatekin, 2007; Luna et al., 2008; Gredebäck et al., 2009) allows the measurement of pupil dilation, which is thought to index autonomic activity (Laeng et al., 2012). Pupil dilation and SCR are held to be comparable responses of the peripheral nervous system that are controlled by similar areas in the brain stem, such as the locus coeruleus (Laeng et al., 2012), and they have previously been correlated with one another (Bradley et al., 2008; Wieser et al., 2009), although never in a VR environment.
Summary
While the demand for remote data collection and the in-home availability and viability of VR are quite well established, the demonstration of reliable autonomic measures in VR is still lacking. Online platforms are capable of collecting data inexpensively and rapidly, but the level of experimental control and the richness of the data collected through additional physiological measures can be drastically improved through VR. Eye-tracking measures are now commercially available in mobile VR headsets and provide remote access to physiological measures. To our knowledge, there have been no published studies assessing the usability of eye tracking integrated within a VR environment, and no studies simultaneously measuring the pupillary response and SCR to VR stimuli. In his critical review of pupil methodology and measures, Aslin (2012) concluded with the statement “A final step in this process may, one day, be the use of virtual reality displays […] which would enable the experimenter to control the visual world” (p. 138). In the current study, we aim to establish pupil dilation as a reliable and robust autonomic measure in VR.
Current Aims
As stated earlier, we believe that the time is right to integrate VR with home testing and to use pupil dilation to obtain reliable psychophysiological measures from large groups of participants, creating the foundation for a Virtual Lab. To assess the validity of these claims, three steps need to be taken: (1) we need to establish that we can effectively and reliably measure both SCR and pupil dilation in a single VR-based paradigm; (2) these two dependent measures need to be correlated; and (3) we need to show that the responses measured by pupil dilation and skin conductance are independent of the physical locale but responsive to the experimental manipulations in the VR environment (i.e., reliable remote testing should be independent of the physical environment in order to control for contextual confounds). The aims of the current study address each of these steps to empirically test the utility of a Virtual Lab.
To complete these steps, we measured autonomic responses to spiders, compared to balls and beetles, using pupil dilation and SCR in a VR setting. Unlike previous studies demonstrating a fear response to spiders via stimuli presented on a computer monitor (Miltner et al., 2005; Rinck et al., 2005; Rinck and Becker, 2006; Gerdes et al., 2008), a Virtual Lab allows the physical distance of the spider to be manipulated in a standardized way (near or far from the participant), offering a high degree of control over stimulus presentation. We have previously shown a robust increase in SCR to proximal objects displayed in immersive virtual reality (Rosén et al., 2017). Additionally, half of the participants were immersed in a photo-realistic virtual environment of the exact same real-world laboratory room in which they were physically seated; the other half were physically seated in an entirely different room. This allowed us to test whether the physical locale of the experiment influenced autonomic responses, an assumption critical for remote testing to be feasible. We then aimed to replicate the results of the first experiment with a new group of participants and with additional beetle stimuli in order to assess robustness and reliability.
Methods
Experiment 1
Participants
Forty adults (22 females; mean age 24.63 years) participated in Experiment 1. Four participants were excluded from the pupil task in the final analysis due to insufficient data across trials (more than 30% data loss), three participants were excluded from the SCR task because more than 10% of the trials were lost, and an additional six participants did not have SCR data recorded. The final sample was 36 adults (19 females; mean age 25.04) for the pupil task and 31 adults (16 females; mean age 26.40) for the SCR task. All participants provided informed consent and received a 10€ gift voucher. This research was supported by a grant from the Swedish Research Council. The study was conducted in accordance with the standards specified in the 1964 Declaration of Helsinki and approved by the local ethics committee.
Materials
We used an HTC Vive VR headset (www.vive.com), which has a display resolution of 2,160 × 1,200 (1,080 × 1,200 per eye), a 90-Hz refresh rate, and a 110-degree field of view. The VR headset had built-in Tobii eye tracking (www.tobii.com/tech/products/vr/), with 100-Hz gaze sampling and absolute pupil measurement. The experiment was run in the Unity game engine on a lab PC capable of rendering real-time 3D graphics.
Stimuli
The stimuli and environment were digitally created by Goodbye Kansas Studios (http://goodbyekansasstudios.com/) using Maya (www.autodesk.com/products/maya/overview) and Mudbox (www.autodesk.com/products/mudbox/overview). The environment was a lab room that was set up identically in both the virtual and real environments (see Figure 1). The virtual environment was created using photogrammetry, a method that combines 3D scanning and photography of real physical environments to give the graphics a photorealistic appearance. The properties of the virtual environment were set in Unity to reflect real-world proportions measured in meters.
Figure 1. Side-by-side comparison of the real and virtual environment (a) and examples of the stimuli (b), spider and ball (Experiment 1) and beetle stimuli (Experiment 2). Note that the images do not accurately reflect the color, luminance, and dynamic animations in the actual experiment.
The stimuli were digitally created blue and brown spiders and balls (see Figure 1), approximately 30 × 30 cm. Stimuli appeared and moved toward the participant across three phases. At the beginning of the first appearance (distance of 60 cm), the stimuli subtended approximately 28 degrees of visual angle; during the far approach (distance of 30 cm), approximately 53 degrees; and during the near approach (distance of less than 10 cm), approximately 113 degrees. To avoid differences in pupil dilation due to color, the exact same texture from the spider was applied to the balls. Both the spiders and the balls had an animation for their movement (walking for spiders, forward momentum for the ball), and the spider had an additional idle animation for added realism.
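These values are consistent with the standard formula for the visual angle subtended by a stimulus of extent s viewed at distance d (our reconstruction of the reported figures, not an equation stated in the original design):

\[
\theta = 2\arctan\!\left(\frac{s/2}{d}\right), \qquad s = 30\,\mathrm{cm}:\quad \theta_{60\,\mathrm{cm}} \approx 28^{\circ},\;\; \theta_{30\,\mathrm{cm}} \approx 53^{\circ},\;\; \theta_{10\,\mathrm{cm}} \approx 113^{\circ}.
\]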
Procedure
Participants arrived at one of the two experiment rooms: the same room as the virtual environment or a different room. In both conditions, participants were seated in front of a table and the experimenter sat behind the participant, out of view. Participants were fitted with the VR headset and SCR Ag/AgCl electrodes. Prior to testing, participants were first “placed” in the VR room environment for approximately 2 min in order to adjust to the VR experience and to simultaneously calibrate the eye tracker using an automatic one-point calibration procedure to ensure proper recording of pupil data. In the virtual environment, there was a black box with a front-facing flap on the table in front of the participant (approximately 120 cm away) with a blue “X” in the center. The experimenter instructed the participant to remain seated and to keep looking at the blue “X” throughout the experiment. During the experiment, the flap on the box would open, revealing either a ball or a spider inside (hereafter “First Appearance”). After 2.5 s, the stimulus would move forward and stop approximately 30 cm from the participant (“Far Approach”). After standing idle for 1 s, the stimulus continued moving toward the participant (“Near Approach”) until it “fell” off the table and appeared to land on the participant's lap, while the lid of the box simultaneously closed. After a 4-s pause, the box would re-open and a new trial would begin. A total of 16 trials were presented (8 spider trials, 8 ball trials) with the stimuli randomized and counterbalanced across participants. See Figure 2 for the timing of each trial. The total time of the experiment was 12 min.
Figure 2. Time series illustrating the three distance periods and each corresponding baseline for pupil analyses. Time intervals were the same across all trials and conditions.
Recording and Analysis
Pupil Dilation
The dependent variable for each trial was the difference between the mean pupil size during each of the three distance test periods (First Appearance, Far Approach, and Near Approach) and the mean pupil size during the corresponding baseline period. Each baseline period began 1,000 ms prior to the onset of the test period (see Figure 2). All trials were visually inspected for a normal pupillary light reflex response. The analysis was performed in the open-source analysis program TimeStudio version 3.03 running in Matlab version 7.12 (www.timestudioproject.com; Nyström et al., 2016). We used a moving-average filter and gap interpolation across all data. The actual analysis, settings, and source code used for analyzing the data can be downloaded with uwid ts-674-f5e from the TimeStudio interface. We conducted a repeated measures ANOVA to examine pupil dilation to the different stimuli, at varying distances, and in the same or a different physical room.
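As an illustration of this scoring, a minimal Python sketch of the baseline correction is given below. The 100-Hz sampling rate follows the headset's integrated eye tracker, but the function name, interpolation limit, and smoothing window are illustrative placeholders rather than the TimeStudio (uwid ts-674-f5e) settings.

```python
import numpy as np
import pandas as pd

FS = 100  # sampling rate of the integrated eye tracker (Hz)

def baseline_corrected_dilation(pupil_mm, test_onset_s, test_dur_s, baseline_dur_s=1.0):
    """Mean pupil size in a test period minus mean pupil size in the 1-s
    baseline preceding its onset. Filter and interpolation settings are
    illustrative, not the published TimeStudio parameters."""
    trace = pd.Series(pupil_mm, dtype=float)
    # Interpolate short gaps (blinks / tracking loss), here up to 200 ms
    trace = trace.interpolate(limit=int(0.2 * FS), limit_direction="both")
    # Moving-average filter (50-ms window) to smooth sample noise
    trace = trace.rolling(window=5, center=True, min_periods=1).mean()
    onset = int(test_onset_s * FS)
    test = trace.iloc[onset:onset + int(test_dur_s * FS)]
    baseline = trace.iloc[onset - int(baseline_dur_s * FS):onset]
    return test.mean() - baseline.mean()
```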
Due to concerns about the maximum mydriasis and miosis of the pupil while wearing a headset, and because this is one of the first studies to examine the pupil in VR, we checked that the baseline values of the pupil were within the normal range (the human pupil ranges from 7.5 to 8 mm at full mydriasis to 1.5–2 mm at full miosis; Alexandridis et al., 1985). Across all baseline periods, the average baseline size was 3.73 mm (range 3.55–3.87 mm) for all subjects, values comparable to other studies that have reported baseline pupil diameter (Privitera et al., 2010). We additionally fit a linear mixed model, specified in Wilkinson notation, to test whether baselines increased or decreased over time. There was a significant decrease in baseline values over trials (main effect of trial, estimate: −0.014287 mm/trial, p = 0.004), indicating that the pupil returned toward baseline after each trial and that this baseline decreased slightly across the session. We interpret this as participants being more aroused at the beginning of the session and then habituating to the situation over time. Importantly, we did not find any difference in baseline values between conditions (p = 0.940).
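A minimal sketch of such a model, assuming a random intercept per participant and using synthetic stand-in data, is shown below; the column names and the statsmodels specification are our assumptions, not necessarily the authors' exact model.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in data (36 participants x 16 trials) with a slight
# downward drift in baseline pupil size; values are illustrative only.
rng = np.random.default_rng(0)
trial = np.tile(np.arange(1, 17), 36)
subject = np.repeat(np.arange(36), 16)
baseline_mm = 3.73 - 0.014 * trial + rng.normal(0, 0.1, trial.size)
df = pd.DataFrame({"subject": subject, "trial": trial, "baseline_mm": baseline_mm})

# Wilkinson-style formula with a random intercept per participant
result = smf.mixedlm("baseline_mm ~ trial", data=df, groups=df["subject"]).fit()
print(result.params["trial"])  # fixed-effect slope: change in baseline (mm) per trial
```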
Skin Conductance Responses
Skin conductance was recorded using the MP-150 BIOPAC system (BIOPAC Systems, Goleta, CA). Pre-gelled Ag/AgCl electrodes were placed on the left hand's palmar surface. The SCR signal passed through a high-pass hardware filter of 0.05 Hz and was analyzed with the Ledalab software package (Benedek and Kaernbach, 2010) implemented in Matlab (Mathworks Inc., Natick, MA). SCR was scored using the maximum phasic driver amplitude (Max.SCR) 1–4 s after each test period (see Figure 2) and then transformed (square root) and range corrected so that SCRs ranged from 0 to 1 (Lykken, 1972).
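The square-root transform and range correction can be sketched as follows. This is a minimal reading of the scoring described above (dividing each participant's transformed responses by that participant's largest one), not the Ledalab implementation; the function name and example values are placeholders.

```python
import numpy as np

def score_scr(max_phasic_amplitudes):
    """Square-root transform followed by within-participant range correction,
    yielding scores between 0 and 1 (after Lykken, 1972)."""
    amps = np.sqrt(np.asarray(max_phasic_amplitudes, dtype=float))
    return amps / amps.max() if amps.max() > 0 else amps

# Example: one participant's Max.SCR values (in microsiemens) across trials
print(score_scr([0.02, 0.35, 0.11, 0.64]))
```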
Data Analyses
We had two within-subjects independent variables of interest (Condition and Distance) and one between-subjects independent variable (Room). The outcome measures of interest were the mean pupil changes and SCRs to the stimulus presentations. Separate repeated measures ANOVAs with Greenhouse-Geisser correction were used for the pupil and SCR measures to study the interaction between Condition, Distance, and Room, and paired t-tests were used for the planned comparisons. Means and standard deviations are reported in Table 1. Our primary interest was the autonomic response in the near approach, based on previous findings indicating an increased autonomic response to proximal stimuli compared to distant ones (Rosén et al., 2017); therefore, only those results are reported here. The results for the appearance and far approach can be found in the Supplementary Materials.
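As an illustration of this analysis pipeline, the sketch below runs a one-way repeated measures ANOVA with Greenhouse-Geisser correction and a paired near-approach contrast on synthetic stand-in data. The pingouin-based code and column names are our assumptions; the paper's full Condition × Distance × Room design would require a larger mixed design than is shown here.

```python
import numpy as np
import pandas as pd
import pingouin as pg

# Illustrative long-format data: one row per participant x condition x distance,
# with a baseline-corrected pupil change (or range-corrected SCR) as the outcome.
rng = np.random.default_rng(1)
rows = [(s, c, d, rng.normal(0.15 if (c == "spider" and d == "near") else 0.05, 0.05))
        for s in range(36) for c in ("spider", "ball")
        for d in ("appearance", "far", "near")]
df = pd.DataFrame(rows, columns=["subject", "condition", "distance", "value"])

# One-way repeated measures ANOVA over Distance with Greenhouse-Geisser correction
per_distance = df.groupby(["subject", "distance"], as_index=False)["value"].mean()
aov = pg.rm_anova(data=per_distance, dv="value", within="distance",
                  subject="subject", correction=True, detailed=True)

# Planned paired contrast at the near approach: spider vs. ball
near = df[df["distance"] == "near"].pivot(index="subject",
                                          columns="condition", values="value")
contrast = pg.ttest(near["spider"], near["ball"], paired=True)
print(aov, contrast, sep="\n")
```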
Table 1. Mean score (SD) at each distance (appearance, far approach, near approach) and stimuli condition (spider, ball) for the pupil and SCR measures in Experiment 1.
Experiment 2
We aimed to replicate and extend the findings from Experiment 1 in a separate experiment with a new sample and using new additional stimuli in order to assess robustness and reliability.
Participants
Eighteen adults participated (11 females; mean age 26.88 years). Three participants were excluded from the pupil task in the final analysis due to insufficient data across trials (more than 30% data loss), and one participant was excluded from the SCR task due to incomplete data (more than 10% of the trials were missing). The final sample was 14 adults (9 females; mean age 25.78) for the pupil task and 16 adults (10 females; mean age 25) for the SCR task. All participants provided informed consent and received a 10€ gift voucher for participating.
Stimuli
The stimuli and environment were identical to Experiment 1, with the addition of a beetle stimulus. The beetle was a digitally created blue and brown beetle (see Figure 1), approximately 30 × 30 cm. The distances and visual degrees of the stimuli were the same as those used in Experiment 1. The spiders, balls, and beetles had an animation for their movement (walking for spiders and beetles and forward momentum for ball), and an additional idle animation for the spider and beetle. The size of the spider and beetle and the walking and idle animations were identical.
Procedure
The general procedure was identical to Experiment 1, except that trials including a beetle were added and there was no room manipulation. A total of 18 trials were presented (6 spider trials, 6 ball trials, and 6 beetle trials) with the stimuli randomized and counterbalanced across participants. See Figure 2 for the timing of each trial. In total, the experiment took approximately 15 min.
Recording and Analysis
Recording and analysis measures and procedures were identical to Experiment 1. Means and standard deviations are reported in Table 2.
Table 2. Mean score (SD) at each distance (appearance, far approach, near approach) and stimuli condition (spider, beetle, ball) for the pupil and SCR measures in Experiment 2.
Results
Results From Experiment 1
Pupil Dilation
Results from the repeated measures ANOVA are shown in Table 3. There was a significant main effect of Condition: mean pupil dilation differed between the stimuli, with significantly greater pupil dilation to the spider than to the ball, t(35) = 2.87, p = 0.007, d = 0.97. There was also a significant main effect of Distance, with overall greater pupil dilation in the Near Approach than in the Appearance, t(35) = −10.48, p < 0.001, d = 3.54, and the Far Approach, t(34) = −6.83, p < 0.001, d = 2.34. There was a significant interaction between Condition and Distance, meaning that the pattern of pupil dilation in each condition differed at each distance (see Figure 3a). When examining the Near Approach, there was significantly greater pupil dilation to the spider than to the ball, t(34) = 5.91, p < 0.001, d = 2.03. These changes in pupil size are comparable to significant changes observed in similar studies with both infants and adults (Gredebäck and Melinder, 2011; Hoehl et al., 2017; Hellmer et al., 2018).
Table 3. Results from the repeated measures ANOVAs with Greenhouse-Geisser correction for the pupil and SCR measures in Experiment 1 and Experiment 2.
Figure 3. Mean difference scores for the near approach comparing spider vs. ball for the pupil dilation and SCR measures in Experiment 1 (a), spider vs. ball vs. beetle in Experiment 2 (b), and a scatterplot of the aggregated data from Experiments 1 and 2 showing a significant correlation between the pupil and SCR difference scores (spider minus ball) during the near approach (c).
The main effect of Room was not significant (p = 0.704) and neither were the between-subjects interactions of Room with Condition and Distance (ps > 0.05), meaning that the pattern of pupil dilation across conditions and distances did not depend on the physical room in which participants were seated during the experiment.
SCR
Results from the repeated measures ANOVA are shown in Table 3. Results revealed a main effect of Condition, with a significantly greater autonomic response to the spider than to the ball, t(30) = 5.24, p < 0.001, d = 1.91. There was also a significant main effect of Distance, with a greater autonomic response in the Near Approach than in the Appearance, t(30) = −2.58, p = 0.015, d = 0.94. Since our primary interest was the Near Approach and the interaction between Condition and Distance was marginally significant, we examined the Near Approach, which revealed a significantly greater SCR to the spider than to the ball, t(30) = 4.57, p < 0.001, d = 1.67.
The main effect of Room was not significant (p = 0.187) and neither were the between-subjects interactions of Room with Condition and Distance (p > 0.05 for each), suggesting that the pattern of SCR across conditions and distances did not depend on the room in which participants were seated.
Results From Experiment 2
We conducted separate repeated measures ANOVAs with Greenhouse-Geisser correction for the pupil and SCR measures and examined the interaction between Condition and Distance. We used paired t-tests to test the contrasts. Our primary interest was again the autonomic response in the near approach. The results for the appearance and far approach can be found in the Supplementary Materials.
Pupil Dilation
Results from the repeated measures ANOVA are shown in Table 3. There was a significant main effect of Condition: mean pupil dilation differed between the stimuli, with greater pupil dilation to the spider than to the ball (p = 0.004) and to the spider than to the beetle (p = 0.004), but no difference between the beetle and the ball (p = 0.139). There was also a significant main effect of Distance. When examining the Near Approach, there was significantly greater pupil dilation to the spider than to the ball (p < 0.001), to the spider than to the beetle (p = 0.004), and to the beetle than to the ball (p = 0.005; see Figure 3b). There was a significant interaction between Condition and Distance, meaning that the pattern of pupil dilation in each condition differed at each distance.
SCR
Results from the repeated measures within-subjects ANOVA are shown in Table 3. Results revealed a significant main effect of Condition, with a greater autonomic response to the spider than to the ball (p = 0.001) and to the beetle than to the ball (p < 0.001), but no difference between the spider and the beetle (p = 0.081). There was also a significant main effect of Distance. When examining the Near Approach, there was significantly greater SCR to the spider than to the ball (p < 0.001) and to the beetle than to the ball (p < 0.001), but not to the spider compared to the beetle (p = 0.110). There was no significant interaction between Condition and Distance.
Correlations
We combined the spider and ball data from Experiment 1 and Experiment 2 for both the pupil and SCR responses and calculated difference scores between the spider and the ball (i.e., spider response minus ball response). The correlation cannot speak to the nature of the autonomic response; we are only interested in the relationship between the responses from the two measures. A correlation analysis revealed that during the Near Approach, the difference scores from the pupil were significantly correlated with the difference scores from the SCR, r(40) = 0.32, p = 0.039 (see Figure 3c). All other correlations were non-significant (ps > 0.05).
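A minimal sketch of this per-participant difference-score correlation is given below, using hypothetical values rather than the study data; the variable names are ours, and the sample size of 42 simply matches the reported degrees of freedom, r(40).

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical per-participant difference scores (spider minus ball) at the
# near approach, one value per participant for each measure; illustrative only.
rng = np.random.default_rng(2)
pupil_diff = rng.normal(0.10, 0.05, 42)
scr_diff = 0.3 * pupil_diff + rng.normal(0.05, 0.03, 42)

r, p = pearsonr(pupil_diff, scr_diff)  # Pearson correlation across participants
print(f"r({len(pupil_diff) - 2}) = {r:.2f}, p = {p:.3f}")
```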
General Discussion
We effectively and reliably showed physiological differences in autonomic responses in both the SCR and pupil dilation measures, with a greater autonomic response to the spider than to the ball or beetle, regardless of whether participants were in the same virtual and physical environment or in an entirely separate physical environment. The greater autonomic responses to the spider measured by SCR and pupil dilation were not only in the same direction across conditions but also positively correlated. These differences occurred only when the stimulus was near the participant in virtual space, a level of immersion that is unique to VR when compared to a monitor screen (Slater et al., 1994; Jones et al., 2008). It is important to note that we did not test distance independent of prior distance; it is therefore possible that the current results were blunted by habituation or by effects of prior distances, and controlling for this would likely strengthen the current findings. These findings support and extend previous work that assessed self-reported fear responses to spiders in VR (Peperkorn et al., 2016). They are also consistent with previous visual tasks with both children and adults showing an attentional bias toward spiders (e.g., Öhman and Mineka, 2001; LoBue, 2010; Devue et al., 2011), presumably due to spiders' recurrent and widespread threat throughout human evolution (Öhman, 1993). This study represents the first time that SCR and pupil dilation have been simultaneously recorded and correlated as autonomic responses to spiders, and the first time such a relationship has been measured in VR.
These findings demonstrate that both SCR and pupil dilation can be effectively and reliably measured in a VR-based paradigm and appear to tap similar autonomic responses. Furthermore, the autonomic responses were a result of the experimental manipulations and not the physical room context, making reliable remote testing possible. Together, these findings support the notion that a Virtual Lab with a high degree of experimental control can be implemented using commercially available tools in people's homes.
The current Virtual Lab overcomes many of the concerns regarding pupil dilation measures (e.g., Aslin, 2012). One challenge has been that pupil size is also determined by low-level stimulus factors such as luminance. We addressed these concerns, first, by using VR to control luminance within a virtually constructed and controlled environment, and second, by correlating the pupil response with another well-established autonomic response. SCR provides a reliable physiological measure of an autonomic response and has been well demonstrated in previous studies as a response to proximal threat, snakes, and spiders (Teghtsoonian and Frost, 1982; Löw et al., 2008). However, SCR is not a measure that can easily be recorded in an individual's home without specialized and expensive equipment, making pupil and eye-tracking measurements a much more convenient tool for tapping into physiological responses in a Virtual Lab. One previous experiment showed pupil dilation in a VR decision-making task (Skulmowski et al., 2014); however, that setup did not include head tracking and luminance was not controlled across conditions. The current study is the first to reliably use pupil dilation measures fully integrated with a VR headset and to establish the utility of VR for physiological data collection. Together, we believe this demonstrates a proof of concept for a remote Virtual Lab with integrated eye tracking that could change the accessibility of experimental data for researchers.
Virtual Lab: Practical Considerations
We are just beginning to understand the potential of VR; however, an expanding tech market, previous research, and the results of the present study all point toward a Virtual Lab as a next step in the future of research. There are, however, some practical issues and caveats to consider when establishing a Virtual Lab.
A current concern for many research fields is a bias in study sample demographics. For example, it has been argued that psychology research tends to make conclusions about human nature based on samples taken solely from Western undergraduate students (Henrich et al., 2010). A risk with a Virtual Lab is that such research would presumably be tapping a very similar demographic. Indeed, consumer reports suggest that the majority of consumers who purchase a VR system are male (85.7%; Stanton, 2016). However, unlike studies conducted at universities or in research labs, a Virtual Lab is intrinsically connected to an evolving and expanding worldwide tech market. As with other technology markets, such as computers and the internet, user demographics expand as the device becomes more mainstream. Compared to the past, online data collection samples today have been shown to be relatively diverse with respect to gender, socioeconomic status, geographic region, and age (Gosling et al., 2004). A possible limitation of the current study was the relatively low number of males compared to females in the sample. While the distribution of male and female participants was not equal, previous studies have not suggested that gender influences skin conductance or pupil dilation in response to threat (e.g., Partala and Surakka, 2003; Bianchin and Angrilli, 2012; Rosén et al., 2017).
Furthermore, the ease of remote testing with a Virtual Lab serves as a strength for reaching wider demographic samples. With a Virtual Lab, experimenters can remotely test participants across the world while remaining in their own lab, or alternatively bring the VR headset directly to homes or public spaces, all while still maintaining a controlled virtual lab environment. Visual and auditory distractions and differences in testing rooms and luminance would no longer be an issue when testing via VR. Bringing the virtual lab to participants would allow for unique and remote samples that would otherwise be difficult to test due to an inability to physically enter a lab.
While other papers have already engaged in a discussion of ethical issues in the implementation of VR in various contexts (e.g., see Whalley, 1995; Brey, 1999), we feel it is important to highlight these concerns here. One concern is the level of immersion and presence that participants may experience, and the possible unintended effects on participants engaging in a virtual environment. VR is considered an “embodied technology” with the ability to modify the feeling of presence (Riva et al., 2016). It can alter our experience of the body and space by altering the very cognitive factors that regulate our experience of body and space (for an in-depth analysis, see Riva et al., 2015). Because VR is an embodied technology, it also opens questions regarding the morality of virtual behavior; for example, whether behaviors that are not socially permitted in real societies should be acceptable in a virtual environment. Finally, there are ethical considerations as to the kinds of data that are collected and stored on a massive scale, and how these data can be secured. Many of these issues are not unique to VR, and there are ongoing discussions in many areas of technological development such as game design (see Richard Bartle's discussion of “Human Rights and Virtual Worlds”; Bartle, 2004), but they are nonetheless important and increasingly relevant issues for researchers.
Virtual Lab: Future Directions
It is clear that VR headsets are becoming a common addition to households across the world. People play games and engage in virtual experiences regularly in their homes, allowing for the implementation of psychological tests in which the information can be fed back to experimenters and developers online. The possibilities of VR are evident from its quick growth into a mainstream tool in the average consumer's home as well as in businesses and institutions, including the military, aerospace, construction, and automobile industries, entertainment, and popular news and media (Aczél, 2016). The information and data collected in a Virtual Lab experiment can be utilized by researchers and developers for workplace training, software development, marketing and advertising, optimizing game experiences based on the arousal and interests of the player, and in medicine and treatment. The findings from the current paradigm could be further expanded by examining the physiological response to spiders as a function of anxiety level. In the future, it would be interesting to use the Virtual Lab to test differences between high- and low-anxiety individuals in their responses to spiders.
A Virtual Lab provides a useful and robust tool for measuring physiological responses of participants in a controlled virtual environment, where the stimuli can be presented protected from environmental variation. The current study supports a robust Virtual Lab tool for massive remote testing that combines the strengths of both online testing and lab experiments and is available through consumer devices that allow pupil dilation to be measured in VR. Research on the capabilities and potential of VR is still in its infancy, but both previous results and the present findings are positive and suggest that a Virtual Lab is a possible next step in the future of research in a wide variety of fields and industries.
Ethics Statement
This study was carried out in accordance with the recommendations of Uppsala University with written informed consent from all subjects. All subjects gave written informed consent in accordance with the Declaration of Helsinki. The protocol was approved by the Uppsala University Ethics Review Board.
Author Contributions
All authors contributed to the design of the study. NL did programming and created the stimuli. JJ, JR, and GK carried out data collection. JJ, JR, PN, and GK conducted data analyses. JJ and JR wrote the main manuscript text. All authors reviewed and contributed to the final manuscript.
Conflict of Interest Statement
NL was employed by the company Goodbye Kansas Studios.
The other authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
Acknowledgments
This research was supported by a grant from the Swedish Research Council (2014-1160) to FA.
Supplementary Material
The Supplementary Material for this article can be found online at: https://www.frontiersin.org/articles/10.3389/fnins.2018.00305/full#supplementary-material
References
Aresti-Bartolome, N., and Garcia-Zapirain, B. (2014). Technologies as support tools for persons with autistic spectrum disorder: a systematic review. Int. J. Environ. Res. Public Health 11, 7767–7802. doi: 10.3390/ijerph110807767
Aslin, R. N. (2012). Infant eyes: a window on cognitive development. Infancy 17, 126–140. doi: 10.1111/j.1532-7078.2011.00097.x
Benedek, M., and Kaernbach, C. (2010). Decomposition of skin conductance data by means of nonnegative deconvolution. Psychophysiology 47, 647–658. doi: 10.1111/j.1469-8986.2009.00972.x
Bianchin, M., and Angrilli, A. (2012). Gender differences in emotional responses: a psychophysiological study. Physiol. Behav. 105, 925–932. doi: 10.1016/j.physbeh.2011.10.031
Botella, C., Baños, R. M., Perpiñá, C., Villa, H., Alcañiz, M., and Rey, A. (1998). Virtual reality treatment of claustrophobia: a case report. Behav. Res. Ther. 36, 239–246. doi: 10.1016/S0005-7967(97)10006-7
Botella, C., Serrano, B., Baños, R. M., and Garcia-Palacios, A. (2015). Virtual reality exposure-based therapy for the treatment of post-traumatic stress disorder: a review of its efficacy, the adequacy of the treatment protocol, and its acceptability. Neuropsychiatr. Dis. Treat. 11, 2533–2545. doi: 10.2147/NDT.S89542
Bradley, M. M., Miccoli, L., Escrig, M. A., and Lang, P. J. (2008). The pupil as a measure of emotional arousal and autonomic activation. Psychophysiology 45, 602–607. doi: 10.1111/j.1469-8986.2008.00654.x
Brey, P. (1999). The ethics of representation and action in virtual reality. Ethics Inform. Technol. 1, 5–14. doi: 10.1023/A:1010069907461
Buhrmester, M., Kwang, T., and Gosling, S. D. (2011). Amazon's Mechanical Turk: a new source of inexpensive, yet high-quality, data? Perspect. Psychol. Sci. 6, 3–5. doi: 10.1177/1745691610393980
Bulling, A., and Gellersen, H. (2010). Toward mobile eye-based human-computer interaction. IEEE Pervas. Comput. 9, 8–12. doi: 10.1109/MPRV.2010.86
Carlin, A. S., Hoffman, H. G., and Weghorst, S. (1997). Virtual reality and tactile augmentation in the treatment of spider phobia: a case report. Behav. Res. Ther. 35, 153–158. doi: 10.1016/S0005-7967(96)00085-X
Devue, C., Belopolsky, A. V., and Theeuwes, J. (2011). The role of fear and expectancies in capture of covert attention by spiders. Emotion 11, 768. doi: 10.1037/a0023418
Duchowski, A. T. (2007). Eye Tracking Methodology. Theory and Practice. London: Springer-Verlag London.
Ferrer-García, M., and Gutiérrez-Maldonado, J. (2012). The use of virtual reality in the study, assessment, and treatment of body image in eating disorders and nonclinical samples: a review of the literature. Body Image 9, 1–11. doi: 10.1016/j.bodyim.2011.10.001
Gerdes, A. B., Alpers, G. W., and Pauli, P. (2008). When spiders appear suddenly: Spider-phobic patients are distracted by task-irrelevant spiders. Behav. Res. Ther. 46, 174–187. doi: 10.1016/j.brat.2007.10.010
Goodman, J. K., Cryder, C. E., and Cheema, A. (2013). Data collection in a flat world: the strengths and weaknesses of Mechanical Turk samples. J. Behav. Decis. Making 26, 213–224. doi: 10.1002/bdm.1753
Gorini, A., and Riva, G. (2008). Virtual reality in anxiety disorders: the past and the future. Expert Rev Neurother. 8:215. doi: 10.1586/14737175.8.2.215
Gosling, S. D., Vazire, S., Srivastava, S., and John, O. P. (2004). Should we trust web-based studies? A comparative analysis of six preconceptions about internet questionnaires. Am. Psychol. 59:93. doi: 10.1037/0003-066X.59.2.93
Gredebäck, G., Johnson, S., and von Hofsten, C. (2009). Eye tracking in infancy research. Dev. Neuropsychol. 35, 1–19. doi: 10.1080/87565640903325758
Gredebäck, G., and Melinder, A. (2011). Teleological reasoning in 4-month-old infants: pupil dilations and contextual constraints. PLoS ONE 6:e26487. doi: 10.1371/journal.pone.0026487
Hellmer, K., Söderlund, H., and Gredebäck, G. (2018). Cover image. Dev. Sci. 21:e12668. doi: 10.1111/desc.12668
Henrich, J., Heine, S. J., and Norenzayan, A. (2010). Most people are not WEIRD. Nature 466, 29–29. doi: 10.1038/466029a
Hoehl, S., Hellmer, K., Johansson, M., and Gredebäck, G. (2017). Itsy bitsy spider…: infants react with increased arousal to spiders and snakes. Front. Psychol. 8:1710. doi: 10.3389/fpsyg.2017.01710
Holland, C., and Komogortsev, O. (2012). “Eye tracking on unmodified common tablets: challenges and solutions,” in Proceedings of the Symposium on Eye Tracking Research and Applications (Santa Barbara, CA: ACM), 277–280.
Hone-Blanchet, A., Wensing, T., and Fecteau, S. (2014). The use of virtual reality in craving assessment and cue-exposure therapy in substance use disorders. Front. Hum. Neurosci. 8:844. doi: 10.3389/fnhum.2014.00844
Jones, J. A., Swan, J. E. II., Singh, G., Kolstad, E., and Ellis, S. R. (2008). “The effects of virtual reality, augmented reality, and motion parallax on egocentric depth perception,” in Proceedings of the 5th Symposium on Applied Perception in Graphics and Visualization (Los Angeles, CA), 9–14.
Karatekin, C. (2007). Eye tracking studies of normative and atypical development. Dev. Rev. 27, 283–348. doi: 10.1016/j.dr.2007.06.006
Laeng, B., Sirois, S., and Gredebäck, G. (2012). Pupillometry: a window to the preconscious? Perspect. Psychol. Sci. 7, 18–27. doi: 10.1177/1745691611427305
Lamkin, P. (2016, October 21). HTC Vive VR Headset Sales Revealed. Forbes.com. Available online at: https://www.forbes.com/sites/paullamkin/2016/10/21/htc-vive-vr-headset-sales-revealed/ (Accessed June 12, 2017).
Landers, R. N., and Behrend, T. S. (2015). An inconvenient truth: arbitrary distinctions between organizational, Mechanical Turk, and other convenience samples. Industr. Organ. Psychol. 8, 142–164. doi: 10.1017/iop.2015.13
Lang, P. J., Greenwald, M. K., Bradley, M. M., and Hamm, A. O. (1993). Looking at pictures – affective, facial, visceral, and behavioral reactions. Psychophysiology 30, 261–273. doi: 10.1111/j.1469-8986.1993.tb03352.x
Lindner, P., Miloff, A., Hamilton, W., Reuterskiöld, L., Andersson, G., Powers, M. B., et al. (2017). Creating state of the art, next-generation Virtual Reality exposure therapies for anxiety disorders using consumer hardware platforms: design considerations and future directions. Cogn. Behav. Ther. 46, 404–420. doi: 10.1080/16506073.2017.1280843
LoBue, V. (2010). And along came a spider: An attentional bias for the detection of spiders in young children and adults. J. Exp. Child Psychol. 107, 59–66. doi: 10.1016/j.jecp.2010.04.005
Löw, A., Lang, P. J., Smith, J. C., and Bradley, M. M. (2008). Both predator and prey: emotional arousal in threat and reward. Psychol. Sci. 19, 865–873. doi: 10.1111/j.1467-9280.2008.02170.x
Luna, B., Velanova, K., and Geier, C. F. (2008). Development of eye-movement control. Brain Cogn. 68, 293–308. doi: 10.1016/j.bandc.2008.08.019
Lykken, D. T. (1972). Range correction applied to heart rate and to GSR data. Psychophysiology 9, 373–379. doi: 10.1111/j.1469-8986.1972.tb03222.x
Milgram, P., and Kishino, F. (1994). A taxonomy of mixed reality visual displays. IEICE Trans. Inform. Syst. 77, 1321–1329.
Miltner, W. H., Trippe, R. H., Krieschel, S., Gutberlet, I., Hecht, H., and Weiss, T. (2005). Event-related brain potentials and affective responses to threat in spider/snake-phobic and non-phobic subjects. Int. J. Psychophysiol. 57, 43–52. doi: 10.1016/j.ijpsycho.2005.01.012
Nyström, P., Falck-Ytter, T., and Gredebäck, G. (2016). The TimeStudio Project: an open source scientific workflow system for the behavioral and brain sciences. Behav. Res. Methods 48, 542. doi: 10.3758/s13428-015-0616-x
Öhman, A. (1993). “Stimulus prepotency and fear learning: data and theory,” in The Structure of Emotion: Psychophysiological, Cognitive, and Clinical Aspects, eds N. Birbaumer and A. Öhman (Seattle, WA: Hogrefe & Huber), 218–239.
Öhman, A., and Mineka, S. (2001). Fears, phobias, and preparedness: toward an evolved module of fear and fear learning. Psychol. Rev. 108:483. doi: 10.1037/0033-295X.108.3.483
Opriş, D., Pintea, S., García-Palacios, A., Botella, C., Szamosközi, S., and David, D. (2012). Virtual reality exposure therapy in anxiety disorders: a quantitative meta-analysis. Depression Anxiety 29, 85–93. doi: 10.1002/da.20910
Parsons, T. D., and Rizzo, A. A. (2008). Affective outcomes of virtual reality exposure therapy for anxiety and specific phobias: a meta-analysis. J. Behav. Therapy Exp. Psychiatry 39, 250–261. doi: 10.1016/j.jbtep.2007.07.007
Partala, T., and Surakka, V. (2003). Pupil size variation as an indication of affective processing. Int. J. Hum. Comput. Stud. 59, 185–198. doi: 10.1016/S1071-5819(03)00017-X
Peperkorn, H. M., Diemer, J. E., Alpers, G. W., and Mühlberger, A. (2016). Representation of patients' hand modulates fear reactions of patients with spider phobia in virtual reality. Front. Psychol. 7:268. doi: 10.3389/fpsyg.2016.00268
Privitera, C. M., Renninger, L. W., Carney, T., Klein, S., and Aguilar, M. (2010). Pupil dilation during visual target detection. J. Vis. 10:3. doi: 10.1167/10.10.3
Rinck, M., and Becker, E. S. (2006). Spider fearful individuals attend to threat, then quickly avoid it: evidence from eye movements. J. Abnorm. Psychol. 115, 231–238. doi: 10.1037/0021-843X.115.2.231
Rinck, M., Reinecke, A., Ellwart, T., Heuer, K., and Becker, E. S. (2005). Speeded detection and increased distraction in fear of spiders: evidence from eye movements. J. Abnorm. Psychol. 114, 235–248. doi: 10.1037/0021-843X.114.2.235
Riva, G., Baños, R. M., Botella, C., Mantovani, F., and Gaggioli, A. (2016). Transforming experience: the potential of augmented reality and virtual reality for enhancing personal and clinical change. Front. Psychiatry 7:164. doi: 10.3389/fpsyt.2016.00164
Riva, G., Dakanalis, A., and Mantovani, F. (2015). Leveraging psychology of virtual body for health and wellness. The Handbook of the Psychology of Communication Technology. Chichester, UK: John Wiley and Sons.
Rosén, J., Kastrati, G., and Åhs, F. (2017). Social, proximal and conditioned threat. Neurobiol. Learn. Mem. 142(Pt B), 236–243. doi: 10.1016/j.nlm.2017.05.014
Schultheis, M. T., and Rizzo, A. A. (2001). The application of virtual reality technology in rehabilitation. Rehabil. Psychol. 46:296. doi: 10.1037/0090-5550.46.3.296
Skulmowski, A., Bunge, A., Kaspar, K., and Pipa, G. (2014). Forced-choice decision-making in modified trolley dilemma situations: a virtual reality and eye tracking study. Front. Behav. Neurosci. 8:426. doi: 10.3389/fnbeh.2014.00426
Slater, M., Spanlang, B., Sanchez-Vives, M. V., and Blanke, O. (2010). First person experience of body transfer in virtual reality. PLoS ONE 5:e10564. doi: 10.1371/journal.pone.0010564
Slater, M., Usoh, M., and Steed, A. (1994). Depth of presence in virtual environments. Presence 3, 130–144. doi: 10.1162/pres.1994.3.2.130
Stanton, T. (2016). Mobile Dominates the Online Reality of Virtual Reality Sales. Intelligence.slice.com. Available online at: http://intelligence.slice.com/blog/2016/virtual-reality-mostly-mobile (Accessed June 12, 2017).
Stewart, N., Ungemach, C., Harris, A. J., Bartels, D. M., Newell, B. R., Paolacci, G., et al. (2015). The average laboratory samples a population of 7,300 Amazon Mechanical Turk workers. Judgment Decis. Mak. 10, 479–491.
Teghtsoonian, R., and Frost, R. O. (1982). The effects of viewing distance on fear of snakes. J. Behav. Ther. Exp. Psychiatry 13, 181–190. doi: 10.1016/0005-7916(82)90002-7
Tomlinson, M., Solomon, W., Singh, Y., Doherty, T., Chopra, M., Ijumba, P., et al. (2009). The use of mobile phones as a data collection tool: a report from a household survey in South Africa. BMC Med. Inform. Decis. Making 9:51. doi: 10.1186/1472-6947-9-51
Trillenberg, P., Lencer, R., and Heide, W. (2004). Eye movements and psychiatric disease. Curr. Opin. Neurol. 17, 43–47. doi: 10.1097/00019052-200402000-00008
Whalley, L. J. (1995). Ethical issues in the application of virtual reality to medicine. Comput. Biol. Med. 25, 107–114. doi: 10.1016/0010-4825(95)00008-R
Keywords: virtual reality, eye tracking, pupil dilation, SCR, autonomic response
Citation: Juvrud J, Gredebäck G, Åhs F, Lerin N, Nyström P, Kastrati G and Rosén J (2018) The Immersive Virtual Reality Lab: Possibilities for Remote Experimental Manipulations of Autonomic Activity on a Large Scale. Front. Neurosci. 12:305. doi: 10.3389/fnins.2018.00305
Received: 15 February 2018; Accepted: 19 April 2018;
Published: 08 May 2018.
Edited by: Erwin Lemche, King's College London, United Kingdom
Reviewed by: Vincenzo Provitera, Fondazione Salvatore Maugeri, Veruno (IRCCS), Italy; Eugene Nalivaiko, University of Newcastle, Australia; Andreas Voss, Institut für Innovative Gesundheitstechnologien (IGHT), Germany
Copyright © 2018 Juvrud, Gredebäck, Åhs, Lerin, Nyström, Kastrati and Rosén. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
*Correspondence: Joshua Juvrud, joshua.juvrud@psyk.uu.se