
FOCUSED REVIEW article

Front. Neurosci., 15 September 2009
Volume 3 - 2009 | https://doi.org/10.3389/neuro.01.029.2009

Inducing illusory ownership of a virtual body

Mel Slater, Daniel Pérez-Marcos, H. Henrik Ehrsson and Maria V. Sanchez-Vives

1 Institució Catalana de Recerca i Estudis Avançats (ICREA), Barcelona, Spain
2 Facultat de Psicologia, Universitat de Barcelona, Barcelona, Spain
3 Department of Computer Science, University College London, London, UK
4 Institut d’Investigacions Biomèdiques August Pi i Sunyer, Barcelona, Spain
5 Department of Neuroscience, Karolinska Institute, Stockholm, Sweden
We discuss three experiments that investigate how virtual limbs and bodies can come to feel like real limbs and bodies. The first experiment shows that an illusion of ownership of a virtual arm appearing to project out of a person's shoulder can be produced by tactile stimulation on a person's hidden real hand and synchronous stimulation on the seen virtual hand. The second shows that the illusion can be produced by synchronous movement of the person's hidden real hand and a virtual hand. The third shows that a weaker form of the illusion can be produced when a brain–computer interface is employed to move the virtual hand by means of motor imagery without any tactile stimulation. We discuss related studies that indicate that the ownership illusion may be generated for an entire body. This has important implications for the scientific understanding of body ownership and several practical applications.

Introduction

This paper describes work that brings together two apparently different fields of research. The first is concerned with how the brain forms a representation of the body, so that we can distinguish what is part of ourselves and what is not. The second is concerned with the conditions under which people act and respond to events and situations within virtual reality as if these were real, sometimes called the study of presence (Sanchez-Vives and Slater, 2005). If a computer-generated virtual body could temporarily produce the illusion of being your body, this would address problems of interest to both fields. First, it would give us some understanding of the mechanisms of ‘real’ body ownership and would also demonstrate the high plasticity of body representation. Second, responding to a virtual body as if it were your own body is perhaps the most powerful demonstration of presence in virtual reality. If we were able to accomplish this, we would have a greater understanding of the factors necessary to produce and maintain the experience of presence, with implications for future virtual reality technology development.
In this paper we first review our work on the problem of the ownership of a virtual arm and hand, through a virtual reality replication of the rubber hand illusion. Second, we relate this to the wider problem of ownership of entire bodies.

The Rubber Hand Illusion

The rubber hand illusion was first described by Botvinick and Cohen (1998) – but see Tastevin (1937) for an early anecdotal account. It was shown that synchronous tapping on a person’s hidden real arm and an aligned visible rubber arm placed in front of them results in a feeling of ownership of the fake arm. For a review see Makin et al. (2008).
The illusion that the rubber hand becomes owned by the person, as if it were the real hand, can occur as early as 10–20 s after the start of synchronous visual–tactile stimulation (Ehrsson et al., 2004; Lloyd, 2007), and is indicated by two different response measures. The first is based on an ownership illusion questionnaire, introduced in the original Botvinick and Cohen (1998) article, that has come to be fairly standard in the field.
The second response measure is based on the idea that if the subject’s proprioceptive system has actually been fooled into locating the hand where the rubber hand is, then when asked to point at their own hand they should blindly point towards the rubber hand rather than towards the real hand (Botvinick and Cohen, 1998). A verbal report of the felt position of the hand judged against a ruler has also been used (Tsakiris and Haggard, 2005; Tsakiris et al., 2006). The distance between the two locations is the proprioceptive drift, and the stronger the subjective illusion, the greater this behavioural indication of the illusion. Typically, both the questionnaire and the proprioceptive drift measures indicate that the illusion occurs, but when the tactile sensation on the real hand is not synchronous with the corresponding visual stimuli on the rubber hand the illusion breaks down.
The amount of stimulation time used to induce the illusion varies considerably amongst the different experiments. In Botvinick and Cohen (1998) there was stimulation for 10 min in their first experiment, and up to 30 min in the second. Ehrsson and colleagues stimulated the hands for periods of 42 s (Ehrsson et al., 2004) or 88 s (Ehrsson et al., 2007). Ehrsson et al. (2004) reported that in those subjects in whom the illusion was evoked, the mean time to the onset of the illusion was 11.3 ± 7.0 s (SD), and in their 2007 paper it was 14.3 ± 9.1 s (SD). Lloyd (2007) stimulated the hands for periods of 60 s and concluded that ‘…the rubber hand illusion is a pervasive perceptual phenomenon, which can be elicited in less than 15 s in approximately eight out of ten people’. Armel and Ramachandran (2003) reported that their pilot experiments showed that a ‘compelling illusion’ was obtained after 2.5 min. Tsakiris and Haggard (2005) and later studies by Tsakiris typically used 4 min of stimulation. In our virtual reality experiment (Section ‘Experiment 1 – Visual–Tactile Synchrony’) we used 5 min. In a recent rubber hand illusion experiment we asked participants to indicate when they felt the illusion. The median waiting time of the 43 participants who experienced the illusion was 27 s (range 5–116 s; mean ± SD 43 ± 34 s).
The rubber hand phenomenon has been replicated several times. For example, Armel and Ramachandran (2003) additionally showed that arousal, as measured by an increase in electrodermal activity, was associated with manipulations of the rubber hand that would normally cause pain. Indeed, this objective evidence for the illusion was further substantiated with functional magnetic resonance imaging, which demonstrated that the stronger the rubber hand illusion, the stronger the threat-evoked neuronal responses in areas related to pain anticipation (Ehrsson et al., 2007). Another important observation is that for the illusion to work well the fake arm must look like an arm (Tsakiris and Haggard, 2005), and the rubber arm should be aligned with the orientation of the real arm (Ehrsson et al., 2004; Pavani et al., 2000). This suggests that the multisensory integration producing the illusion operates in hand-centered reference frames (Costantini and Haggard, 2007).
There have also been brain imaging studies identifying activity in multisensory areas, such as areas in the intraparietal sulcus and the ventral premotor cortex, associated with the illusion (Ehrsson et al., 2004, 2005, 2007), activity that was greater when the rubber arm was aligned in parallel with the real hand (Ehrsson et al., 2004). Regarding the underlying mechanisms, it has been suggested that the illusion arises from the integration and interpretation of visual, tactile and position sense (proprioceptive) signals (Botvinick and Cohen, 1998; Ehrsson et al., 2004; Makin et al., 2008; Tsakiris and Haggard, 2005). Neurons in multisensory areas that integrate this type of information could implement the neuronal computations necessary to cause changes in body ownership (Ehrsson et al., 2004, 2005; Makin et al., 2008; Tsakiris et al., 2006).

The Virtual Hand Illusion

Experiment 1 – Visual–Tactile Synchrony

In Slater et al. (2008a) we showed that the illusion could be reproduced in virtual reality. Instead of a rubber arm, participants saw a completely virtual arm projecting out of their right shoulder. This was achieved with a back-projected screen onto which a stereo image of an arm was rendered, which, together with head-tracking, gives the powerful illusion that the virtual arm is attached to the shoulder and projecting forward in space. The real hand was hidden behind a screen, and the room was darkened. A second tracked device with six degrees of freedom (a Wand) was employed, whose movements in real space were replicated by the movements of a small yellow ball in the 3D virtual space. When the experimenter touched the real hand of the subject with the Wand, the subject would see the virtual ball touch the virtual hand at the corresponding place on the virtual hand. In this way synchronous visual and tactile stimuli could be applied to the virtual and real hands (Figure 1A). The asynchronous stimulation in the control condition was achieved by using pre-recorded movements of the virtual ball. Using this setup we compared the responses between two groups of volunteers, with 21 participants in the synchronous condition and 20 in the asynchronous condition. The specific questions we used to indicate the illusion were:
Figure 1. Four experiments on the virtual arm illusion. (A) Visual–tactile correlations – the experimenter touches the real hand with a wand and the participant sees the virtual ball touch the virtual hand. (B) Visual, motor and proprioceptive correlations – the participant wears a data glove and moves his fingers and hand and the virtual hand moves. (C) Adding shadows for the ball and arm. (D) Using a brain–computer interface with cued motor imagery – the arrows point to the left or the right as cues for the motor imagery.
1. Sometimes I had the feeling that I was receiving the hits in the location of the virtual arm.
2. During the experiment there were moments in which it seemed as if what I was feeling was caused by the yellow ball that I was seeing on the screen.
3. During the experiment there were moments in which I felt as if the virtual arm was my own arm.
Each question was rated by the participants on a 7-point Likert scale, with 1 meaning ‘totally disagree’ and 7 ‘totally agree’. There were six other questions that are normally thought of as control questions, not indicating the illusion, but see the discussion about these in Section 4.2 of Slater et al. (2008a). The illusion was produced both with respect to the questionnaire-based measure and with respect to proprioceptive drift: the mean drift was significantly greater for the synchronous condition than for the asynchronous one, and significantly greater than 0 for the synchronous condition but not for the asynchronous. To give an idea of the differences between the synchronous and asynchronous conditions, the medians of questions 1–3 were each 6 for the synchronous condition (with interquartile ranges 2, 1 and 2.25), and 1.5, 1 and 2 (with IQRs 1, 1 and 2.5) for the asynchronous condition. Using Mann–Whitney U tests, all three significance levels for the difference between the synchronous and asynchronous conditions were less than 0.00025. The median proprioceptive drift was 30 mm (IQR 80 mm) in the synchronous condition and 0 mm (IQR 30 mm) in the asynchronous condition; the Mann–Whitney U test gives a significance level of 0.0017 for this difference. Taking the results for both groups together, there was a significant positive correlation between the proprioceptive drift and the mean of questions 1–3 above (n = 39, r = 0.41, P < 0.011, with residual errors not significantly different from Normal).
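To make the form of this analysis concrete, the sketch below shows how such a non-parametric comparison can be computed for any one of the measures (a questionnaire rating or the proprioceptive drift). It is an illustrative reconstruction, not the original analysis code; the use of Python/SciPy and the variable names are our own assumptions.

import numpy as np
from scipy import stats

def compare_conditions(sync_values, async_values):
    """Median, interquartile range and Mann-Whitney U test for one measure
    (e.g. a questionnaire rating or proprioceptive drift) in two groups."""
    def summary(x):
        q1, median, q3 = np.percentile(x, [25, 50, 75])
        return median, q3 - q1  # median and IQR, as reported in the text
    u_stat, p_value = stats.mannwhitneyu(sync_values, async_values,
                                         alternative='two-sided')
    return summary(sync_values), summary(async_values), p_value

# Hypothetical usage (each list would hold one value per participant):
# (sync_med, sync_iqr), (async_med, async_iqr), p = compare_conditions(
#     sync_drift_mm, async_drift_mm)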
One of the advantages of a virtual reality representation is that it is easy to go beyond what is feasible in the physical world. Here, after the 5 min of stimulation (synchronous or asynchronous), the virtual arm was programmed to rotate (supination) and then rotate back again (pronation), for a total of 12 s. Our interest was to investigate whether the sight of the owned virtual arm moving would recruit automatic postural adjustment mechanisms, for example to stabilise the arm. In this experiment we recorded electrical activity from the arm muscles using electromyography (EMG), and compared the EMG signal while the virtual arm was moving to a period before it moved. Specifically, muscle activation was measured as the number of onsets, where an onset occurs when a rectified and filtered version of the raw signal stays above a threshold for at least 25 ms (Di Fabio, 1987). We found that, in the synchronous condition only, there was a positive correlation between the subjective strength of the illusion, as measured by the mean of the three questions above, and the number of EMG onsets between 4 and 6 s after the arm started rotating. This provided physiological evidence that movement of the virtual arm induced motor activity in the real arm in the synchronous condition.
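The onset-counting rule can be summarised in a few lines of code. The sketch below is a minimal illustration of the rule just described (rectify, low-pass filter, and require the signal to stay above a threshold for at least 25 ms); the sampling rate, filter settings and threshold definition are illustrative assumptions rather than the values used in the experiment.

import numpy as np
from scipy.signal import butter, filtfilt

def count_emg_onsets(raw, fs=1000.0, cutoff_hz=50.0, min_duration_ms=25.0):
    """Count EMG onsets: supra-threshold runs of the rectified, filtered
    signal lasting at least min_duration_ms (after Di Fabio, 1987)."""
    rectified = np.abs(raw)                              # full-wave rectification
    b, a = butter(4, cutoff_hz / (fs / 2), btype='low')  # linear envelope filter
    envelope = filtfilt(b, a, rectified)
    threshold = envelope.mean() + 3 * envelope.std()     # assumed threshold rule
    above = envelope > threshold
    min_samples = int(fs * min_duration_ms / 1000.0)

    onsets, run, counted = 0, 0, False
    for flag in above:
        if flag:
            run += 1
            if run >= min_samples and not counted:
                onsets += 1                              # one onset per supra-threshold run
                counted = True
        else:
            run, counted = 0, False
    return onsets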

Experiment 2 – Visual–Motor Synchrony

Having demonstrated that visuo-tactile correlations can induce an illusion of ownership of a virtual arm, we then explored whether this illusion can be induced in the absence of tactile stimulation – see also Dummer et al. (2009) and Tsakiris et al. (2006). We carried out an experiment to investigate whether the virtual arm illusion can be induced by active movements of the fingers and hand (Sanchez-Vives et al., in preparation, with a preliminary report by Slater et al., 2008b). There were 14 male participants in this within-groups, counterbalanced experimental design. The illusion-related questions were:
1. I sometimes felt as if my hand was located where I saw the virtual hand to be.
2. Sometimes I felt that the virtual arm was my own arm.
Here the participants wore a data glove that detects hand and finger positions and transmits real-time data to the computer that controls the display of a virtual hand (Figure 1B). An ownership illusion arose only when the movement of the virtual hand was synchronous with the movement of the participant’s real hand. This was indicated by questionnaire responses (the two questions above) and by proprioceptive drift (using the method introduced by Botvinick and Cohen, 1998). The fact that the illusion could be induced by active movements and congruent visual feedback is important for virtual reality applications in which participants will need to interact with environmental objects.
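As an illustration of the visuo-motor manipulation, the sketch below shows one way the synchronous and asynchronous (control) mappings from glove data to the virtual hand could be implemented. The glove and rendering calls are hypothetical placeholders, and the paper does not specify how asynchrony was produced in this particular experiment; delayed replay is used here purely as an example of visual feedback decoupled from the current movement.

from collections import deque

class VirtualHandDriver:
    """Forward live glove data to the virtual hand (synchronous condition) or
    replay earlier, uncorrelated data (one plausible asynchronous control)."""

    def __init__(self, synchronous=True, delay_frames=600):
        self.synchronous = synchronous
        self.buffer = deque(maxlen=delay_frames)   # ring buffer of past frames

    def update(self, live_joint_angles):
        self.buffer.append(list(live_joint_angles))
        if self.synchronous or len(self.buffer) < self.buffer.maxlen:
            return live_joint_angles               # virtual hand mirrors the real hand
        return self.buffer[0]                      # delayed replay, decoupled from the real hand

# Per-frame loop (hypothetical device and renderer APIs):
# driver = VirtualHandDriver(synchronous=True)
# while running:
#     angles = read_glove_joint_angles()          # hypothetical glove call
#     pose_virtual_hand(driver.update(angles))    # hypothetical renderer call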

Experiment 3 – Using a Brain–Computer Interface

We carried out a further experiment, but without any tactile stimulation or overt movements (Perez-Marcos et al., 2009). Here the participants’ task was to open and close their virtual hand through a brain–computer interface (BCI). This used a cued motor imagery paradigm (Pfurtscheller and Neuper, 2001) on which participants had been previously trained (Figure 1D). There were two conditions: in the synchronous condition the hand opened and closed as a function of the participant’s motor imagery, whereas in the asynchronous condition the hand opened and closed independently of the participant’s motor imagery. In the synchronous condition, but not in the asynchronous, there was a sense of ownership of the virtual hand. After the 5 min of BCI control (synchronous or asynchronous) of the arm, the virtual arm and table suddenly fell, and the EMG recordings showed greater muscle activity in the arm compared with an earlier reference period before the fall – but only in the synchronous condition. However, there was no proprioceptive drift in either condition. This may suggest that actual sensory feedback (touch or proprioception) is necessary for the recalibration of position sense and the elicitation of a full-blown virtual hand illusion. Alternatively, mental imagery may not be as potent in inducing the illusion as actual stimulation. Future experiments are needed to clarify to what degree virtual limbs can be owned through BCI control alone.
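The sketch below illustrates how decoded motor imagery might be mapped onto the opening and closing of the virtual hand in the two conditions, using the cue-to-action mapping described in the Key Concepts section. The classifier and animation functions are hypothetical placeholders, and the random choice in the asynchronous branch is only one way of making the hand move independently of the imagery; the actual BCI followed the cued motor imagery paradigm of Pfurtscheller and Neuper (2001).

import random

# Cue-to-action mapping from our experiment: imagining left-hand movement
# closes the virtual hand, imagining right-foot movement opens it.
CUE_TO_ACTION = {'left_hand': 'close', 'right_foot': 'open'}

def drive_virtual_hand(eeg_epoch, synchronous, classify_motor_imagery, animate_hand):
    """Map one EEG epoch to a virtual hand action.

    classify_motor_imagery and animate_hand are hypothetical callables standing
    in for the trained classifier and the rendering system."""
    if synchronous:
        imagined = classify_motor_imagery(eeg_epoch)   # 'left_hand' or 'right_foot'
        action = CUE_TO_ACTION[imagined]
    else:
        action = random.choice(['open', 'close'])      # independent of the imagery
    animate_hand(action)
    return action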

The Virtual Body

To what extent can the multisensory correlations employed to produce the virtual hand illusion generalise to the whole body? The evidence is beginning to point towards an affirmative answer to this question – that the illusion of ownership of a virtual body may be generated. There is both indirect and direct evidence for this. In Ehrsson (2007) a setup was employed to give people the illusion that they were located behind their real bodies. Subjects wore a head-mounted display that showed real-time stereoscopic images from two cameras located behind where they were actually seated, thus shifting their visual ego-center to behind themselves. The experimenter stood just behind the participant, and the participant could see where they were sitting in the room and identify the experimenter standing just behind them. The experimenter then used a stick to tap their chest (out of sight) while making tapping movements underneath the location of the cameras. The felt tapping was either synchronous with the visual jabbing movements towards a point beneath the cameras, or asynchronous. In the synchronous condition subjects reported a strong illusion of being behind their physical bodies, as judged by the questionnaire responses, for example ‘I experienced that I was located at some distance behind the visual image of myself, almost as if I were looking at someone else’ (Supplementary Figure 1, Ehrsson, 2007). People also experienced that the scientist was standing in front of them, i.e. there had been a change in the perceived self-location. This finding was reinforced by skin conductance responses elicited by an attack on the ‘phantom body’ location in the synchronous but not in the asynchronous condition. This is thus evidence that the sense of where one’s body is located can be dislocated to a position different from the body’s veridical position, and is therefore indirect evidence for the idea that a virtual body might come to be felt as one’s own.
More direct evidence has come from Petkova and Ehrsson (2008), who attached cameras to the head of a manikin, angled so that they looked down on the manikin’s body. Again the video signals from these cameras were presented in real time to the participant, who was wearing a set of head-mounted displays. When looking down at themselves, subjects would therefore see the manikin’s body in approximately the location where their own body would be. Synchronous tapping on the stomach of the manikin and on the real stomach resulted in a strong illusion of ownership of the entire body (as evidenced by the questionnaire responses), which was again confirmed by augmented skin conductance responses to physical attacks on different body parts of the manikin in the synchronous but not in the asynchronous tapping condition. This suggests that entire bodies can be owned and that ownership of one stimulated body part automatically enhances ownership of other seen parts of the body.
A similar full-body experiment was reported by Lenggenhager et al. (2007). In the critical experiment the participants looked, through a head-mounted display, at a body presented a few meters in front of themselves. Thus the participants saw the back of the body, and when the experimenter stroked them on their back, they would see this stroking on the back of the distant body. This resulted in a reported sense of being at the location of the body in front, and a version of the proprioceptive drift measure provided further verification. In this case there was a reported projection of the sense of touch and of self-localisation to a body observed from a third-person perspective, which is different from the experiments by Ehrsson (2007) and Petkova and Ehrsson (2008), where the owned artificial body was always perceived from a first-person perspective. To what extent the reported self-attribution in these two sets of experiments engages common or different perceptual mechanisms is still an open question (see the Science E-letters for further discussion). However, Lenggenhager et al. (2009) recently reported an experiment that directly compared the two paradigms and found evidence to suggest that self-localisation is strongly influenced by where the correlated visual–tactile event is seen to occur.

Discussion

The experiments reviewed in this article strongly suggest that virtual limbs and bodies in virtual reality could be owned by participants, just as rubber hands can be perceived as part of one’s body in physical reality. Furthermore, the experimental findings suggest that ownership of virtual limbs and bodies may engage the same perceptual, emotional, and motor processes that make us feel that we own our biological bodies. To what extent this ‘virtual body illusion’ works when the movements of the simulated body are controlled directly by the participants’ thoughts, via BCI control, is an important emerging area for future experiments.
The visual realism of the virtual arm and of the arm’s environment does not seem to play an important role in the induction of the illusion. In our laboratory we have seen the illusion work well with many different types of simulated hands. This is similar to the traditional rubber hand illusion, which does not seem to depend on the physical similarity between the rubber hand and the person’s real hand (anecdotal observations; see also Longo et al., 2009). Further, adding realism to the simulation by adding shadows (Figure 1C) did not enhance the ownership illusion (Perez-Marcos et al., 2007; unpublished results). These observations would fit with the physiological properties of cells in premotor and intraparietal cortices, which are involved in the fast localisation of limbs in space (Graziano, 1999; Graziano et al., 2000) but not in visual object recognition or the fine analysis of visual scenes. This realisation is important for the development of virtual reality applications because it means that one is not restricted to ultra-realistic simulations and high-definition visual displays.
Virtual reality additionally provides the power to investigate these illusions at the whole-body level. In Figure 2 we show an example of what can be seen when someone wears a tracked head-mounted display, looks down, and sees a virtual body in place of their real one. The very act of looking down, changing head orientation in order to gaze in a certain direction, with the visual images changing as they would in reality, is already a powerful clue that you are located in the virtual place that you perceive. We argue elsewhere that multisensory contingencies that correspond approximately to those employed in perceiving physical reality provide a necessary condition for the illusion of being in the virtual place (Slater, 2009). Now imagine that you move and the virtual body moves in correspondence with your movements, or that you see something touch your virtual body and feel the touch at the corresponding location on your real body. These events add significantly to the reality of what is being perceived: not only are you in the virtual place, but you also have the illusion that the events occurring are real, thereby increasing the likelihood that you will respond realistically to virtual events and situations.
Figure 2. Looking down at your virtual body. A seated participant wears a head-mounted display, and looks down to see a virtual body in place of his real one.

Future Perspective

BCI control of owned virtual bodies will probably have many important clinical and industrial applications, for example in the development of next-generation BCI applications for totally paralysed individuals. Such people would in principle be able to control and own a virtual body and engage in interactions in simulated environments. The first attempt in this direction (Experiment 3; Perez-Marcos et al., 2009) suggests that this dream might have a chance of success. When the motor imagery resulted in the expected opening and closing of the virtual hand, the ownership illusion and motor recruitment occurred (but not proprioceptive drift). The fundamental question here is whether a correlation between intentions of movement and purely visual feedback, in the absence of any tactile or proprioceptive feedback, is sufficient to induce the rubber hand illusion and produce recalibration of visual, tactile and proprioceptive representations. If so, this would demonstrate that multisensory recalibration can occur as a result of internal simulation of action and its sensory consequences. This issue is not fully settled yet, given that in Perez-Marcos et al. (2009) the illusion of ownership was not accompanied by proprioceptive drift. Experiments in which participants can execute different types of virtual hand movement via so-called ‘un-cued’ BCI may be a promising avenue for future work of this sort.

Conflict of Interest Statement

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Acknowledgments

The research described in this project was funded by the European 6th Framework Future and Emerging Technologies Integrated Project PRESENCCIA Contract Number 27731, and in part by the Spanish Ministry of Science and Innovation. HHE was also supported by the Swedish Medical Research Council and the Swedish Foundation for Strategic Research.

Key Concepts

Virtual Reality: Sensory data generated by a computer system may be perceived as physical reality, especially when perception is enabled by use of the body in a manner similar to that of physical reality. The system ideally displays in all sensory modalities, fully encloses the person in these displays, and tracks not only head position and orientation but also the movements of the whole body, determining the visual stereo and spatialised auditory displays as a function of this tracking.
Presence: Virtual reality may generate the illusion of being in the rendered virtual place. When contingent events in the virtual world apparently relate directly to the participant, there is the further illusion that what is occurring is real. Under these conditions participants tend to act and respond to the virtual reality as if it were real.
Rubber hand illusion: An illusion that may be induced by an experimenter tapping a person’s real but hidden hand while synchronously tapping a rubber hand placed in a visible and plausible position in front of the subject; the real hand is hidden behind a screen. After a short period of such stimulation the person has the illusion that the rubber arm is their arm and feels the taps as coming from the location of the rubber hand.
Ownership illusion questionnaire: Botvinick and Cohen (1998) used nine questions. Three questions indicated the rubber hand illusion: ‘It seemed as if I were feeling the touch of the paintbrush in the location where I saw the rubber hand touched’; ‘It seemed as though the touch I felt was caused by the paintbrush touching the rubber hand’; ‘I felt as if the rubber hand were my hand’. The other six questions are typically used as controls.
Proprioceptive drift: Before the stimulation for the rubber hand illusion the participants are instructed to close their eyes and point (under the table on which their hand is resting) to the center of where they feel their hand to be, and to repeat this after the stimulation. The distance between these two locations is the proprioceptive drift.
Brain–computer interface (BCI): The electrical signals produced by the brain are analysed in real time and used to control external devices or events. Typically there is a training period during which a computer programme learns the association between certain features of the recorded brain activity and specific thoughts. Later, events are triggered whenever the computer programme recognises which of a particular set of these features has been generated.
Cued motor imagery: Participants are asked to produce a particular type of thought in response to a cue; in this case the thought is imagination of limb movement. For example, in our BCI experiment, participants were cued to think of moving their left hand in order to close the virtual hand, and of moving their right foot in order to open the hand.

References

Armel, K. C., and Ramachandran, V. S. (2003). Projecting sensations to external objects: evidence from skin conductance response. Proc. R. Soc. Lond., B, Biol. Sci. 270, 1499–1506.
Botvinick, M., and Cohen, J. (1998). Rubber hands ‘feel’ touch that eyes see. Nature 391, 756.
Costantini, M., and Haggard, P. (2007). The rubber hand illusion: sensitivity and reference frame for body ownership. Conscious. Cogn. 16, 229–240.
Di Fabio, R. (1987). Reliability of computerized surface electromyography for determining the onset of muscle activity. Phys. Ther. 67, 43–48.
Dummer, T., Picot-Annand, A., Neal, T., and Moore, C. (2009). Movement and the rubber hand illusion. Perception 38, 271–280.
Ehrsson, H. H. (2007). The experimental induction of out-of-body experiences. Science 317, 1048.
Ehrsson, H. H., Holmes, N. P., and Passingham, R. E. (2005). Touching a rubber hand: feeling of body ownership is associated with activity in multisensory brain areas. J. Neurosci. 25, 10564–10573.
Ehrsson, H. H., Spence, C., and Passingham, R. E. (2004). That’s my hand! Activity in premotor cortex reflects feeling of ownership of a limb. Science 305, 875–877.
Ehrsson, H. H., Wiech, K., Weiskopf, N., Dolan, R., and Passingham, R. (2007). Threatening a rubber hand that you feel is yours elicits a cortical anxiety response. Proc. Natl. Acad. Sci. USA 104, 9828–9833.
Graziano, M. S. (1999). Where is my arm? The relative role of vision and proprioception in the neuronal representation of limb position. Proc. Natl. Acad. Sci. USA 96, 10418–10421.
Graziano, M. S., Cooke, D. F., and Taylor, C. S. (2000). Coding the location of the arm by sight. Science 290, 1782–1786.
Lenggenhager, B., Mouthon, M., and Blanke, O. (2009). Spatial aspects of bodily self-consciousness. Conscious. Cogn. 18, 110–117.
Lenggenhager, B., Tadi, T., Metzinger, T., and Blanke, O. (2007). Video ergo sum: manipulating bodily self-consciousness. Science 317, 1096–1099.
Lloyd, D. (2007). Spatial limits on referred touch to an alien limb may reflect boundaries of visuo-tactile peripersonal space surrounding the hand. Brain Cogn. 64, 104–109.
Longo, M. R., Schuur, F., Kammers, M. P., Tsakiris, M., and Haggard, P. (2009). Self awareness and the body image. Acta Psychol. (Amst.). doi: 10.1016/j.actpsy.2009.02.003.
Makin, T., Holmes, N., and Ehrsson, H. (2008). On the other hand: Dummy hands and peripersonal space. Behav. Brain Res. 191, 1–10.
Pavani, F., Spence, C., and Driver, J. (2000). Visual capture of touch: out-of-the-body experiences with rubber gloves. Psychol. Sci. 11, 353–359.
Perez-Marcos, D., Slater, M., and Sanchez-Vives, M. V. (2007). Illumination realism in the virtual hand illusion. In Rovereto Workshop on Body Representation, Rovereto, Italy.
Perez-Marcos, D., Slater, M., and Sanchez-Vives, M. V. (2009). Inducing a virtual hand ownership illusion through a brain–computer interface. Neuroreport 20, 589–594.
Petkova, V. I., and Ehrsson, H. H. (2008). If I were You: perceptual illusion of body swapping. PLoS ONE 3, e3832. doi: 10.1371/journal.pone.0003832.
Pfurtscheller, G., and Neuper, C. (2001). Motor imagery and direct brain–computer communication. Proc. IEEE 89, 1123–1134.
Sanchez-Vives, M. V., and Slater, M. (2005). From presence to consciousness through virtual reality. Nat. Rev. Neurosci. 6, 332–339.
Slater, M. (2009). Place illusion and plausibility can lead to realistic behaviour in immersive virtual environments. Philos. Trans. R. Soc. Lond., B, Biol. Sci. (in press).
Slater, M., Pérez-Marcos, D., Ehrsson, H. H., and Sanchez-Vives, M. V. (2008a). Towards a digital body: the virtual arm illusion. Front. Hum. Neurosci. doi: 10.3389/neuro.09.006.2008.
Slater, M., Spanlang, B., Frisoli, A., and Sanchez-Vives, M. V. (2008b). Virtual hand illusion induced by visual-proprioceptive and motor correlations. In I Congresso Ibro-Larc de Neurociencias da America Latina, Caribe, i Peninsula Iberica.
Tastevin, J. (1937). En partant de l’expérience d’Aristote: les déplacements artificiels des parties du corps ne sont pas suivis par le sentiment de ces parties ni par les sensations qu’on peut y produire. L’Encéphale 32, 57–84.
Tsakiris, M., and Haggard, P. (2005). The rubber hand illusion revisited: visuotactile integration and self-attribution. J. Exp. Psychol. Hum. Percept. Perform. 31, 80–91.
Tsakiris, M., Prabhu, G., and Haggard, P. (2006). Having a body versus moving your body: how agency structures body-ownership. Conscious. Cogn. 15, 423–432.
Keywords:
rubber hand illusion, body ownership, virtual reality, presence
Citation:
Slater M, Pérez Marcos D, Ehrsson H and Sanchez-Vives MV (2009). Inducing illusory ownership of a virtual body. Front. Neurosci. 3, 2: 214–220. doi: 10.3389/neuro.01.029.2009
Received:
03 May 2009;
 Paper pending published:
28 June 2009;
Accepted:
14 July 2009;
 Published online:
15 September 2009.

Edited by:

Maurizio Corbetta, Washington University, USA

Reviewed by:

Olaf Blanke, University of Geneva, Switzerland; Ecole Polytechnique Fédérale de Lausanne, Switzerland
Maurizio Corbetta, Washington University, USA
Copyright:
© 2009 Slater, Perez-Marcos, Ehrsson and Sanchez-Vives. This is an open-access article subject to an exclusive license agreement between the authors and the Frontiers Research Foundation, which permits unrestricted use, distribution, and reproduction in any medium, provided the original authors and source are credited.
*Correspondence:
Mel Slater, EVENT Lab, Universitat de Barcelona, Facultat de Psicologia, Departament de Personalitat, Avaluació i Tractaments Psicològics, Campus de Mundet - Edifici Teatre, Passeig de la Vall d’Hebron 171, 08035 Barcelona, Spain. e-mail: melslater@ub.edu