
REVIEW article

Front. Neurosci., 13 December 2019
Sec. Neuroprosthetics
This article is part of the Research Topic Embodying Tool Use: from Cognition to Neurorehabilitation.

The Embodiment of Objects: Review, Analysis, and Future Directions

  • 1Department of Philosophy, Western University Canada, London, ON, Canada
  • 2Rotman Institute of Philosophy, Western University Canada, London, ON, Canada
  • 3Brain and Mind Institute, Western University Canada, London, ON, Canada

Here we offer a thorough review of the empirical literature on the conditions under which an object, such as a tool or a prosthetic (whether real or virtual), can be experienced as being in some sense a part or extension of one’s body. We discuss this literature both from the standpoint of the apparent malleability of our body representations, and also from within the framework of radical embodied cognition, which understands the phenomenon to result not from an alteration to a representation, but rather from the achievement of a certain kind of sensory/motor coupling. We highlight both the tensions between these frameworks, and also areas where they can productively complement one another for future research.

Introduction

Is it possible to “embody” external objects, such as prosthetics or tools, that is, to treat or regard them as in some important sense actually part of our bodies? Most of the literature concludes that people can extend the borders of the physical body to temporarily incorporate different prosthetics, such as rubber hands, into their body image (i.e., their conscious beliefs regarding their bodies; see Botvinick and Cohen, 1998; Ehrsson et al., 2004, 2005, 2008; Tsakiris and Haggard, 2005; Tsakiris et al., 2006; Marasco et al., 2011; D’Alonzo and Cipriani, 2012; D’Alonzo et al., 2014; Collins et al., 2016), and certain external objects, such as tools, into their body schema (i.e., their unconscious knowledge of their bodies and their capacities; see Cardinali et al., 2009b, 2012; Sposito et al., 2012; Baccarini et al., 2014; Garbarini et al., 2015). Several studies indicate that there are surprisingly few constraints on incorporating objects into either of these two kinds of body representations.1 In fact, along with using real-life tools, even imagining using tools can cause them to be incorporated into the body schema (Baccarini et al., 2014).

In this review, we highlight neuropsychological research and the numerous illusion studies on neurotypical individuals that explore embodiment, including those that use the rubber hand paradigm and/or virtual reality setups. The rubber hand illusion has been widely used to study the relationship between the body image and the body schema. In the rubber hand illusion (detailed further in the next sections) an experimenter simultaneously touches a suitably positioned fake hand and the participant’s real hand, which is hidden from view. The effect is that participants come to feel the rubber hand to be their own. Virtual reality environments have also been used to investigate embodiment phenomena, and offer flexibility in addressing how and to what extent objects and tools are experienced and incorporated into our body representations. The approach is flexible in the sense that different virtual environments can be created to test the incorporation of objects and tools into our body representations, and virtual body parts and bodies can be manipulated in ways that are not possible in non-virtual experimental setups. It also allows for one’s “objective body to remain in the real world, while one’s phenomenal body can be projected into terminal reality” (Murray and Sixsmith, 1999). That is, while one is immersed in a virtual environment, at least several aspects of her phenomenal body (e.g., the visually perceived body) are part of the virtual environment: one visually perceives whatever body she has in the virtual environment and not her body in the real world.

In this paper, we first outline two competing frameworks for understanding object and tool embodiment and review the relevant literature on the embodiment of objects and tools from the point of view of these two frameworks. In Sections “Embodying Objects” and “Embodying Tools,” respectively, we analyze the way object and tool embodiment are understood in terms of body representations. In both cases we analyze experimental paradigms such as the rubber hand illusion and the use of virtual environments. In Section “Radically Embodied Tools and Objects,” we analyze object and tool embodiment from the specific point of view of radical embodiment. In all cases we pay special attention to the deeper theoretical commitments and the experimental methodology of the different frameworks. Importantly, our aim is not to critically analyze one of these frameworks from the point of view of the other, but to review them both and to explore their compatibilities and incompatibilities.

Embodiment

Writ large, the embodiment of objects and tools may be defined as the sense that those objects and tools have become “part of us” in a way similar to how our limbs or our fingers are parts of us. More specifically, the embodiment of objects and tools includes the sense of ownership, the sense of agency, and the sense of self-location (Kilteni et al., 2012), which are often characterized as affective, motor, and spatial embodiment, respectively (de Vignemont, 2011).2 We turn to them now to offer a brief introduction. The main concepts will be revisited and elaborated in later sections.

Affective, Motor, and Spatial Embodiment

The sense of ownership or affective embodiment refers to a situation where an individual shows the same affective reactions for the object as for their own body (de Vignemont, 2011). In this sense, affective embodiment is one of the fundamental components of the embodiment of objects and tools: the first step toward such an embodiment is the actual sense that a given object or tool is part of one’s body. Measures of affective embodiment involve evaluating behavioral and physiological responses in situations in which an object may be said to be embodied.

The sense of agency or motor embodiment corresponds to the situation where an object or tool “is processed in the same way as a part of one’s body for motor tasks” (de Vignemont, 2011, p. 87). Measures of motor embodiment are more often used in virtual reality and tool embodiment studies. If “the motor system takes the properties of [a tool] into account as properties of the effector in planning, then [it] is embodied” (de Vignemont, 2011, p. 86). Underlying the sense of agency is a modulation of the body schema. For instance, after tool use there can be consequences on free hand movement kinematics (Cardinali et al., 2009b).

The sense of self-location or spatial embodiment involves a bodily frame, an external frame, and a peri-personal frame (de Vignemont, 2011). The bodily frame comprises the boundaries of the body space. For instance, if an object “is taken into account by the representation of the body space, by replacing a missing body part, by adding a new body part, or by stretching an existing body part, then [it] can be said to be embodied” (de Vignemont, 2011, p. 85). Moreover, localizing “bodily sensations in [an object] shows that [it] is taken into account by the representation of the body space and that [it] is embodied” (Ibid.). For instance, amputees can sometimes feel things in contact with their prosthetic (Murray, 2004). The external frame is reflected in the mis-localization of a bodily object toward a non-bodily object (i.e., proprioceptive drift). For instance, if “the location of [an object] within the external frame is processed in the same way as the location of a part of one’s body, then [it] is embodied” (de Vignemont, 2011, p. 85). Lastly, the peri-personal frame “is the frame of the space immediately surrounding a specific part of one’s body (<30–50 cm)” (de Vignemont, 2011, p. 85). An object or tool is embodied if it is processed as lying within peri-personal space (de Vignemont, 2011).

It is worth noting that the three senses of embodiment are not independent from one another. For example, brain studies and studies of movement disorders seem to show that there is a mutual relationship between affective embodiment and the motor system (Schütz-Bosbach et al., 2006; Della Gatta et al., 2016; Burin et al., 2017; Fossataro et al., 2018). Affective embodiment requires multisensory integration within a fronto-parietal network. This network includes an important area for the interaction between affective embodiment and the motor system, the ventral premotor cortex (Ehrsson et al., 2004; Makin et al., 2008b; Blanke et al., 2015). Also, patients with spinal cord injuries have been shown to have an altered sense of body ownership (Scandola et al., 2014).

Additionally, affective embodiment and some aspects of spatial embodiment, such as proprioceptive drift, reflect different components of the multisensory integration process (Ehrsson et al., 2005; Longo et al., 2008; Makin et al., 2008a; Rohde et al., 2011; Martinaud et al., 2017). For example, fMRI data show a relationship between ventral premotor activation and subjective report, which is related to affective embodiment (Ehrsson et al., 2004), and between inferior parietal lobule activation and the recalibration of perceived hand position, which is related to proprioceptive drift (Ehrsson et al., 2004, 2005).

Finally, it is important to note the influence of perspective on embodiment, although its role is still a matter of discussion. On the one hand, some studies have suggested that the first-person perspective is important for embodiment (Blanke and Metzinger, 2009), which seems to be confirmed in rubber hand experiments (Pavani et al., 2000; Kalckert and Ehrsson, 2012; Bucchioni et al., 2016) and full body virtual reality experiments (Petkova and Ehrsson, 2008; Slater et al., 2010; Petkova et al., 2011; Maselli and Slater, 2013). Also, using questionnaires (Tieri et al., 2015b), skin conductance responses (Tieri et al., 2015a), and electroencephalographic responses (Pavone et al., 2016), researchers have found that passively observing a virtual body in first-person perspective is sufficient for embodiment (Maselli and Slater, 2013; Tieri et al., 2015b, 2017). A similar result found that “visual capture by a fake hand (without any synchronous or asynchronous tactile stimulation) affects body ownership in a group of hemiplegic patients with or without disturbed sensation of limb ownership” (Martinaud et al., 2017, p. 174). On the other hand, others do not observe significant differences between first- and third-person perspectives for embodiment (Pomés and Slater, 2013).

The Cognitive Science of Embodiment: Two Approaches

When it comes to understanding how we embody objects or tools, two broad theoretical frameworks are used in contemporary research. One framework regards the embodiment of objects or tools as a matter of incorporating them into our body representations, that is, the body schema and/or the body image. The other framework, which we will refer to as radical embodiment, takes the process of embodiment to be the constitution of a new complex system with both somatic and extrasomatic components.

The framework based on body representations posits the existence of some kind of body model or models that represent diverse features of the body, and asks how non-bodily objects and tools may be integrated into those representations. One way to categorize these representations is to distinguish the body image, that is, one’s perceptions, beliefs, and attitudes toward one’s body, from the body schema, that is, one’s motor capacities that operate without conscious appraisal (Gallagher and Meltzoff, 1996). The body image is often characterized as a cognitive appraisal, perception, or evaluation of one’s own appearance and body shape, and the related or resulting affect (Levine and Smolak, 2014). The body schema, in contrast, is a non-conscious neural map of the spatial relations among the body parts and of the body’s motor capacities. The body schema is a plastic representation (Giummarra et al., 2008; Gallagher, 2013) that is constantly updated by incoming sensory input (Dijkerman and de Haan, 2007; Giummarra et al., 2008).

Although the dichotomy between body image and body schema is quite standard in the literature, it is by no means uncontested. Specifically, the relationship between body image and body schema as body representations is intensely debated. Some authors defend the existence of a unitary body representation that must encompass the features usually attributed to body images and body schemas as distinct representational entities (de Vignemont and Farnè, 2010). Support for this interpretation comes from the fact that the self is not experienced as many separate body parts, but as a single whole-body representation. This is important since the brain might resolve sensory conflicts by checking the compatibility of multisensory input with a prior body representation, which determines what can and cannot be embodied (Tsakiris and Haggard, 2005; De Preester and Tsakiris, 2009; Tsakiris, 2010; Moseley and Flor, 2012). This long-term body representation captures the anatomical structure of the body and is continuously updated by the sensory modalities (Botvinick and Cohen, 1998; Naito et al., 2002; Lackner and Dizio, 2005). On this view, the process of “embodiment must respect some basic anatomical constraints. Therefore, only some objects [and tools] under certain circumstances can be processed as if they were parts of one’s body” (de Vignemont and Farnè, 2010, p. 205).

Alternatives to this unitary approach include the dyadic and triadic body representation models (Schwoebel and Coslett, 2005; Gallagher, 2013) or somatosensory streams (Dijkerman and de Haan, 2007). On the one hand, the dyadic model takes the body schema to be an action-based sensorimotor representation, and the body image to be a non-action-based body representation (Gallagher and Cole, 1995; Rossetti et al., 1995; Dijkerman and de Haan, 2007). This is the standard view presented earlier in this section. On the other hand, the triadic body representation model accepts the standard notion of body schema but dichotomizes the body image into two different representations: the body semantics and the body structural description representations (Schwoebel and Coslett, 2005). Another model has further divided the body schema into a motor schema and a somatosensory schema, so that there are at least five body representations (Anderson, 2018). Here, we will accept the dyadic model as sufficient for our purposes in framing the following discussion. As prosthetics become more sophisticated in their capacity to offer various kinds of somatic feedback, our understanding of how such objects can be embodied will have to be updated accordingly.

The framework of radical embodiment does not appeal to the concept of body representations to understand the way objects and tools can be embodied. In general, radical embodiment takes cognitive systems to be a kind of complex, self-organized system in which many components interact with each other to give rise to a given cognitive ability (Van Orden et al., 2003; Chemero, 2009; Cavagna et al., 2010; Anderson et al., 2012). The components of these cognitive systems extend through the brain, the body, and the environment; in this sense, objects and tools of very different kinds may be parts of cognitive systems and, therefore, may be embodied in those systems just insofar as they are properly coupled to them.

Understanding object and tool embodiment in terms of the activity of a cognitive system that extends beyond the brain and the body and integrates those objects and tools into that activity is a common feature of some radical proposals in the embodied approach to cognition. The strongest forms of the hypothesis of the extended mind (Menary, 2010; Kirchhoff and Kiverstein, 2019) and approaches based on ecological psychology, enactivism, and dynamic systems share this form of radical embodiment. In both cases, the signatures of the embodiment of objects and tools are not taken to be a matter of body representations, but a matter of coupling between the different components of the system (e.g., the body + prosthesis, the body + a tool, and so on; see, e.g., Van Orden et al., 2005; Dotov et al., 2010). The nature and degree of such coupling requires its own theoretical and methodological commitments to be fully understood.

Embodying Objects

Physical objects are those things, animate or inanimate, that can persist through time, such as a car, a pen, or a prosthetic device. Objects are affected by external forces, can be threatening or non-threatening, and can be cognitively reflected upon (Longo, 2016). Some views on the body take it to be a multisensory object that seems to obey our will, has the ability to interact with other objects, can incorporate them, and can be perceived and understood from the inside (Blanke, 2012). Given this understanding of the unique status of the body as an object, the study of the way individuals embody objects has been carried out using experimental paradigms involving fake limbs, like the rubber hand illusion, and immersive environments in virtual reality.

The Rubber Hand Illusion

The rubber hand illusion is a well-established paradigm for studying the sense of ownership in healthy individuals (Botvinick and Cohen, 1998; Ehrsson et al., 2004, 2005, 2008; Tsakiris and Haggard, 2005; Tsakiris et al., 2006; Marasco et al., 2011; D’Alonzo and Cipriani, 2012; D’Alonzo et al., 2014; Collins et al., 2016). In the original rubber hand paradigm, participants sat with their left arm resting on a table, hidden behind a screen, and were asked to fixate on an anatomically congruent rubber hand (Botvinick and Cohen, 1998). The experimenter simultaneously stroked the participant’s hand and the fake hand with two paintbrushes and then quantified the effect on affective embodiment by having participants respond to a questionnaire comprising nine statements about perceptual effects, rated on a seven-point Likert scale ranging from ‘agree strongly’ to ‘disagree strongly.’ Statements included items such as “It felt as if the rubber hand were my hand” and “It seemed as if I were feeling the touch of the paintbrush in the location where I saw the rubber hand touched” (Botvinick and Cohen, 1998). After 10–15 s of synchronous visuo-tactile stimulation (less than 300 ms of temporal stroking discrepancy), participants reported significantly stronger agreement with such statements than after asynchronous stimulation, confirming a vivid sense of ownership over the rubber hand (Ehrsson et al., 2004, 2005; Lloyd, 2007; Shimada et al., 2009).
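
To make the typical questionnaire analysis concrete, the sketch below (Python; all ratings are hypothetical, not data from Botvinick and Cohen, 1998) compares agreement with an ownership statement after synchronous versus asynchronous stroking using a Wilcoxon signed-rank test, a common choice for ordinal within-subject ratings.

```python
# Illustrative analysis of rubber-hand-illusion questionnaire ratings.
# Hypothetical 7-point Likert scores (1 = disagree strongly, 7 = agree strongly)
# for "It felt as if the rubber hand were my hand", from the same participants
# in the synchronous and asynchronous stroking conditions.
import numpy as np
from scipy.stats import wilcoxon

synchronous = np.array([6, 7, 5, 6, 7, 6, 5, 7, 6, 6])    # hypothetical ratings
asynchronous = np.array([2, 3, 2, 4, 3, 2, 3, 2, 4, 3])   # hypothetical ratings

stat, p = wilcoxon(synchronous, asynchronous)
print(f"median sync = {np.median(synchronous)}, "
      f"median async = {np.median(asynchronous)}, "
      f"Wilcoxon statistic = {stat}, p = {p:.4f}")
```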

In addition to eliciting agreement with ownership statements like those above, there were also changes in the feeling of hand position (proprioception), which occurred between 10 and 60 s (Ehrsson et al., 2004, 2005; Lloyd, 2007). When asked to point to where their real hand was located, participants tended to locate it closer to the non-bodily object or rubber hand, that is, participants exhibited proprioceptive drift (Botvinick and Cohen, 1998; Ehrsson et al., 2004, 2005; Longo et al., 2009; Kalckert and Ehrsson, 2014). The idea is that “proprioception drifts rapidly in the absence of vision, and in the [rubber hand illusion] set-up this results in overwriting the proprioceptive location information of one’s own hand with the visual location information of the rubber hand” (Kammers et al., 2009b, p. 205).
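
Proprioceptive drift itself is a simple difference score. The following minimal sketch (hypothetical numbers; positions assumed to be measured in centimetres along the axis separating the real and rubber hands) shows the usual computation: judged hand position after induction minus before induction, signed so that positive values indicate displacement toward the rubber hand.

```python
import numpy as np

# Judged position of the hidden real hand along the lateral axis, in cm,
# with 0 = the hand's true position and positive values toward the rubber hand.
# All values are hypothetical.
pre_induction = np.array([0.4, -0.2, 0.1, 0.3, -0.1])    # before stroking
post_induction = np.array([2.1, 1.5, 2.8, 1.9, 2.4])     # after synchronous stroking

proprioceptive_drift = post_induction - pre_induction    # positive = toward the rubber hand
print(f"mean drift = {proprioceptive_drift.mean():.2f} cm "
      f"(SD = {proprioceptive_drift.std(ddof=1):.2f} cm)")
```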

Proprioceptive drift is complemented by physiological measurements on the real and fake hands. One study found a decrease in the real hand’s skin temperature and a touch-dulling effect (Moseley et al., 2008), although the result was not replicated in other studies (Rohde et al., 2013; De Haan et al., 2017). Also, the effects of somatosensory inputs to the real hand are reduced, leading to a lower-intensity sensation (Folegatti et al., 2009; Zopf et al., 2011b; Kilteni and Ehrsson, 2017). Further, TMS-elicited motor evoked potentials are reduced, suggesting the motor system is less activated toward the real hand’s muscles (Della Gatta et al., 2016). Threatening the fake hand also elicits strong cortical startle responses (Ehrsson et al., 2007; Gentile et al., 2013).

Some studies show that proprioceptive drift can be experienced while the participant’s real hand maintains normal kinematics, that is, although the conscious sense of hand location is affected, reaching movements are not (Kammers et al., 2009a). According to these studies, even with perceptual or body image changes (i.e., a rubber hand instead of the real hand), the body schema seems to be left unaffected by the drift. This may show that there are different mechanisms for moving the body and for judging bodily properties, or a dissociation between the body image and the body schema. It might also suggest that vision of a body part does not override the elements of proprioception that maintain kinematic functions. Further, this points to a major difference between perceptual embodiment, which consists in object representation within the body image, and motor embodiment, which consists in object representation within the body schema (de Vignemont and Farnè, 2010). While the location of the rubber hand is perceptually embodied, i.e., it is processed similarly to a body part for perceptual tasks, it is not motorically embodied. However, some rubber hand experiments have found decreased motor performance with the real hand (Heed et al., 2011), which seems to entail that the motor system and perceptual judgments are both affected and leaves the question of the relationship between body image and body schema still open.

An interesting aspect of studying proprioception with the rubber hand illusion is the possibility of using neurophysiological evidence to enrich behavioral evidence and subjective reports in order to address open questions. For instance, some studies have used evoked potentials (Zeller et al., 2011, 2015) or functional imaging (Limanowski and Blankenburg, 2015) to find out how the aforementioned conflict between vision and proprioception is resolved. For example, using EEG, researchers have found an amplitude reduction of early evoked potential components over the contralateral somatosensory cortex during the rubber hand experiment (Zeller et al., 2015), suggesting reduced connectivity in that cortex (Zeller et al., 2016). Also, not only do schizophrenic patients show a stronger and faster (five times faster than normal control subjects) onset of the rubber hand illusion, but they also show long-latency somatosensory evoked responses (Peled et al., 2000). In a different example, experimenters have explored the role of the premotor cortex and intraparietal sulcus in the sense of ownership, also using electroencephalography (Rao and Kayser, 2017). Further, the neural mechanisms underlying the illusion, i.e., the multisensory premotor and intraparietal areas, have been observed with fMRI in both healthy participants and upper limb amputees (Ehrsson et al., 2004, 2005; Bekrater-Bodmann et al., 2012; Brozzoli et al., 2012; Schmalzl et al., 2013).

Given these results, the rubber hand illusion illustrates that a non-bodily object can be incorporated into both the body image and the body schema. The brain appears to have modified the body image, possibly causing the proprioceptive drift, so that the participant experiences limb ownership over the rubber hand (Tsakiris and Haggard, 2005). This directly affects the body (motor) schema in that it now represents the real hand as having acquired the position of the fake one (Botvinick and Cohen, 1998; Ehrsson et al., 2004, 2005). Due to the need for a stable body image, the real hand becomes disembodied and the non-bodily object becomes part of the body image. This results in the body (somatosensory) schema being modified, since touch is often reported to be felt in the rubber hand (Botvinick and Cohen, 1998; Durgin et al., 2007). The model proposed by Botvinick and Cohen (1998) understands this phenomenon in terms of the malleability of the body representation caused by multisensory processing.

The extent to which non-bodily objects are incorporated into the body image and the body schema can be further measured (Armel and Ramachandran, 2003). For instance, in one study researchers placed a Band-Aid on the participants’ real hand and another on a table. Researchers then stroked the participants’ hidden real hand while stroking the table at the same time (Armel and Ramachandran, 2003). To quantify the effect of affective embodiment, researchers used questionnaires and skin conductance responses, an objective measure of changes to skin electrical conductance that occur from the anticipation of pain (Armel and Ramachandran, 2003).3 The researchers found that participants had strong skin conductance responses when the table was threatened, i.e., when the table’s Band-Aid was partly pulled off, corroborating the participants’ questionnaire responses that they felt the table was their hand (Armel and Ramachandran, 2003). Also, participants “frequently reported that the table illusion was vivid when the touch was received through a common covering—the band-aids—but weak in its absence” (Armel and Ramachandran, 2003, p. 1505). This suggests that consistency in the seen and felt touch is critical for body ownership. Neuropsychological evidence also confirms that the body image and body schema can be affected by inanimate objects. In one case, a brain-lesioned patient did not experience a sense of ownership toward their arm, their hand, or the wedding ring on that hand. However, when the ring was removed from the hand, the patient viewed it as their own (Aglioti et al., 1996). Such findings indicate the body image can be affected by higher cognitive processes.
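
As an illustration of how such skin conductance responses are typically quantified, the minimal sketch below uses hypothetical amplitude values (not Armel and Ramachandran’s data) to compare responses to a threat against the embodied object with responses to a neutral control event.

```python
import numpy as np
from scipy.stats import ttest_rel

# Hypothetical SCR amplitudes (microsiemens), computed per trial as the peak
# conductance in a short window after the event minus a pre-event baseline.
scr_threat = np.array([0.82, 0.65, 1.10, 0.74, 0.90, 0.55])   # Band-Aid pulled from the table
scr_control = np.array([0.21, 0.18, 0.30, 0.25, 0.15, 0.22])  # neutral control event

t, p = ttest_rel(scr_threat, scr_control)
print(f"mean threat SCR = {scr_threat.mean():.2f} uS, "
      f"mean control SCR = {scr_control.mean():.2f} uS, "
      f"t = {t:.2f}, p = {p:.4f}")
```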

In slight opposition to Botvinick and Cohen’s (1998) model, Armel and Ramachandran (2003) proposed that the previous results were due to Bayesian perceptual learning, based on the strong likelihood that the synchronous stroking of the real hand and the table was caused by a single multisensory event rather than two different events. The brain’s statistical correlations favor vision over touch (Armel and Ramachandran, 2003), resulting in the viewed table becoming part of the body image. However, this proposal has been criticized for its focus on the likelihood of sensory data, and for failing to explain why synchronous stimulation does not always lead to a sense of ownership (Tsakiris and Haggard, 2005; Tsakiris et al., 2009; Tsakiris, 2010; Zopf et al., 2010). For instance, studies have found that when a hidden real hand and the visible checkerboard floor of the experimental box were brushed synchronously, the subjective reports indicated a weak or non-existent illusion (Zopf et al., 2010). Notably, the self-reports and the skin conductance responses were much lower when a table was used, suggesting a role for top–down mechanisms in the sense of ownership (Armel and Ramachandran, 2003).

Similarly, the sense of ownership was inhibited when a wooden block was used in place of a hand (Tsakiris et al., 2009) and when participants saw a smaller rubber hand (Pavani and Zampini, 2007). The illusion is also disrupted if the hand is seen as wooden sticks (Tsakiris and Haggard, 2005) or wooden slabs (Guterstam et al., 2013), and if participants see their full body as a cuboid with no limbs (Lenggenhager et al., 2007). Additionally, proprioceptive drift might not be explained by statistical correlations, since the rubber hand’s position has also been found to drift toward the real hand, to a location between both hands (visual drift) (Erro et al., 2018).

A different model to explain the rubber hand experiment’s results is the neurocognitive model, which emphasizes roles for both multisensory processing and higher cognitive functions (Tsakiris, 2010). Under this model, the rubber hand illusion results from bottom–up processing of multisensory inputs and top–down processing of stored representations of one’s hands. The sense of ownership is due to three critical comparisons. First, the object’s visual form and the pre-existing reference body model must match (Tsakiris and Haggard, 2005; Costantini and Haggard, 2007). The “more the viewed object matches the structural appearance of the body part’s form, the stronger the experience of the body-ownership will be” (Tsakiris, 2010, p. 13). This is supported by numerous studies (Ehrsson et al., 2005; Tsakiris and Haggard, 2005; Haans et al., 2008; Tsakiris et al., 2009). The sense of ownership is not affected by skin color or texture, suggesting the body model is different from the body image. Second, the current body schema state and the object’s anatomical, structural, and postural features must match (Tsakiris, 2010). The sense of ownership is elicited if the object’s posture and the body part’s posture match, but is reduced by postural and anatomical discrepancies (Ehrsson et al., 2004; Tsakiris and Haggard, 2005; Costantini and Haggard, 2007). For instance, placing the rubber hand at 180 or 90 degrees relative to the real hand reduces the sense of ownership (Ehrsson et al., 2004; Tsakiris and Haggard, 2005; Kalckert and Ehrsson, 2014). Third, the seen and felt touches of the brush must match for the sense of ownership to be elicited (Tsakiris, 2010). Notably, a more recently discovered constraint is that exteroceptive signals must be integrated with interoceptive signals, such as those from the cardiovascular system (Tsakiris, 2017). These comparisons suggest that for objects to be embodied into the body image, they need to be compatible with our unconscious body schema, such as through postural congruency, but not with our current body image.

And yet, Guterstam et al. (2013) observed that participants embodied a volume of empty space, while others found that participants embodied a rubber hand 3 cm larger than their real hand (Pavani and Zampini, 2007) or a virtual arm three times as long as their real arm (Kilteni et al., 2012). Also, in one study synchronous visuo-tactile stimulation of one real and two rubber hands led participants to feel touch on, and ownership toward, two right-handed rubber hands, although less vividly (Ehrsson, 2009). Other studies have found a sense of ownership toward two left hands (Newport et al., 2009) and four hands (Chen et al., 2018).

Rubber hand experiments have also elicited a sense of ownership with visuo-thermal stimulus patterns (Trojan et al., 2018), without vision (Ehrsson et al., 2005; Petkova et al., 2012), and without touch by using a laser pointer (Durgin et al., 2007). Thus, the phenomena are clearly more complex than can be accounted for by simple visuo-tactile integration, and researchers have accordingly moved beyond simple tactile stimulation.

Studies have also provided participants the ability to move the fake limb (Tsakiris et al., 2006; Dummer et al., 2009; Sanchez-Vives et al., 2010) to generate affective embodiment, while others have investigated motor embodiment through questionnaires (Kalckert and Ehrsson, 2012, 2014; Jenkinson and Preston, 2015). In one study, participants could move the rubber hand’s index finger (Kalckert and Ehrsson, 2012). Researchers manipulated the synchrony between the real and fake hand’s finger movements, the mode of movement (passive vs. active), and the rubber hand’s positioning. They found that affective and motor embodiment are dissociable: asynchrony eliminated both, but passive movements abolished only motor embodiment, while incongruent positioning reduced only affective embodiment. Another study also found support for a double dissociation between motor and affective embodiment (Marotta et al., 2017).

Because of these complexities and variety of results, researchers have also begun to explore embodiment in virtual reality, which allows for a greater variety of experimental setups than the simple rubber hand device permits. It is to a review of that literature that we now turn.

Embodying Objects in Virtual Reality

Virtual reality technology creates an interactive environment for its users by way of diverse devices such as a head-mounted display (HMD), head tracking, real-time motion capture, tactile feedback, and audio. To study object embodiment in virtual reality, researchers have used limb illusions (Slater et al., 2008), full body illusions using avatars (Slater et al., 2010), or combined virtual and non-virtual reality conditions (Ijsselsteijn et al., 2006). In one virtual rubber hand experiment set up similarly to Botvinick and Cohen’s (1998), questionnaire responses and proprioceptive drift showed that participants had a sense of ownership toward the virtual hand (Slater et al., 2008).

In another study, spinal cord injury patients received tactile back stimulation while viewing virtual legs being touched through an HMD (Pozeg et al., 2017). Studies have found that there is cortical reorganization after spinal cord injuries and that the lower back is connected with the leg representations (Wrigley et al., 2009). For this reason, the researchers manipulated the synchrony between the stroking of the virtual legs and of the participant’s lower or upper back. The results showed that these patients experienced weaker leg ownership than healthy control participants, with the time since injury also negatively correlated with leg ownership. This suggests that these patients less readily integrate the available visual and tactile information to experience leg ownership. There was also no difference between the lower and upper back conditions, suggesting that, instead of a reorganization of the primary somatosensory cortex (S1), other leg representations may be involved, or that receptive fields on the back in S1 are large enough that the upper and lower back conditions engaged similar S1 locations.

In a different study, researchers used an unmediated condition, a virtual reality condition where a video-projection of a rubber hand and its visuo-tactile stimulation was on a flat tabletop surface, and a mixed reality condition, where the fake hand was projected but had unmediated visuo-tactile stimulation (Ijsselsteijn et al., 2006). Self-reports and proprioceptive drift showed that the non-virtual reality condition had the strongest sense of ownership. There was also a stronger sense of ownership in virtual reality than mixed reality but no difference in drift. The existence of differences between the virtual reality and non-virtual reality conditions contradicts a Bayesian learning explanation (Ijsselsteijn et al., 2006). Instead, these differences seem to support the neurocognitive model in that there is a role for top–down mechanisms that specify requirements for an object, since the physical objects would be experienced as more real. Improvements in VR technology may eventually erase this effect.

Virtual reality experiments on object embodiment typically use objective measures, such as defensive motor responses to a threat to a limb (Kilteni et al., 2012), electroencephalography via event-related potentials (González-Franco et al., 2014), temperature changes in the hand (Hohwy and Paton, 2010), and a cross-modal congruency task (Pavani et al., 2000; Zopf et al., 2011a), discussed in Section “Tools and Body Representations: Peri-Personal Space,” below.

For example, in one study participants had to fixate on the virtual hand of the collocated avatar that had been placed on a desk (González-Franco et al., 2014). Event-related potentials were recorded when a knife attacked the virtual hand and when it struck the virtual table. The results showed that similar to non-virtual reality experiments, participants experienced a sense of ownership toward the virtual hand and body, suggesting that the body image has been manipulated to include the avatar and its virtual hand. Moreover, the strong sense of ownership found in the questionnaires correlated with larger P450 amplitudes. Also, when the virtual hand was threatened the motor cortex had Mu rhythm event-related desynchronization and there was more readiness potential (C3–C4) negativity.

Virtual reality paradigms are also used to study object embodiment for non-bodily objects. In one study, researchers synchronously tapped the participant’s forearm and collocated virtual forearm (Hohwy and Paton, 2010). After 30 s, the virtual forearm was replaced by a virtual cardboard box that was then synchronously tapped with the real forearm. The results showed that some participants believed they felt touch on the virtual cardboard box. However, the cardboard box illusion was not elicited without the preceding virtual hand induction, which may explain Armel and Ramachandran’s (2003) results since participants were exposed to the rubber hand illusion prior to the table touch (Hohwy and Paton, 2010).

Virtual reality paradigms can further expand on how flexible the body image and schema are and what constraints there are on embodying an object. Virtual reality can easily stretch body parts past their normal size, such as a finger to double its length (Newport and Preston, 2010), or present a collocated limb that continuously grows in length and moves with the real limb (Kilteni et al., 2012). Virtual and non-virtual reality studies on object embodiment suggest that “the topographic representation in the primary somatosensory cortex (S1) reflects the perceived rather than the physical aspects of peripheral stimulation” (Schaefer et al., 2007, p. 700). In one non-virtual reality study, vision and somatic sensations were manipulated to elicit an illusion of an elongated arm (Schaefer et al., 2007). An artificial hand and arm were attached over the subject’s real one, extending it by about 20 cm. The researchers found a sense of ownership over the lengthened limb by way of a questionnaire and neuromagnetic source imaging, which “revealed a corresponding modulation of S1 to the extent of feeling the arm elongated: the more the subjects felt the arm elongated, the more the cortical distance between D1 and D5 [digit 1 and digit 5 on the left hand] decreased” (Schaefer et al., 2007, p. 703). It was speculated that the perception of a longer arm produced a larger cortical arm representation, and hence participants felt a longer arm. In virtual reality, however, there do not seem to be any investigations into the amount of cortical reorganization for different virtual limb lengths (Kilteni et al., 2012). Nonetheless, there seems to be a correlation between perceiving one’s body size and shape and modulations in the cortex (Schaefer et al., 2007; Normand et al., 2011; Kilteni et al., 2012; Banakou et al., 2013; Won et al., 2015).

If there is a top–down perceptual body model and virtual reality experiments give us some insight into the embodiment of objects, then the specific bodily features that this model encodes are unclear. Researchers have found that participants can embody a virtual arm that is triple in length (Kilteni et al., 2012), an avatar of a different race (Peck et al., 2013), and an avatar with a differently shaped body (Normand et al., 2011; Won et al., 2015). Adult participants have also embodied a child body (Banakou et al., 2013), and have felt identified with realistic androids (Cooney et al., 2012) and unrealistic humanoid robots (Aymerich-Franch et al., 2015). Such virtual reality studies appear to contradict other studies that used a wooden stick (Tsakiris and Haggard, 2005), a rubber sheet (Haans et al., 2008), or a wooden slab (Guterstam et al., 2013) and found a reduced or eliminated sense of ownership. While a smaller-sized hand was shown to inhibit embodiment (Pavani and Zampini, 2007), other studies showed the embodiment of virtual dolls ranging from 30 to 400 cm (Van der Hoort et al., 2011) and of full-body mannequins (Petkova and Ehrsson, 2008; Petkova et al., 2011).

To explain these results, functionality may be the key feature of the body model (Aymerich-Franch and Ganesh, 2016). The brain perceives an object as one’s body part or whole body if the object’s physical properties are sufficient to allow certain actions (Aymerich-Franch and Ganesh, 2016). For instance, the rubber hand and full body illusions are possible even when rubber hands are of a different color (Longo et al., 2009) or texture (Haans et al., 2008), and when the body is different in size, gender, or age, since these features do not affect proper functioning (Petkova and Ehrsson, 2008; Van der Hoort et al., 2011; Maister et al., 2015). Multiple rubber hands (Ehrsson, 2009), longer arms (Kilteni et al., 2012), and larger rubber hands (Pavani and Zampini, 2007) maintain functionality, while a wooden stick (Tsakiris and Haggard, 2005), a wooden slab (Guterstam et al., 2013), and smaller rubber hands (Pavani and Zampini, 2007) cannot adequately perform tasks.

The variety of embodiment effects, as summarized by this review, seems to suggest that there is not one physical body representation. Instead, it could be the case that these observed embodiment effects are due to the interaction of multiple body representations. Indeed, there is no consensus yet on how many different bodily representations there are, what their characteristics are, and whether and to what degree they can be truly dissociated.

Embodying Tools

Objects are those things that are nameable, identifiable, stable, and can persist through time, such as pencils and cars. Tools are a specific kind of object employed to alter or interact with other objects. Three categories of tools are physical-interaction tools, which interact with the environment; pointing tools, such as a laser; and detached tools, such as a computer mouse, which interacts with objects via an interface (Holmes and Spence, 2005). Tools differ from other objects in that they are associated with specific manipulation actions, such as grasping a hammer to manipulate a nail, are used to reach for objects, and are used to sense one’s surroundings. This suggests a major role for the sensorimotor system in tool embodiment. They also differ from other objects in that individuals typically lack a sense of ownership toward a tool. For Botvinick (2004), “the feeling of ownership that we have for our bodies does not extend to, for example, the fork we use at dinner” (p. 783). As a result, tool embodiment studies do not typically use affective measures, such as using a threat to measure physiological responses, since ordinary tool use can already involve dangerous situations, such as stirring hot water.

Tool embodiment research focuses on the spatial modulation of multisensory integration following tool use, which involves spatial embodiment, and on the plasticity of body representations following tool use, which involves motor embodiment. The incorporation of a tool into the body and the expansion of peri-personal space are similar in that both seem to be due to changes in the body schema (Iriki et al., 1996; Maravita and Iriki, 2004). But they are also separate in that the body schema requires somatic sensation, while the multisensory peri-personal space is based on vision and audition (Cardinali et al., 2009b).

Tools and Body Representations: Peri-Personal Space

Peri-personal space is located directly around the body and involves multisensory integration in the frontal and parietal lobes (Làdavas and Serino, 2008; Serino, 2019). Evidence suggests that peri-personal space representations are multimodal, since they respond to visual information (Longo and Lourenco, 2006) and visuo-tactile information (Noel et al., 2015). Also, bodily dimensions, such as body part size, change the size of peri-personal space. For instance, individuals with longer arms tend to have a larger peri-personal space (Longo and Lourenco, 2007).

Peri-personal space has been divided into near and far space, or the space within and beyond reach (Berti and Frassinetti, 2000; Witt et al., 2005). Here, reachability may be a perceptual metric such that everything that is reachable is perceived to be in action space (Witt et al., 2005). Tool use also seems to alter the perception of the environment, bringing objects perceived as farther away closer and into the action space (Witt et al., 2005), although others have found that near peri-personal space appears to grade off from the body rather than ending abruptly at arm’s reach (Longo and Lourenco, 2006, 2007). Studies with monkeys have shown that near space and far space are coded differently in the brain (Colby et al., 1996). Also, one PET study showed that during tool use the monkey’s brain showed an increase in the activity and a reorganization of the reach representation in the intraparietal region, basal ganglia, premotor cortex, and cerebellum (Obayashi et al., 2001). These results are further supported by human behavioral studies (Berti and Frassinetti, 2000; Maravita et al., 2003; Farnè et al., 2007), PET studies (Weiss et al., 2000), and TMS studies (Lane et al., 2011). Similarly, a PET study on humans showed that during tool use the ipsilateral posterior parietal cortex was activated (Inoue et al., 2001).

To understand what happens to peri-personal space during and after tool use with a non-threatening object, researchers trained macaque monkeys to control a rake via a computer screen to reach food placed in far space (Iriki et al., 1996). The rake extended the monkeys’ reach, suggesting that some visual neurons code reach and adapt to changes in reachability brought about by tool use. Recording from premotor and posterior parietal cortices, the researchers found that tool use expanded the receptive fields of bimodal neurons to include the tool’s entire length. This enlargement of peri-personal space has been interpreted as a remapping of farther away objects as nearer ones (Di Pellegrino et al., 1997; Farnè and Làdavas, 2001). Additionally, when the macaque monkeys passively held the rake, the peri-personal space shrank back to its pre-tool size. These results suggest that deliberate action is needed to expand peri-personal space (Iriki et al., 1996; Farnè et al., 2005).

In the neuropsychological literature, peri-personal space is operationalized in terms of changes in patients’ perceptual performance as stimuli are moved farther away. These changes are grounded in the activity of multisensory neurons, which “decreases as the distance between visual stimuli and tactile stimuli increases” (de Vignemont, 2011, p. 85). Brain-injured patients can experience extinction, in which contralesional events are “perceived in isolation, yet are missed when presented concurrently with competing events on the ipsilesional side” (Kennett et al., 2010, p. 15). Researchers have used cross-modal extinction, where extinction arises cross-modally, to study these patients’ peri-personal space. Results have shown that the location of a visual stimulus, such as when it is close to a body part, can interfere with patients’ tactile performance (Farnè and Làdavas, 2000; Maravita et al., 2001, 2002; Farnè et al., 2005; Bonifazi et al., 2007). When these patients see a visual stimulus near the ipsilesional hand, it can extinguish a touch delivered at the same time to the contralesional hand (Di Pellegrino et al., 1997; Costantini et al., 2007). Typically, as the visual stimulus moves away from the hand, or vice versa, tactile detection on the other hand improves (Làdavas, 2002).

Researchers have also studied the effect of tool use on peri-personal space in patients with tactile extinction (Farnè and Làdavas, 2000). They found that short periods of using a rake with the ipsilesional hand to reach for objects (red fish) in far space increased extinction of contralesional tactile stimuli (a touch on the hand) when visual stimuli (a light flash) were presented close to the tool tip. This suggests that peri-personal space expanded to include the rake’s entire axis and that the rake had become part of the body. Also, similar to Iriki et al. (1996), the peri-personal space returned to its pre-tool-use size after 5–10 min of passively holding the rake. Further, free-hand pointing movements toward the fish produced results comparable to the pre-tool condition. This suggests that mere motor activity is not sufficient to expand peri-hand space (Farnè and Làdavas, 2000).
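
The logic of such extinction studies can be summarized as a comparison of contralesional detection rates under bilateral stimulation across phases. The sketch below is purely illustrative: the condition labels follow the study design described above, but the detection rates are hypothetical, not Farnè and Làdavas’ data.

```python
# Hypothetical proportions of contralesional touches detected on bilateral
# (visual + tactile) trials across the three phases of the paradigm.
# Lower detection = stronger cross-modal extinction.
detection_rate = {
    "pre_tool_use": 0.62,
    "post_active_tool_use": 0.38,   # stronger extinction: peri-hand space now includes the rake tip
    "post_passive_holding": 0.60,   # extinction returns toward baseline
}

for condition, rate in detection_rate.items():
    print(f"{condition:>22}: {rate:.0%} of contralesional touches detected")
```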

In a follow-up study, Farnè et al. (2005) used the same task to investigate whether the absolute or the functional tool length modulates peri-personal space. They used a long (60 cm) wooden rake, a short (30 cm) wooden rake, and a hybrid long (60 cm) wooden rake whose functional part was attached at 30 cm (Farnè et al., 2005). They found that after use of the hybrid (60 cm) tool, cross-modal extinction was comparable to that following use of the 30 cm short tool. This suggests that the tool’s functional part is directly related to the expansion of peri-personal space. Indeed, “the main advantage provided by the extension of the peri-hand area, whereby vision and touch are integrated, seems to be that of allocating multisensory processing where the goal of the action is” (Làdavas and Serino, 2008, pp. 1106–1107). Similarly, Tomasino et al. (2012) found an increase in fMRI activity in the extrastriate body area when a joystick was used in a more compatible environment in near space than when a less appropriate tool (extended pliers) was used in a less congruent environment in near space. This finding suggests that the body’s neural representation is adapted in a functional manner, depending on tool compatibility.

Another method to assess peri-personal space, the cross-modal congruency task, tests “how strongly visual stimuli affect the processing of simultaneously presented tactile stimuli” (Holmes, 2012, p. 273). Similar to cross-modal extinction, the effect is larger when spatially incongruent distractors are near the tactually stimulated hand and is reduced when they are presented close to the opposite hand. For example, in one study participants judged whether a tactile vibration was delivered to the thumb or index finger of either hand while holding two golf clubs. Two visual distractor lights were placed at the far end of each golf club (Maravita et al., 2002). Each trial consisted of a vibration at one of four possible locations (finger or thumb on either hand) accompanied by one distractor light. In the uncrossed condition, visual distractors interfered with the tactile discrimination task on the same side of the tool. When the tools were crossed, the visual distractors had stronger interference on the hand placed in the opposite space. As in other studies, passively holding the tools did not have this effect.
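
The cross-modal congruency effect (CCE) is conventionally computed as the reaction-time (and/or error) cost of incongruent relative to congruent distractors for each distractor location. The sketch below shows that computation with hypothetical reaction times, arranged to mirror the qualitative pattern described above; the numbers are not from Maravita et al. (2002).

```python
# Hypothetical mean reaction times (ms) for tactile elevation judgements,
# split by tool arrangement, distractor location, and congruency.
rt = {
    # (tools, distractor_location): {"congruent": ms, "incongruent": ms}
    ("uncrossed", "same_side_tip"):     {"congruent": 520, "incongruent": 585},
    ("uncrossed", "opposite_side_tip"): {"congruent": 525, "incongruent": 545},
    ("crossed",   "same_side_tip"):     {"congruent": 530, "incongruent": 548},
    ("crossed",   "opposite_side_tip"): {"congruent": 528, "incongruent": 590},
}

for (tools, location), cond in rt.items():
    cce = cond["incongruent"] - cond["congruent"]   # cross-modal congruency effect
    print(f"{tools:>9} tools, distractor at {location:<17}: CCE = {cce} ms")
```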

The studies reviewed suggest that tool use shows characteristics of tool embodiment, such that peri-personal space extension depends on active (Farnè and Làdavas, 2000), goal-directed (Farnè and Làdavas, 2000), and functionally effective (Farnè et al., 2005) tool interaction. Other studies also support the claim that during and after tool use, peri-personal space is enlarged. For example, researchers have found that using a computer mouse enlarges peri-personal hand space, which may also come to include the screen monitor (Bassolino et al., 2010). In another study, peri-personal space was expanded when blind individuals merely held their cane without using it (Serino et al., 2007). This suggests that “blind people, who continuously use the cane to integrate auditory and tactile information in far space, in order to compensate for the lack of visual information, developed a new, extended representation of auditory peri-hand space, which is selectively activated when holding the cane” (Serino et al., 2007, p. 1108). Long-term regular use of a tool may thus create a durable extension of peri-personal space. Further, other studies have found that tool use imagery triggers similar effects on peri-personal space (Baccarini et al., 2014). Notably, peri-personal space can also contract, such as when weights impair arm movement (Lourenco and Longo, 2009) and after limb loss (Canzoneri et al., 2013).

However, an alternative explanation for some of these results is that “tools act as spatial attentional and motor cues” (Holmes et al., 2007, p. 466). To investigate this, Holmes et al. (2004) modified the cross-modal congruency task by placing the visual distractors near the hand, at the tools’ middle, and far from the hand at the tools’ tips. Results showed that the effects of tool use were strongest at the tools’ tips and weak at the tools’ middle. They also conducted an fMRI study showing a shift in participants’ spatial attention to the functional part of the tool. They asked participants to ignore visual distractors but to attend and respond to vibrotactile targets that were delivered at the tool’s tip and felt in the hand. The results were indicative of a shift in spatial attention, since visual distractors at the tool’s tip enhanced the BOLD response in retinotopic portions of the occipital visual cortex and decreased the BOLD response in the ipsilateral visual field (Holmes et al., 2008).

Another important research area explores the idea that tool use can change the body schema. Some researchers have indicated that the peri-personal space expansion occurs because the tool is incorporated into the body schema, i.e., into the sensory-motor map of the limb holding the tool (Iriki et al., 1996; Farnè and Làdavas, 2000; Maravita and Iriki, 2004; Serino et al., 2007). Additionally, some studies have shown that the tactile signals felt in the hand when a held tool comes into contact with an object are referred directly to the tip of the tool (Yamamoto and Kitazawa, 2001; Maravita et al., 2002; Yamamoto et al., 2005; Collins et al., 2008). This result suggests that somatosensory integration is not required to use a tool. Hammering a nail with a hammer tends to result in the feeling of touch at the part of the hammer touching the nail, and not at the hand holding the tool. This seems to suggest that the body (somatosensory) schema has been changed and now includes the tool’s functional part. Tool use then not only seems to have effects on peri-personal space coding but also seems to change other aspects of body representations, a topic to which we now turn.

Tools and Body Representations: Body Schema

The peri-personal space and body schema are separate entities that are highly flexible and seem to be connected to the motor system.4 Some studies have found that the modulation of the body schema during tool use involves increasing the length of the arm’s representation and requires goals and motor programs (Cardinali et al., 2009b, 2012; Sposito et al., 2012; Baccarini et al., 2014; Garbarini et al., 2015). For example, Cardinali et al. (2011) found that tool use involving a 40 cm long mechanical grabber resulted in a free hand kinematic pattern compatible with having a longer arm, while there were no changes in the hand-related or grip component. This suggests that for a time after tool use the body schema was still modified as if participants were still using the mechanical grabber and had a longer arm.
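
The kinematic signature of “having a longer arm” is typically read off the transport component of free-hand movements recorded before and after tool use. The sketch below is a minimal illustration of that pre/post comparison, assuming peak wrist velocity as the dependent measure; the values, and the direction of the difference, are hypothetical rather than Cardinali and colleagues’ data.

```python
import numpy as np
from scipy.stats import ttest_rel

# Hypothetical peak wrist velocities (mm/s) of free-hand reaches made by the
# same participants before and after training with the mechanical grabber.
peak_velocity_pre = np.array([1120, 1080, 1150, 1095, 1130, 1070])
peak_velocity_post = np.array([1040, 1010, 1085, 1020, 1060, 1000])

t, p = ttest_rel(peak_velocity_post, peak_velocity_pre)
# In this illustration, lower (and typically later) kinematic peaks after tool
# use stand in for the pattern described as consistent with a longer arm.
print(f"mean change = {np.mean(peak_velocity_post - peak_velocity_pre):.1f} mm/s, "
      f"t = {t:.2f}, p = {p:.4f}")
```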

In the same study, Cardinali et al. (2011) investigated whether tool use can elicit a functional update of the body schema without affecting the body image, by way of a motor task and a perceptual task. For the motor task, blindfolded participants were asked to point with their left index fingertip to a specific location on their right arm that had been touched by the experimenter, i.e., the finger, wrist, or elbow. For the perceptual task, participants verbally reported where the experimenter had touched them. They found that after tool use participants perceived a longer arm, since they localized touches to their elbow and middle fingertip as farther apart. In contrast, there was no observed change when localizing named body parts, suggesting tool use may affect only the body schema.

Other studies have used a direct behavioral task, e.g., an arm bisection task, to investigate whether tool use might alter the metric representation of limbs (Sposito et al., 2012). Researchers asked healthy participants to estimate the subjective midpoint of their own forearm before and after a training phase with long, functionally relevant (60 cm) tools and short, functionally irrelevant (20 cm) tools. They found that participants indicated a more distal midpoint, thus exhibiting an increased representation of the arm’s length, but only after training with the long, functionally relevant tool.
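
In the forearm bisection task the dependent measure reduces to the distance of the judged midpoint from the elbow, compared before and after training; a distal shift (toward the wrist and hand) indicates a longer represented arm. A minimal sketch with hypothetical measurements:

```python
import numpy as np

# Hypothetical judged midpoints, measured as distance from the elbow in cm,
# for a forearm whose true elbow-to-wrist length is 26 cm (true midpoint = 13 cm).
forearm_length = 26.0
pre_training = np.array([12.8, 13.1, 12.9, 13.0, 12.7])    # before tool training
post_long_tool = np.array([14.0, 14.3, 13.9, 14.2, 13.8])  # after training with the 60 cm tool

shift = post_long_tool - pre_training   # positive = distal shift, toward the hand
print(f"mean distal shift = {shift.mean():.2f} cm "
      f"({100 * shift.mean() / forearm_length:.1f}% of forearm length)")
```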

Similarly, other studies have used this task with brain-damaged hemiplegic patients exhibiting pathological embodiment (Garbarini et al., 2015). Patients were asked to estimate the midpoint of their paralyzed forearm before and after a training phase in which an experimenter, whose arm was either aligned or misaligned with the patient’s shoulder, repeatedly used a tool. In the aligned condition, patients believed that they were using the tool with their paralyzed arm. Indeed, there was a significant modulation of perceived arm length: in the post-training phase, patients located their forearm midpoint closer to the hand, indicating an increased perceived length. No effect occurred when the experimenter was misaligned with the patient during the training phase (Garbarini et al., 2015). These “findings show the existence of a tight link between spatial, motor and bodily representations and provide strong evidence that a pathological sense of body ownership can extend to intentional motor processes and modulate the sensory map of action-related body parts” (Garbarini et al., 2015, p. 402).

Cardinali et al. (2016) also found that using tools (sticks and pliers) that elongate the fingers results in an update of the brain’s hand representation, driven by the tools’ morpho-functional characteristics and their specific sensorimotor constraints. The kinematics of the grasping component of the hand were affected, but the arm representation and reaching kinematics were not (Cardinali et al., 2016). Similarly, Miller et al. (2014) used a perceptual task and found that the hand representation increased in size after use of a tool morphologically similar to the hand, whereas use of an arm-like tool increased the represented length of the arm (Miller et al., 2014). The plasticity of a limb’s representation therefore appears to be constrained by how closely the tool’s morphology resembles that limb.

Studies have also found that peri-personal space and the body schema can be dissociated. One study found that after participants’ right limb had been immobilized for 10 h, its peri-personal space was reduced but the body schema was unaffected, since there was no change in the limb’s perceived length. They also found that the peri-personal space of the overused left limb did not extend, while its perceived length increased (Bassolino et al., 2015). These results suggest that peri-personal space and the body schema depend “on different mechanisms; while [peri-personal space] representation is shaped as a function of the dimension of the acting space, metric characteristics of [body representation] are forged on a complex interplay between visual and sensorimotor information related to the body” (Bassolino et al., 2015, p. 385).

Further, these tool embodiment studies show a role for both somatic sensations and vision in the incorporation of a tool into the body schema. Somatic sensations seem to be necessary and sufficient for tool incorporation into the body schema. In one study with blindfolded participants, Martel et al. (2019) assessed arm length representation at an implicit level by comparing movement kinematics before and after tool use. After tool use, participants’ movement kinematics were modified in a way that suggested an increased arm length representation. Martel et al. (2019) also found that “explicit arm representation seems immune to tool-use when only somatosensation is available. When participants were asked to explicitly estimate their arm length, we observed no effect of tool use” (p. 10). Neuropsychological evidence also supports this: a deafferented patient showed no incorporation of a tool, owing to a lack of proprioception (Cardinali et al., 2016).

Tools and Virtual Reality

As in the case of embodying objects, virtual reality provides a good medium for testing different measures and for investigating how interacting with virtual tools compares to interacting with their physical counterparts. Studying the expansion of peri-personal space and changes to the body schema in virtual reality can provide important insights into how these environments affect our movement planning and execution.

The cross-modal congruency task has also been used in virtual reality to test whether peri-personal space can change with virtual robotic tools (Sengül et al., 2012, 2013). As in Maravita et al.’s (2002) experiment with physical golf clubs, the task was to cross or uncross the virtual golf clubs (in one experiment) or simply to hold the tool interface while the experimenter did the crossing and uncrossing (in the other experiment) (Sengül et al., 2012). The latter experiment aimed to explore whether active use of the tool was important for the spatial modulation of the cross-modal congruency effect. Participants then discriminated vibrotactile stimuli delivered to the thumb and index finger, responding with a foot pedal, while ignoring visual distractors at the virtual tool’s tip. Results showed that virtual-robotic tools can influence multisensory integration in peri-personal space. Moreover, “there was an interaction of vision and touch as reflected in the cross-modal congruency effect (CCE) for virtual robotic tools. Second, it was found that actively crossing the tool resulted in a remapping of peri-personal space, as reflected in a stronger CCE when visual stimuli appeared at a different side than the tactile vibration, at the tip of the tool that was held in the stimulated hand. Third, it was found that this remapping of peripersonal space did not depend on active tool use, as passive crossing of the tools resulted in a change in the CCE side effect” (Sengül et al., 2012, e49473). As in other non-virtual-reality studies, then, crossing the tools resulted in a remapping of peri-personal space at the tool’s tip. However, unlike in some non-virtual-reality studies, active tool use was not required for this remapping.

In the context of virtual reality, another study demonstrated that body metric estimation can be modulated by motor embodiment (D’Angelo et al., 2018). Participants were asked to perform a forearm bisection task before and after a training phase in which they virtually grasped objects on a PC screen with a virtual hand. Researchers used a Leap Motion controller, a hand-tracking device, to synchronize the virtual hand with the participant’s real hand. In the synchronous condition, the virtual hand’s movements were collocated with the participants’ own right-hand movements, while in the asynchronous condition there was a 3 s delay between the real-hand and the virtual-hand movements.
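
An asynchronous condition of this kind can be implemented by buffering the tracked hand samples and replaying them to the virtual hand with a fixed lag. The following is a minimal sketch of that idea only; the tracker interface, sample rate, and class names are our assumptions, not the Leap Motion API or the cited study’s implementation.

```python
# Sketch of a fixed visual delay between a tracked (real) hand and a virtual hand.
# The tracking source and 60 Hz sample rate are assumptions for illustration.
from collections import deque

class DelayedHand:
    """Replay tracked hand samples with a fixed delay (asynchronous condition)."""

    def __init__(self, delay_s=3.0, sample_rate_hz=60):
        self.delay_samples = int(delay_s * sample_rate_hz)
        self.buffer = deque()

    def update(self, tracked_position):
        """Push the newest real-hand sample; return the pose to render on the virtual hand."""
        self.buffer.append(tracked_position)
        if len(self.buffer) <= self.delay_samples:
            return self.buffer[0]      # not enough history yet: hold the first sample
        return self.buffer.popleft()   # a sample ~3 s old now drives the virtual hand

# Asynchronous condition: route tracking data through DelayedHand.
# Synchronous condition: render the tracked position directly.
delayed = DelayedHand()
for frame in range(600):
    virtual_pose = delayed.update((frame * 0.001, 0.0, 0.0))  # placeholder samples
```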

The results seem to confirm that the plastic changes of peri-personal space and the body schema “depend on the experience of controlling the course of events in space through one’s own actions, i.e., the sense of agency” (D’Angelo et al., 2018, p. 1). Participants pointed to their forearm midpoint more distally, consistent with an increased length of the arm representation, only after performing the training phase in the synchronous condition. This suggests that the change in body metric depends on the intention to perform an action being congruent with the movement’s motor output. These findings suggest “that body schema and peri-personal space are affected by the dynamic mapping between intentional body movements and expected consequences in space” (D’Angelo et al., 2018, p. 1).

These studies may be taken as preliminary evidence for virtual tool embodiment. As with object embodiment, virtual reality paradigms have the potential to reveal how flexible the body schema is and what constraints there are on embodying a tool. Virtual reality can easily stretch a tool past its physical size to examine the effects on peri-personal space and the body schema, and it can also be used to find the point at which a stretched tool stops functioning as a tool for a given task, along with the associated changes in the brain.

Radically Embodied Tools and Objects

So far, we have reviewed different ways in which the embodiment of tools and objects is understood as affecting body representations. In the case of both objects and tools, embodiment occurs when they are incorporated either into the body image (consciously) or into the body schema (largely unconsciously). Moreover, we have seen research based on paradigms such as the rubber hand illusion and virtual reality used to assess embodiment of different elements and in different forms. In this section, we sketch the issue from the point of view of radical embodiment. As a caveat, it is important to note that the embodiment of objects and tools from the point of view of radical embodiment is not essentially different from the one already reviewed. At the end of the day, the radical embodiment of objects and tools also consists of taking extra-somatic parts of our environment to be proper parts of our body. The differences with the framework based on body representations come from the very characterization of such embodiment in non-representational terms and from some radical consequences that follow from this move. We will turn to these topics but, first, we succinctly introduce radical embodiment.

A Primer on Radical Embodiment

We use radical embodiment as an umbrella term for a group of theories of perception, action, and cognition inspired by phenomenology (Gallagher and Zahavi, 2007; Käufer and Chemero, 2015), ecological psychology (Gibson, 1966, 1979; Michaels and Carello, 1981; Richardson et al., 2008; Chemero, 2009; Turvey, 2019), and enactivism (Varela et al., 1991; Hutto and Myin, 2013; Di Paolo et al., 2017), as well as for some radical proposals that follow from the hypothesis of the extended mind (Menary, 2010; Kirchhoff and Kiverstein, 2019). Although we are well aware that there are more or less important differences between these approaches (Walter, 2010a, b), we will not take issue with them here because, for the purposes of our study, their similarities outweigh their differences.

The features of radical embodiment relevant for our discussion are its commitment to anti-representationalism and its characterization of cognitive systems as complex, self-organized dynamical systems. Cognitive systems are not taken to be in the business of constructing internal representations of the external environment in order to deal with it. On the contrary, cognitive systems are already embodied, embedded systems whose business is to maintain their internal dynamics in harmony with their dynamical interaction with the environment. According to radical embodiment, such a fundamental relation between cognitive systems and their environments makes the need for internal representations of the environment either superfluous (Brooks, 1990; Chemero, 2009) or, in most cases, unintelligible (Hutto and Myin, 2013). Mental representations are supposed to play an informative role in cognition: they are the elements cognitive systems use to make epistemic contact with their surroundings. Once that contact is granted through embodiment and embeddedness and explained in terms of dynamical interactions, the need for mental representations disappears.

For the topic of this article, the consequences of radical embodiment are fairly straightforward: the embodiment of objects and tools is not a matter of incorporating them into body representations. Although nobody denies that the brain carries information about the states of our bodies and environment (and thus in this minimal way “represents”), for radical embodiment this fact is a trivial background condition for the possibility of adaptive behavior and is not in and of itself explanatory. Thus, the embodiment of objects and tools must be explained in some other terms. The question is which ones. In the following we try to sketch them.

Tools of Radical Embodiment

The philosophical works of Martin Heidegger and Maurice Merleau-Ponty are useful for getting a sense of the notion of tool and object embodiment from the standpoint of radical embodiment. In a famous passage of Being and Time, Heidegger (1962/2001) considers the foundational aspects of the use of tools. According to him, tools are never meaningless chunks of matter that must be incorporated into our meaningful interactions with the environment by some kind of reflective effort. Rather, we typically encounter tools within our purposive engagements with the world. In this sense, tools are typically about us and about many other things in many interrelated ways.

First, a tool—or equipment in the Heideggerian jargon—is always about what it is for and about the kind of practice or task in which it is used. For this reason, a tool is always about the capacities and interests of the user—the Dasein—and about other tools and objects that belong to the same kinds of practices (see, e.g., Heidegger, 1962/2001, p. 97). A pan is for cooking, but for a specific kind of cooking, one that requires oil and fire rather than, say, citrus juices, ají, and salt as in the case of ceviche. As such, a pan is the kind of tool it is by virtue of its role in the specific practice of cooking in which it is used; that is, a pan only makes sense as a pan when it is on a fire, with oil, with some kind of food within it, and so on.

As important as fire, oil, and food for the pan to be a pan are the abilities and interests of the users of pans. Heidegger claims: “The work produced [by tools] refers not only to the ‘toward-which’ of its usability and the ‘whereof’ of which it consists: under simple craft conditions it also has an assignment to the person who is to use it or wear it” (Heidegger, 1962/2001, p. 100). In other words, a tool is by itself always about the user: there is always a meaningful relationship between tools and users just by virtue of being tools and users. A tool is about users in that it is for them to do something with it that is “for-the-sake-of” and determined by the “totality of [their] involvements” (Heidegger, 1962/2001, p. 116). Such a referential aspect of the involvement of users and tools inspires a non-representational understanding of tool use and embodiment. In a famous locution, tools are ready-to-hand for the Dasein (Heidegger, 1962/2001, p. 91 & ff.); namely, users engage with tools and tools are incorporated into the users’ dynamics without requiring any conceptual work from them. Tools are ready to be used and, when that happens, they become tacitly integrated into the user. Such is the Heideggerian way to refer to something equivalent to tool embodiment.

The key aspect of Heidegger’s point regarding tools is that they can become embodied without the need for conscious reflection. Prima facie, such an approach is compatible with a strong form of anti-representationalism: no need for a conscious assessment of tool embodiment means no need for a conscious mental representation into which the tool must be integrated. In this sense, Heidegger seems to avoid the need for a body image to understand tool embodiment. However, his proposal still seems compatible with some notion of body schema, as the body schema is unconscious by definition (Gallagher, 2000). Moreover, the concept of body schema has itself been used within the phenomenological (and Heideggerian) tradition. For example, Merleau-Ponty (1945/2012) uses the notion of schéma corporel, which he also calls the “lived” or “habitual” body, to refer to the understanding of the body that is relevant for capturing human behavior and engagement with the different elements of the environment.

Given the theoretical compatibility between the phenomenological tradition at the roots of radical embodiment and the notion of body schema usually entertained in the representational framework of object and tool embodiment, there must be something else that distinguishes the two frameworks. As the notion of body schema is ambiguous regarding its representational character—i.e., it can be understood in representational terms, as in the representational framework of object and tool embodiment, or in non-representational terms, as in the case of Heidegger or Merleau-Ponty—the non-representational character of radical embodiment must be manifest in other considerations. If this were not the case, the distinction between the two frameworks would be a matter of linguistic preference, if not outright trivial. So, what are the considerations that make radical embodiment actually radical and anti-representational?

The relevant considerations are methodological. The differences between the representational framework of object and tool embodiment and the framework based on radical embodiment do not (only) lie on theoretical grounds. It is true that there are some theoretical issues and incompatibilities between them, but it is also true that notions such as body schema could be a point of connection if the debate were held strictly in theoretical terms. However, the differences between the two frameworks become more salient when the methods used to assess tool embodiment are examined. As we have seen in previous sections, when the embodiment of tools is understood in terms of body image and body schema, the methods used involve, for instance, the illusions provoked by the rubber hand paradigm, reaction times, personal reports, and so on. In the case of radical embodiment, the measures of the embodiment of objects and tools are completely different and rely on the tools provided by complexity science (Riley and Van Orden, 2005; Holden et al., 2013).

As noted in the previous section, from the point of view of radical embodiment, cognitive systems are complex, self-organized dynamical systems. This amounts to saying that cognitive systems impose their own order (i.e., organization) on themselves by virtue of the ongoing interactions among their components and of the ongoing interactions of the whole system with its environment. Given this, it is commonly acknowledged that such systems exhibit specific patterns of complexity by virtue of their self-organization and their foundation in multiple interactions at multiple scales, such as organization around critical states and fractal features (Juarrero, 1999; Riley and Turvey, 2002; Van Orden et al., 2003; Carello and Moreno, 2005; Stephen and Dixon, 2009; Kuznetsov et al., 2013; Lamb and Chemero, 2013).

The features of complex, self-organized dynamical systems have been studied both to determine the kinds of activities cognitive systems are involved in (Holden et al., 2009) and to characterize and individuate cognitive systems themselves (Van Orden et al., 2005). In the latter context, these features are relevant to understanding object and tool embodiment. The underlying idea is that, when an object or a tool is embodied in a cognitive system, the tool becomes a proper part of that system and, therefore, the whole system (e.g., human body + tool) should exhibit the expected signatures of self-organization and complexity, such as a fractal organization across its different scales. In this sense, studies of object and tool embodiment from the standpoint of radical embodiment do not look for reaction times simpliciter or for participants’ reports of ownership. They look for the features of complex, self-organized dynamical systems at the scale of the cognitive system and the object/tool as a whole, unitary system. If those features are present, the object/tool may be said to be embodied in the cognitive system. If not, the object/tool is disembodied.

The work developed by Dotov and colleagues is a paradigmatic example of this kind of study (Dotov et al., 2010, 2017). They study tool embodiment by explicitly addressing the Heideggerian notions of readiness-to-hand, roughly equivalent to embodiment, and presence-at-hand, roughly equivalent to disembodiment. In other words, when a tool is embodied in a cognitive system it is ready-to-hand, and when it is not embodied it is present-at-hand. For example, when a person rides a bike and it works perfectly, the bike is ready-to-hand: it becomes embodied, and the whole person-bike system exhibits the fractal features of complex, self-organized dynamical systems. When the bike stops working, for some mechanical reason for example, it becomes present-at-hand: the bike becomes disembodied, the person-bike system is no longer a unitary system, and it ceases to exhibit those fractal features. The work of Dotov and colleagues shows that these transitions and their fractal signatures are common in our engagement with tools, and the authors take them to be evidence of tool embodiment and its breakdown. In sum, the traditional view takes embodiment to indicate an alteration in one or more body representations; radical embodiment takes it to indicate the achievement of a particular kind of dynamic coupling.
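
One common way to estimate such fractal signatures is detrended fluctuation analysis (DFA) of a behavioral time series, for example hand or tool movement: a scaling exponent near 1 indicates the 1/f-like structure associated with interaction-dominant dynamics, whereas an exponent near 0.5 indicates uncorrelated noise. The sketch below is a generic textbook implementation offered only as an illustration, not the analysis pipeline of the studies just discussed.

```python
# Generic detrended fluctuation analysis (DFA) sketch; illustrative only.
import numpy as np

def dfa(signal, window_sizes):
    """Return the DFA scaling exponent (alpha) of a 1-D time series."""
    profile = np.cumsum(signal - np.mean(signal))      # integrated, mean-centered series
    fluctuations = []
    for n in window_sizes:
        rms = []
        for i in range(len(profile) // n):             # non-overlapping windows of length n
            segment = profile[i * n:(i + 1) * n]
            t = np.arange(n)
            trend = np.polyval(np.polyfit(t, segment, 1), t)   # local linear detrending
            rms.append(np.sqrt(np.mean((segment - trend) ** 2)))
        fluctuations.append(np.mean(rms))
    alpha, _ = np.polyfit(np.log(window_sizes), np.log(fluctuations), 1)
    return alpha                                       # slope of log F(n) vs. log n

rng = np.random.default_rng(1)
print(dfa(rng.normal(size=2048), window_sizes=[16, 32, 64, 128, 256]))  # ~0.5 (white noise)
# Alpha near 1 (1/f scaling) in person-plus-tool behavior is the kind of signature
# read as evidence that the system is behaving as a single self-organized whole.
```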

What has not to our knowledge been studied, but should be, is the relationship between these kinds of signatures of synergistic coupling between bodies and tools, and the feeling of ownership and embodiment of, for instance, a prosthetic limb. Such work could inform prosthetic design, as it may be that the ease with which a prosthetic can be coupled to the brain-body-environment system is a better predictor of the likelihood of prosthetic embodiment [an important aspect of positive experiences with prosthetics (Anderson, 2018)] than are aesthetic design dimensions. Such work could be carried out in VR environments, which would give fine control over the dynamics of the prosthetic, allowing exploration of the different ways a prosthetic might ease or disrupt coupling, and the effect of such manipulations on experience.
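
Purely as an illustration of where such work might start (this is our own toy suggestion, not a measure taken from the cited studies), coupling between a prosthetic and the residual limb could be screened with something as simple as the peak normalized cross-correlation of their velocity profiles, before moving to richer tools such as cross-recurrence or fractal analyses.

```python
# Toy coupling index: peak normalized cross-correlation between two movement
# signals. Signal names and the synthetic data are assumptions for illustration.
import numpy as np

def peak_coupling(signal_a, signal_b):
    """Peak of the normalized cross-correlation between two 1-D signals."""
    a = (signal_a - signal_a.mean()) / signal_a.std()
    b = (signal_b - signal_b.mean()) / signal_b.std()
    return np.correlate(a, b, mode="full").max() / len(a)

rng = np.random.default_rng(2)
limb = np.sin(np.linspace(0, 10, 500)) + rng.normal(0, 0.1, 500)   # stand-in limb velocity
well_coupled   = np.roll(limb, 5) + rng.normal(0, 0.1, 500)        # prosthetic tracks the limb
poorly_coupled = rng.normal(0, 1.0, 500)                           # prosthetic moves unrelatedly

print(peak_coupling(limb, well_coupled))    # close to 1
print(peak_coupling(limb, poorly_coupled))  # low, near chance level
```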

Conclusion

The study of object and tool embodiment is already a mature field in which different frameworks, methodologies, and experimental paradigms work in parallel, and sometimes together, to understand the ways and conditions under which we take objects and tools to be part of our bodies. In this paper, we have seen that such study is pursued both from a representational and from a radical embodied point of view. We have also seen that, despite the apparent incompatibilities between the two points of view, notions such as body schema may offer an important connection between them. In this sense, the difference between the representational and the radical embodied notions of object and tool embodiment is mostly methodological. While the representational point of view uses methods such as personal reports and skin conductance measurements, the radical embodied point of view uses the tools provided by complexity science to see whether there are tacit, non-conscious engagements with objects or tools that can be labeled as embodied.

The theoretical compatibility between the two points of view opens an interesting path of research in which semantic debates could be set aside (e.g., the debate on whether the body schema is a proper representation or not) and experimental work from both frameworks could inform each other, yielding a better understanding of the phenomenon and improving future research. For example, in both tool and rubber hand studies, it is unclear whether the multisensory effects of tool embodiment should be ascribed to a change in the body schema, to a change in the processing of peri-personal space, and/or to a shift in attention.

It has been shown that using a tool quickly modifies the perception of peri-personal space, the kinematics of subsequent bodily movements, and the perceived size of the limb. However, this speed might point to a faster mechanism, such as a shift or projection of spatial attention toward the tool tip, rather than a likely slower mechanism such as tool embodiment (Holmes et al., 2007). The methods used in the radical embodied approach to embodying objects and tools (e.g., Dotov et al., 2010) could help to clarify this issue, as they provide a neat way to determine whether or not the body schema is involved in a given task, for example by discriminating between readiness- and unreadiness-to-hand.

Another field for future research concerns the ability to use tools, and to have them become part of one’s own body, while interacting with them in virtual space. It is difficult to know when a virtual tool has become embodied. Some researchers suggest that, instead of questionnaires, we evaluate these technologies by using changes in attention as a measure of how people interact with virtual environments (Dotov et al., 2010). Again, the methodological approach of radical embodiment could be used to gain a better understanding of this issue.

There are also some non-virtual-reality tasks that could be brought into virtual reality to study the flexibility of body representations and the somatosensory cortex, such as the elongation of a limb (Kilteni et al., 2012), and the brain areas responsible for object and tool embodiment. Further work could examine the amount of cortical reorganization under various lengths of virtual limbs (for object embodiment) and virtual tools (for tool embodiment). More research could address how quickly or gradually peri-personal space contracts after tool use, and how fast the accompanying cortical reorganization and its reversal are. Future studies could also expand our understanding of clinical populations, for instance individuals with amputated arms or legs, by way of objective measures such as fMRI in both non-virtual-reality and virtual-reality rubber hand and full-body paradigms. Also, future studies are needed to better understand the use of a robotic touch interface (Marasco et al., 2011) and its effects on the sense of ownership over a prosthetic limb in both non-virtual-reality and virtual-reality paradigms.

Understanding the underlying brain mechanisms is crucial for a full understanding of how object and tool embodiment relates to body representations. Further work comparing experienced and inexperienced tool users in ecologically valid non-virtual and virtual settings will expand the tool-use literature. Additionally, further methods for studying the body image during tool use should be explored. Continuing research on tool and object embodiment may allow for the development of more effective prosthetics for missing limbs. In any case, all of these are open questions that can be addressed from the framework based on body representations or from a different one.

Author Contributions

All authors listed have made a substantial, direct and intellectual contribution to the work, and approved it for publication.

Funding

This research was supported in part by a Canada Research Chair award to MA, grant number SSHRC 950-231929.

Conflict of Interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Footnotes

  1. ^ As we will detail below, the number and types of body representations that exist is a matter of dispute. See Anderson (2018) for discussion. For our purposes we will be content with the standard distinction between body image and schema.
  2. ^ It is worth noting that de Vignemont (2011) refers to affective, motor, and spatial embodiment as implicit measurements of embodiment. We do not use this wording in our article because we take them to be aspects of a conceptual analysis of embodiment that can actually be mapped onto the senses of ownership, agency, and self-location, respectively. Also, we prefer to use the wording of “measurements” for the explicit measurements in the literature. We want to thank a reviewer for pointing out this fact and giving us the opportunity to clarify it.
  3. ^ Other experiments studying affective embodiment have also used skin conductance responses using a similar experimental setting (Ehrsson et al., 2007; Guterstam et al., 2011).
  4. ^ It is worth noting that, despite this classical view regarding the differences between peri-personal space and body schema, some researchers have suggested they are just two labels for the same concept (Cardinali et al., 2009a).

References

Aglioti, S., Smania, N., Manfredi, M., and Berlucchi, G. (1996). Disownership of left hand and objects related to it in a patient with right brain damage. Neuroreport 8, 293–296. doi: 10.1097/00001756-199612200-199612258

Anderson, M. L. (2018). What phantom limbs are. Conscious. Cogn. 64, 216–226. doi: 10.1016/j.concog.2018.08.001

Anderson, M. L., Richardson, M. J., and Chemero, A. (2012). Eroding the boundaries of cognition: implications of embodiment. Top. Cogn. Sci. 4, 717–730. doi: 10.1111/j.1756-8765.2012.01211.x

Armel, K. C., and Ramachandran, V. S. (2003). Projecting sensations to external objects: evidence from skin conductance response. Proc. R. Soc. Lond. Ser. B Biol. Sci. 270, 1499–1506. doi: 10.1098/rspb.2003.2364

Aymerich-Franch, L., and Ganesh, G. (2016). The role of functionality in the body model for self-attribution. Neurosci. Res. 104, 31–37. doi: 10.1016/j.neures.2015.11.001

Aymerich-Franch, L., Petit, D., Ganesh, G., and Kheddar, A. (2015). “Embodiment of a humanoid robot is preserved during partial and delayed control,” in Proceedings of the IEEE International Workshop on Advanced Robotics and its Social Impacts (ARSO 2015), Lyon.

Baccarini, M., Martel, M., Cardinali, L., Sillan, O., Farnè, A., and Roy, A. C. (2014). Tool use imagery triggers tool incorporation in the body schema. Front. Psychol. 5:492. doi: 10.3389/fpsyg.2014.00492

Banakou, D., Groten, R., and Slater, M. (2013). Illusory ownership of a virtual child body causes overestimation of object sizes and implicit attitude changes. Proc. Natl. Acad. Sci. U.S.A. 110, 12846–12851. doi: 10.1073/pnas.1306779110

Bassolino, M., Finisguerra, A., Canzoneri, E., Serino, A., and Pozzo, T. (2015). Dissociating effect of upper limb non-use and overuse on space and body representations. Neuropsychologia 70, 385–392. doi: 10.1016/j.neuropsychologia.2014.11.028

Bassolino, M., Serino, A., Ubaldi, S., and Làdavas, E. (2010). Everyday use of the computer mouse extends peripersonal space representation. Neuropsychologia 48, 803–811. doi: 10.1016/j.neuropsychologia.2009.11.009

Bekrater-Bodmann, R., Foell, J., Diers, M., and Flor, H. (2012). The perceptual and neuronal stability of the rubber hand illusion across contexts and over time. Brain Res. 1452, 130–139. doi: 10.1016/j.brainres.2012.03.001

Berti, A., and Frassinetti, F. (2000). When far becomes near: remapping of space by tool use. J. Cogn. Neurosci. 12, 415–420. doi: 10.1162/089892900562237

Blanke, O. (2012). Multisensory brain mechanisms of bodily self-consciousness. Nat. Rev. Neurosci. 13, 556–571. doi: 10.1038/nrm3292

Blanke, O., and Metzinger, T. (2009). Full-body illusions and minimal phenomenal selfhood. Trends Cogn. Sci. 13, 7–13. doi: 10.1016/j.tics.2008.10.003

Blanke, O., Slater, M., and Serino, A. (2015). Behavioral, neural, and computational principles of bodily self-consciousness. Neuron 88, 145–166. doi: 10.1016/j.neuron.2015.09.029

Bonifazi, S., Farnè, A., Rinaldesi, L., and Làdavas, E. (2007). Dynamic size-change of peri-hand space through tool-use: spatial extension or shift of the multi-sensory area. J. Neuropsychol. 1, 101–114. doi: 10.1348/174866407x180846

Botvinick, M. (2004). Probing the neural basis of body ownership. Science 305, 782–783. doi: 10.1126/science.1101836

Botvinick, M., and Cohen, J. (1998). Rubber hands ‘feel’ touch that eyes see. Nature 391, 756–756. doi: 10.1038/35784

Brooks, R. A. (1990). Elephants don’t play chess. Rob. Auton. Syst. 6, 3–15. doi: 10.1016/s0921-8890(05)80025-9

Brozzoli, C., Gentile, G., and Ehrsson, H. H. (2012). That’s near my hand! Parietal and premotor coding of hand-centered space contributes to localization and self-attribution of the hand. J. Neurosci. 32, 14573–14582. doi: 10.1523/jneurosci.2660-12.2012

Bucchioni, G., Fossataro, C., Cavallo, A., Mouras, H., Neppi-Modona, M., and Garbarini, F. (2016). Empathy or ownership? evidence from corticospinal excitability modulation during pain observation. J. Cogn. Neurosci. 28, 1760–1771. doi: 10.1162/jocn_a_01003

Burin, D., Garbarini, F., Bruno, V., Fossataro, C., Destefanis, C., Berti, A., et al. (2017). Movements and body ownership: evidence from the rubber hand illusion after mechanical limb immobilization. Neuropsychologia 107, 41–47. doi: 10.1016/j.neuropsychologia.2017.11.004

Canzoneri, E., Marzolla, M., Amoresano, A., Verni, G., and Serino, A. (2013). Amputation and prosthesis implantation shape body and peripersonal space representations. Sci. Rep. 3:2844. doi: 10.1038/srep02844

Cardinali, L., Brozzoli, C., and Farnè, A. (2009a). Peripersonal space and body schema: two labels for the same concept? Brain Topogr. 21, 252–260. doi: 10.1007/s10548-009-0092-7

Cardinali, L., Frassinetti, F., Brozzoli, C., Urquizar, C., Roy, A. C., and Farnè, A. (2009b). Tool-use induces morphological updating of the body schema. Curr. Biol. 19, R478–R479. doi: 10.1016/j.cub.2009.05.009

Cardinali, L., Brozzoli, C., Finos, L., Roy, A., and Farnè, A. (2016). The rules of tool incorporation: tool morpho-functional & sensori-motor constraints. Cognition 149, 1–5. doi: 10.1016/j.cognition.2016.01.001

Cardinali, L., Brozzoli, C., Urquizar, C., Salemme, R., Roy, A., and Farnè, A. (2011). When action is not enough: tool-use reveals tactile-dependent access to Body Schema. Neuropsychologia 49, 3750–3757. doi: 10.1016/j.neuropsychologia.2011.09.033

Cardinali, L., Jacobs, S., Brozzoli, C., Frassinetti, F., Roy, A. C., and Farnè, A. (2012). Grab an object with a tool and change your body: tool-use-dependent changes of body representation for action. Exp. Brain Res. 218, 259–271. doi: 10.1007/s00221-012-3028-5

Carello, C., and Moreno, M. A. (2005). “Why nonlinear methods,” in Tutorials in Contemporary Nonlinear Methods for the Behavioral Sciences, eds M. A. Riley and G. C. Van Orden, (Arlington, VA: National Science Foundation), 1–25.

Cavagna, A., Cimarelli, A., Giardina, I., Parisi, G., Santagati, R., Stefanini, F., et al. (2010). Scale-free correlations in starling flocks. Proc. Natl. Acad. Sci. 107, 11865–11870. doi: 10.1073/pnas.1005766107

Chemero, A. (2009). Radical Embodied Cognitive Science. Cambridge, MA: MIT Press.

Chen, W., Huang, H., Lee, Y., and Liang, C. (2018). Body ownership and the four-hand illusion. Sci. Rep. 8:2153. doi: 10.1038/s41598-018-19662-x

Colby, C. L., Duhamel, J. R., and Goldberg, M. E. (1996). Visual, presaccadic, and cognitive activation of single neurons in monkey lateral intraparietal area. J. Neurophysiol. 76, 2841–2852. doi: 10.1152/jn.1996.76.5.2841

Collins, K. L., Guterstam, A., Cronin, J., Olson, J. D., Ehrsson, H. H., and Ojemann, J. G. (2016). Ownership of an artificial limb induced by electrical brain stimulation. Proc. Natl. Acad. Sci. 114, 166–171. doi: 10.1073/pnas.1616305114

Collins, T., Schicke, T., and Röder, B. (2008). Action goal selection and motor planning can be dissociated by tool use. Cognition 109, 363–371. doi: 10.1016/j.cognition.2008.10.001

Cooney, M. D., Nishio, S., and Ishiguro, H. (2012). “Recognizing affection for a touch- based interaction with a humanoid robot,” in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, Piscataway, NJ.

Costantini, M., Bueti, D., Pazzaglia, M., and Aglioti, S. M. (2007). Temporal dynamics of visuo-tactile extinction within and between hemispaces. Neuropsychology 21, 242–250. doi: 10.1037/0894-4105.21.2.242

Costantini, M., and Haggard, P. (2007). The rubber hand illusion: sensitivity and reference frame for body ownership. Conscious. Cogn. 16, 229–240. doi: 10.1016/j.concog.2007.01.001

D’Alonzo, M., and Cipriani, C. (2012). Vibrotactile sensory substitution elicits feeling of ownership of an alien hand. PLoS One 7:e50756. doi: 10.1371/journal.pone.0050756

D’Alonzo, M., Clemente, F., and Cipriani, C. (2014). Vibrotactile stimulation promotes embodiment of an alien hand in amputees with phantom sensations. IEEE Trans. Neural. Syst. Rehabil. Eng. 23, 450–457. doi: 10.1109/tnsre.2014.2337952

D’Angelo, M., di Pellegrino, G., Seriani, S., Gallina, P., and Frassinetti, F. (2018). The sense of agency shapes body schema and peripersonal space. Sci. Rep. 8:13847. doi: 10.1038/s41598-018-32238-z

De Haan, A. M., Stralen, H. E. V., Smit, M., Keizer, A., Stigchel, S. V. D., and Dijkerman, H. C. (2017). No consistent cooling of the real hand in the rubber hand illusion. Acta Psychol. 179, 68–77. doi: 10.1016/j.actpsy.2017.07.003

De Preester, H., and Tsakiris, M. (2009). Body-extension versus body-incorporation: is there a need for a body-model? Phenomenol. Cogn. Sci. 8, 307–319. doi: 10.1007/s11097-009-9121-y

de Vignemont, F. (2011). Embodiment, ownership and disownership. Conscious. Cogn. 20, 82–93. doi: 10.1016/j.concog.2010.09.004

de Vignemont, F., and Farnè, A. (2010). Widening the body to rubber hands and tools: what’s the difference? Rev. Neuropsychol. 3, 203–211.

Della Gatta, F., Garbarini, F., Puglisi, G., Leonetti, A., Berti, A., and Borroni, P. (2016). Decreased motor cortex excitability mirrors own hand disembodiment during the rubber hand illusion. eLife Sci. 5:e14972. doi: 10.7554/elife.14972

Di Paolo, E. A., Buhrmann, T., and Barandiaran, X. E. (2017). Sensorimotor Life: An Enactive Proposal. Oxford: Oxford University Press.

Di Pellegrino, G., Làdavas, E., and Farné, A. (1997). Seeing where your hands are. Nature 388, 730–730. doi: 10.1038/41921

Dijkerman, H. C., and de Haan, E. H. (2007). Somatosensory processes subserving perception and action. Behav. Brain Sci. 30, 189–201. doi: 10.1017/s0140525x07001392

Dotov, D., Nie, L., Wojcik, K., Jinks, A., Yu, X., and Chemero, A. (2017). Cognitive and movement measures reflect the transition to presence-at-hand. New Ideas Psychol. 45, 1–10. doi: 10.1016/j.newideapsych.2017.01.001

Dotov, D. G., Nie, L., and Chemero, A. (2010). A demonstration of the transition from ready-to-hand to unready-to-hand. PLoS One 5:e9433. doi: 10.1371/journal.pone.0009433

Dummer, T., Picot-Annand, A., Neal, T., and Moore, C. (2009). Movement and the rubber hand illusion. Perception 38, 271–280. doi: 10.1068/p5921

Durgin, F. H., Evans, L., Dunphy, N., Klostermann, S., and Simmons, K. (2007). Rubber hands feel the touch of light. Psychol. Sci. 18, 152–157. doi: 10.1111/j.1467-9280.2007.01865.x

Ehrsson, H. H. (2009). How many arms make a pair? perceptual illusion of having an additional limb. Perception 38, 310–312. doi: 10.1068/p6304

Ehrsson, H. H., Holmes, N., and Passingham, R. (2005). Touching a rubber hand: feeling of body ownership is associated with activity in multisensory brain areas. J. Neurosci. 25, 10564–10573. doi: 10.1523/jneurosci.0800-05.2005

Ehrsson, H. H., Rosen, B., Stockselius, A., Ragno, C., Kohler, P., and Lundborg, G. (2008). Upper limb amputees can be induced to experience a rubber hand as their own. Brain 131, 3443–3452. doi: 10.1093/brain/awn297

Ehrsson, H. H., Spence, C., and Passingham, R. (2004). That’s my hand! Activity in premotor cortex reflects feeling of ownership of a limb. Science 305, 875–877. doi: 10.1126/science.1097011

Ehrsson, H. H., Wiech, K., Weiskopf, N., Dolan, R. J., and Passingham, R. E. (2007). Threatening a rubber hand that you feel is yours elicits a cortical anxiety response. Proc. Natl. Acad. Sci. U.S.A. 104, 9828–9833. doi: 10.1073/pnas.0610011104

Erro, R., Marotta, A., Tinazzi, M., Frera, E., and Fiorio, M. (2018). Judging the position of the artificial hand induces a “visual” drift towards the real one during the rubber hand illusion. Sci. Rep. 8:2531. doi: 10.1038/s41598-018-20551-6

Farnè, A., Iriki, A., and Làdavas, E. (2005). Shaping multisensory action–space with tools: evidence from patients with cross-modal extinction. Neuropsychologia 43, 238–248. doi: 10.1016/j.neuropsychologia.2004.11.010

Farnè, A., and Làdavas, E. (2000). Dynamic size-change of hand peripersonal space following tool use. Neuroreport 11, 1645–1649. doi: 10.1097/00001756-200006050-00010

Farnè, A., and Làdavas, E. (2001). Auditory peripersonal space in humans: a case of auditory-tactile extinction. Neurocase 7, 103–103. doi: 10.1093/neucas/7.2.103

Farnè, A., Serino, A., and Làdavas, E. (2007). Dynamic size-change of peri-hand space following tool-use: determinants and spatial characteristics revealed through cross-modal extinction. Cortex 43, 436–443. doi: 10.1016/s0010-9452(08)70468-4

Folegatti, A., Vignemont, F. D., Pavani, F., Rossetti, Y., and Farnè, A. (2009). Losing one’s hand: visual-proprioceptive conflict affects touch perception. PLoS One 4:e6920. doi: 10.1371/journal.pone.0006920

Fossataro, C., Bruno, V., Giurgola, S., Bolognini, N., and Garbarini, F. (2018). Losing my hand. Body ownership attenuation after virtual lesion of the primary motor cortex. Eur. J. Neurosci. 48, 2272–2287. doi: 10.1111/ejn.14116

Gallagher, S. (2000). Philosophical conceptions of the self: implications for cognitive science. Trends Cogn. Sci. 4, 14–21. doi: 10.1016/s1364-6613(99)01417-5

Gallagher, S. (2013). How the Body Shapes the Mind. Oxford: Clarendon Press.

Gallagher, S., and Cole, J. (1995). Body schema and body image in a deafferented subject. J. Mind Behav. 16, 369–390.

Gallagher, S., and Meltzoff, A. N. (1996). The earliest sense of self and others: merleau- Ponty and recent developmental studies. Philos. Psychol. 9, 211–233. doi: 10.1080/09515089608573181

Gallagher, S., and Zahavi, D. (2007). The Phenomenological Mind: An Introduction to Philosophy of Mind and Cognitive Science. New York, NY: Routledge.

Garbarini, F., Fossataro, C., Berti, A., Gindri, P., Romano, D., and Pia, L. (2015). When your arm becomes mine: pathological embodiment of alien limbs using tools modulates own body representation. Neuropsychologia 70, 402–413. doi: 10.1016/j.neuropsychologia.2014.11.008

Gentile, G., Guterstam, A., Brozzoli, C., and Ehrsson, H. H. (2013). Disintegration of multisensory signals from the real hand reduces default limb self-attribution: an fMRI study. J. Neurosci. 33, 13350–13366. doi: 10.1523/jneurosci.1363-13.2013

Gibson, J. J. (1966). The Senses Considered as Perceptual Systems. Boston, MA: Houghton Mifflin.

Gibson, J. J. (1979). The Ecological Approach to Visual Perception. Boston, MA: Houghton Mifflin.

Giummarra, M. J., Gibson, S. J., Georgiou-Karistianis, N., and Bradshaw, J. L. (2008). Mechanisms underlying embodiment, disembodiment and loss of embodiment. Neurosci. Biobehav. Rev. 32, 143–160. doi: 10.1016/j.neubiorev.2007.07.001

González-Franco, M., Peck, T. C., Rodriguez-Fornells, A., and Slater, M. (2014). A threat to a virtual hand elicits motor cortex activation. Exp. Brain Res. 232, 875–887. doi: 10.1007/s00221-013-3800-1

Guterstam, A., Gentile, G., and Ehrsson, H. H. (2013). The invisible hand illusion: multisensory integration leads to the embodiment of a discrete volume of empty space. J. Cogn. Neurosci. 25, 1078–1099. doi: 10.1162/jocn_a_00393

Guterstam, A., Petkova, V. I., and Ehrsson, H. H. (2011). The illusion of owning a third arm. PLoS One 6:e17208. doi: 10.1371/journal.pone.0017208

Haans, A., Ijsselsteijn, W. A., and de Kort, Y. A. (2008). The effect of similarities in skin texture and hand shape on perceived ownership of a fake limb. Body Image 5, 389–394. doi: 10.1016/j.bodyim.2008.04.003

Heed, T., Grundler, M., Rinkleib, J., Rudzik, F. H., Collins, T., Cooke, E., et al. (2011). Visual information and rubber hand embodiment differentially affect reach-to-grasp actions. Acta Psychol. 138, 263–271. doi: 10.1016/j.actpsy.2011.07.003

Heidegger, M. (1962/2001). Being and Time. Translated by John Macquarrie & Edward Robinson. New York, NY: Blackwell Publishing.

Hohwy, J., and Paton, B. (2010). Explaining away the body: experiences of supernaturally caused touch and touch on non-hand objects within the rubber hand illusion. PLoS One 5:e9416. doi: 10.1371/journal.pone.0009416

Holden, J., Riley, M. A., Gao, J., and Torre, K. (2013). Fractal analyses: statistical and methodological innovations and best practices. Front. Physiol. 4:97. doi: 10.3389/fphys.2013.00097

Holden, J. G., Van Orden, G. C., and Turvey, M. T. (2009). Dispersion of response times reveals cognitive dynamics. Psychol. Rev. 116, 318–342. doi: 10.1037/a0014849

Holmes, N. P. (2012). Does tool use extend peripersonal space? a review and re-analysis. Exp. Brain Res. 218, 273–282. doi: 10.1007/s00221-012-3042-7

Holmes, N. P., Calvert, G. A., and Spence, C. (2004). Extending or projecting peripersonal space with tools? Multisensory interactions highlight only the distal and proximal ends of tools. Neurosci. Lett. 372, 62–67. doi: 10.1016/j.neulet.2004.09.024

Holmes, N. P., Calvert, G. A., and Spence, C. (2007). Tool use changes multisensory interactions in seconds: evidence from the crossmodal congruency task. Exp. Brain Res. 183, 465–476. doi: 10.1007/s00221-007-1060-7

Holmes, N. P., and Spence, C. (2005). Multisensory integration: space, time and superadditivity. Curr. Biol. 15, 762–764. doi: 10.1016/j.cub.2005.08.058

Holmes, N. P., Spence, C., Hansen, P. C., Mackay, C. E., and Calvert, G. A. (2008). The multisensory attentional consequences of tool use: a functional magnetic resonance imaging study. PLoS One 3:e3502. doi: 10.1371/journal.pone.0003502

Hutto, D., and Myin, E. (2013). Radicalizing Enactivism: Basic Minds without Content. Cambridge, MA: MIT Press.

Ijsselsteijn, W. A., de Kort, Y. A., and Haans, A. (2006). Is this my hand I see before me? the rubber hand illusion in reality, virtual reality, and mixed reality. Presence 15, 455–464. doi: 10.1162/pres.15.4.455

Inoue, K., Kawashima, R., Sugiura, M., Ogawa, A., Schormann, T., Zilles, K., et al. (2001). Activation in the ipsilateral posterior parietal cortex during tool use: a PET study. Neuroimage 14, 1469–1475. doi: 10.1006/nimg.2001.0942

Iriki, A., Tanaka, M., and Iwamura, Y. (1996). Coding of modified body schema during tool use by macaque postcentral neurones. Neuroreport 7, 2325–2330. doi: 10.1097/00001756-199610020-00010

Jenkinson, P. M., and Preston, C. (2015). New reflections on agency and body ownership: the moving rubber hand illusion in the mirror. Conscious. Cogn. 33, 432–442. doi: 10.1016/j.concog.2015.02.020

Juarrero, A. (1999). Dynamics in Action: Intentional Behavior as a Complex System. Cambridge, MA: The MIT Press.

Kalckert, A., and Ehrsson, H. H. (2012). Moving a rubber hand that feels like your own: a dissociation of ownership and agency. Front. Hum. Neurosci. 6:40. doi: 10.3389/fnhum.2012.00040

Kalckert, A., and Ehrsson, H. H. (2014). The spatial distance rule in the moving and classical rubber hand illusions. Conscious. Cogn. 30, 118–132. doi: 10.1016/j.concog.2014.08.022

Kammers, M. P., de Vignemont, F. D., Verhagen, L., and Dijkerman, H. (2009a). The rubber hand illusion in action. Neuropsychologia 47, 204–211. doi: 10.1016/j.neuropsychologia.2008.07.028

Kammers, M. P., Verhagen, L., Dijkerman, H. C., Hogendoorn, H. D., and de Vignemont, F. (2009b). Is this hand for real? attenuation of the rubber hand illusion by transcranial magnetic stimulation over the inferior parietal lobule. J. Cogn. Neurosci. 21, 1311–1320. doi: 10.1162/jocn.2009.21095

Käufer, S., and Chemero, A. (2015). Phenomenology: An Introduction. Cambridge: Polity.

Kennett, S., Rorden, C., Husain, M., and Driver, J. (2010). Crossmodal visual-tactile extinction: modulation by posture implicates biased competition in proprioceptively reconstructed space. J. Neuropsychol. 4, 15–32. doi: 10.1348/174866409x415942

Kilteni, K., and Ehrsson, H. H. (2017). Body ownership determines the attenuation of self-generated tactile sensations. Proc. Natl. Acad. Sci. U.S.A. 114, 8426–8431. doi: 10.1073/pnas.1703347114

Kilteni, K., Normand, J., Sanchez-Vives, M. V., and Slater, M. (2012). Extending body space in immersive virtual reality: a very long arm illusion. PLoS One 7:e40867. doi: 10.1371/journal.pone.0040867

Kirchhoff, M., and Kiverstein, J. (2019). Extended Consciousness and Predictive Processing: A Third Wave View. New York, NY: Routledge.

Kuznetsov, N., Bonnette, S., and Riley, M. A. (2013). “Nonlinear time series methods for analyzing behavioral sequences,” in Complex Systems in Sport, eds K. Davids et al. (London: Routledge), 83–102.

Lackner, J. R., and Dizio, P. (2005). Vestibular, proprioceptive, and haptic contributions to spatial orientation. Annu. Rev. Psychol. 56, 115–147. doi: 10.1146/annurev.psych.55.090902.142023

Làdavas, E. (2002). Functional and dynamic properties of visual peripersonal space. Trends Cogn. Sci. 6, 17–22. doi: 10.1016/s1364-6613(00)01814-3

Làdavas, E., and Serino, A. (2008). Action-dependent plasticity in peripersonal space representations. Cogn. Neuropsychol. 25, 1099–1113. doi: 10.1080/02643290802359113

Lamb, M. J., and Chemero, A. (2013). Interaction-dominant dynamics and extended embodiment. Construct. Found. 9, 88–89.

Lane, A. R., Ball, K., Smith, D. T., Schenk, T., and Ellison, A. (2011). Near and far space: understanding the neural mechanisms of spatial attention. Hum. Brain Mapp. 34, 356–366. doi: 10.1002/hbm.21433

Lenggenhager, B., Tadi, T., Metzinger, T., and Blanke, O. (2007). Video ergo sum: manipulating bodily self-consciousness. Science 317, 1096–1099. doi: 10.1126/science.1143439

Levine, M. P., and Smolak, L. (2014). The Prevention of Eating Problems And Eating Disorders: Theory, Research, and Practice. New York, NY: Psychology Press, Taylor & Francis Group.

Limanowski, F., and Blankenburg, F. (2015). Network activity underlying the illusory self-attribution of a dummy arm. Hum. Brain Mapp. 36, 2284–2304. doi: 10.1002/hbm.22770

Lloyd, D. M. (2007). Spatial limits on referred touch to an alien limb may reflect boundaries of visuo-tactile peripersonal space surrounding the hand. Brain Cogn. 64, 104–109. doi: 10.1016/j.bandc.2006.09.013

Longo, M. R. (2016). “Types of body representation,” in Foundations of Embodied Cognition, Volume 1: Perceptual and Emotional Embodiment, eds Y. Coello and M. H. Fischer, (London: Routledge), 117–134.

Longo, M. R., and Lourenco, S. F. (2006). On the nature of near space: effects of tool use and the transition to far space. Neuropsychologia 44, 977–981. doi: 10.1016/j.neuropsychologia.2005.09.003

Longo, M. R., and Lourenco, S. F. (2007). Space perception and body morphology: extent of near space scales with arm length. Exp. Brain Res. 177, 285–290. doi: 10.1007/s00221-007-0855-x

Longo, M. R., Schüür, F., Kammers, M. P., Tsakiris, M., and Haggard, P. (2008). What is embodiment? a psychometric approach. Cognition 107, 978–998. doi: 10.1016/j.cognition.2007.12.004

Longo, M. R., Schüür, F., Kammers, M. P., Tsakiris, M., and Haggard, P. (2009). Self awareness and the body image. Acta Psychol. 132, 166–172. doi: 10.1016/j.actpsy.2009.02.003

Lourenco, S. F., and Longo, M. R. (2009). The plasticity of near space: evidence for contraction. Cognition 112, 451–456. doi: 10.1016/j.cognition.2009.05.011

Maister, L., Slater, M., Sanchez-Vives, M. V., and Tsakiris, M. (2015). Changing bodies changes minds: owning another body affects social cognition. Trends Cogn. Sci. 19, 6–12. doi: 10.1016/j.tics.2014.11.001

Makin, T. R., Holmes, N. P., and Ehrsson, H. H. (2008a). On the other hand: dummy hands and peripersonal space. Behav. Brain Res. 191, 1–10. doi: 10.1016/j.bbr.2008.02.041

Makin, T. R., Holmes, N. P., and Zohary, E. (2008b). Is that near my hand? Multisensory representation of peripersonal space in human intraparietal sulcus. J. Neurosci. 27, 731–740. doi: 10.1523/jneurosci.3653-06.2007

Marasco, P. D., Kim, K., Colgate, J. E., Peshkin, M. A., and Kuiken, T. A. (2011). Robotic touch shifts perception of embodiment to a prosthesis in targeted reinnervation amputees. Brain 134, 747–758. doi: 10.1093/brain/awq361

Maravita, A., Husain, M., Clarke, K., and Driver, J. (2001). Reaching with a tool extends visual–tactile interactions into far space: evidence from cross-modal extinction. Neuropsychologia 39, 580–585. doi: 10.1016/s0028-3932(00)00150-0

Maravita, A., and Iriki, A. (2004). Tools for the body (schema). Trends Cogn. Sci. 8, 79–86. doi: 10.1016/j.tics.2003.12.008

Maravita, A., Spence, C., and Driver, J. (2003). Multisensory integration and the body schema: close to hand and within reach. Curr. Biol. 13, R531–R539. doi: 10.1016/s0960-9822(03)00449-4

Maravita, A., Spence, C., Kennett, S., and Driver, J. (2002). Tool-use changes multimodal spatial interactions between vision and touch in normal humans. Cognition 83, B25–B34. doi: 10.1016/s0010-0277(02)00003-3

Marotta, A., Bombieri, F., Zampini, M., Schena, F., Dallocchio, C., Fiorio, M., et al. (2017). The moving rubber hand illusion reveals that explicit sense of agency for tapping movements is preserved in functional movement disorders. Front. Hum. Neurosci. 11:291. doi: 10.3389/fnhum.2017.00291

Martel, M., Cardinali, L., Bertonati, G., Jouffrais, C., Finos, L., Farnè, A., et al. (2019). Somatosensory-guided tool use modifies arm representation for action. Sci. Rep. 9, 1–14. doi: 10.1038/s41598-019-41928-1

Martinaud, O., Besharati, S., Jenkinson, P. M., and Fotopoulou, A. (2017). Ownership illusions in patients with body delusions: different neural profiles of visual capture and disownership. Cortex 87, 174–185. doi: 10.1016/j.cortex.2016.09.025

Maselli, A., and Slater, M. (2013). The building blocks of the full body ownership illusion. Front. Hum. Neurosci. 7:83. doi: 10.3389/fnhum.2013.00083

Menary, R. (2010). The Extended Mind. Cambridge, MA: The MIT Press.

Merleau-Ponty, M. (1945/2012). Phenomenology of Perception. Translated by D. A. Landes. Oxford: Routledge.

Michaels, C., and Carello, C. (1981). Direct Perception. Englewood Cliffs, NJ: Prentice-Hall.

Miller, L. E., Longo, M. R., and Saygin, A. P. (2014). Tool morphology constrains the effects of tool use on body representations. J. Exp. Psychol. 40, 2143–2153. doi: 10.1037/a0037777

Moseley, G. L., and Flor, H. (2012). Targeting cortical representations in the treatment of chronic pain: a review. Neurorehabil. Neural Repair. 26, 646–652. doi: 10.1177/1545968311433209

Moseley, G. L., Olthof, N., Venema, A., Don, S., Wijers, M., Gallace, A., et al. (2008). Psychologically induced cooling of a specific body part caused by the illusory ownership of an artificial counterpart. Proc. Natl. Acad. Sci. 105, 13169–13173. doi: 10.1073/pnas.0803768105

Murray, C., and Sixsmith, J. (1999). The corporeal body in virtual reality. Ethos 27, 315–343. doi: 10.1525/eth.1999.27.3.315

Murray, C. D. (2004). An interpretative phenomenological analysis of the embodiment of artificial limbs. Disabil. Rehabil. 26, 963–973. doi: 10.1080/09638280410001696764

Naito, E., Roland, P. E., and Ehrsson, H. (2002). I feel my hand moving: a new role of the primary motor cortex in somatic perception of limb movement. Neuron 36, 979–988. doi: 10.1016/s0896-6273(02)00980-7

Newport, R., Pearce, R., and Preston, C. (2009). Fake hands in action: embodiment and control of supernumerary limbs. Exp. Brain Res. 204, 385–395. doi: 10.1007/s00221-009-2104-y

Newport, R., and Preston, C. (2010). Pulling the finger off disrupts agency, embodiment and peripersonal space. Perception 39, 1296–1298. doi: 10.1068/p674

Noel, J. P., Pfeiffer, C., Blanke, O., and Serino, A. (2015). Peripersonal space as the space of the bodily self. Cognition 144, 49–57. doi: 10.1016/j.cognition.2015.07.012

Normand, J. M., Giannopoulos, E., Spanlang, B., and Slater, M. (2011). Multisensory stimulation can induce an illusion of larger belly size in immersive virtual reality. PLoS One 6:e16128. doi: 10.1371/journal.pone.0016128

Obayashi, S., Suhara, T., Kawabe, K., Okauchi, T., Maeda, J., Akine, Y., et al. (2001). Functional brain mapping of monkey tool use. Neuroimage 14, 853–861. doi: 10.1006/nimg.2001.0878

Pavani, F., Spence, C., and Driver, J. (2000). Visual capture of touch: out-of-the-body experiences with rubber gloves. Psychol. Sci. 11, 353–359. doi: 10.1111/1467-9280.00270

Pavani, F., and Zampini, M. (2007). The role of hand size in the fake-hand illusion paradigm. Perception 36, 1547–1554. doi: 10.1068/p5853

Pavone, E. F., Tieri, G., Rizza, G., Tidoni, E., Grisoni, L., and Aglioti, S. M. (2016). Embodying others in immersive virtual reality: electro-cortical signatures of monitoring the errors in the actions of an avatar seen from a first-person perspective. J. Neurosci. 36, 268–279. doi: 10.1523/jneurosci.0494-15.2016

Peck, T. C., Seinfeld, S., Aglioti, S. M., and Slater, M. (2013). Putting yourself in the skin of a black avatar reduces implicit racial bias. Conscious. Cogn. 22, 779–787. doi: 10.1016/j.concog.2013.04.016

Peled, A., Ritsner, M., Hirschmann, S., Geva, A. B., and Modai, I. (2000). Touch feel illusion in schizophrenic patients. Biol. Psychiatr. 48, 1105–1108. doi: 10.1016/s0006-3223(00)00947-1

Petkova, V. I., and Ehrsson, H. H. (2008). If I were you: perceptual illusion of body swapping. PLoS One 3:e3832. doi: 10.1371/journal.pone.0003832

Petkova, V. I., Khoshnevis, M., and Ehrsson, H. H. (2011). The perspective matters! Multisensory integration in ego-centric reference frames determines full-body ownership. Front. Psychol. 2:35. doi: 10.3389/fpsyg.2011.00035

Petkova, V. I., Zetterberg, H., and Ehrsson, H. H. (2012). Rubber hands feel touch, but not in blind individuals. PLoS One 7:e35912. doi: 10.1371/journal.pone.0035912

Pomés, A., and Slater, M. (2013). Drift and ownership toward a distant virtual body. Front. Hum. Neurosci. 7:908. doi: 10.3389/fnhum.2013.00908

Pozeg, P., Palluel, E., Ronchi, R., Solcà, M., Al-Khodairy, A.-W., Jordan, X., et al. (2017). Virtual reality improves embodiment and neuropathic pain caused by spinal cord injury. Neurology 89, 1894–1903. doi: 10.1212/wnl.0000000000004585

Rao, I. S., and Kayser, C. (2017). Neurophysiological correlates of the rubber hand illusion in late evoked and alpha/beta band activity. Front. Hum. Neurosci. 11:377. doi: 10.3389/fnhum.2017.00377

Richardson, M. J., Shockley, K., Fajen, B. R., Riley, M. A., and Turvey, M. T. (2008). “Ecological psychology: six principles for an embodied-embedded approach to behavior,” in Handbook of Cognitive Science: An Embodied Approach, eds P. Calvo and T. Gomila, (San Diego, CA: Elsevier), 161–188.

Riley, M. A., and Turvey, M. T. (2002). Variability and determinism in motor behavior. J. Motor Behav. 34, 99–125. doi: 10.1080/00222890209601934

Riley, M. A., and Van Orden, G. C. (2005). Tutorials in Contemporary Nonlinear Methods for the Behavioral Sciences. Available at: https://www.nsf.gov/pubs/2005/nsf05057/nmbs/nmbs.jsp (accessed December 5, 2019).

Rohde, M., Di Luca, M., and Ernst, M. O. (2011). The rubber hand illusion: feeling of ownership and proprioceptive drift do not go hand in hand. PLoS One 6:e21659. doi: 10.1371/journal.pone.0021659

Rohde, M., Wold, A., Karnath, H.-O., and Ernst, M. O. (2013). The human touch: skin temperature during the rubber hand illusion in manual and automated stroking procedures. PLoS One 8:e80688. doi: 10.1371/journal.pone.0080688

Rossetti, Y., Rode, G., and Boisson, D. (1995). Implicit processing of somaesthetic information: a dissociation between where and how? Neuroreport 6, 506–510. doi: 10.1097/00001756-199502000-00025

Sanchez-Vives, M. V., Spanlang, B., Frisoli, A., Bergamasco, M., and Slater, M. (2010). Virtual hand illusion induced by visuomotor correlations. PLoS One 5:e10381. doi: 10.1371/journal.pone.0010381

Scandola, M., Tidoni, E., Avesani, R., Brunelli, G., Aglioti, S. M., and Moro, V. (2014). Rubber hand illusion induced by touching the face ipsilaterally to a deprived hand: evidence for plastic “somatotopic” remapping in tetraplegics. Front. Hum. Neurosci. 8:404. doi: 10.3389/fnhum.2014.00404

Schaefer, M., Flor, H., Heinze, H. J., and Rotte, M. (2007). Morphing the body: illusory feeling of an elongated arm affects somatosensory homunculus. Neuroimage 36, 700–705. doi: 10.1016/j.neuroimage.2007.03.046

Schmalzl, L., Kalckert, A., Ragnö, C., and Ehrsson, H. H. (2013). Neural correlates of the rubber hand illusion in amputees: a report of two cases. Neurocase 20, 407–420. doi: 10.1080/13554794.2013.791861

Schütz-Bosbach, S., Mancini, B., Aglioti, S. M., and Haggard, P. (2006). Self and other in the human motor system. Curr. Biol. 16, 1830–1834. doi: 10.1016/j.cub.2006.07.048

Schwoebel, J., and Coslett, H. B. (2005). Evidence for multiple, distinct representations of the human body. J. Cogn. Neurosci. 17, 543–553. doi: 10.1162/0898929053467587

Sengül, A., Elk, M. V., Rognini, G., Aspell, J. E., Bleuler, H., and Blanke, O. (2012). Extending the body to virtual tools using a robotic surgical interface: evidence from the crossmodal congruency task. PLoS One 7:e49473. doi: 10.1371/journal.pone.0049473

Sengül, A., Rognini, G., Elk, M. V., Aspell, J. E., Bleuler, H., and Blanke, O. (2013). Force feedback facilitates multisensory integration during robotic tool use. Exp. Brain Res. 227, 497–507. doi: 10.1007/s00221-013-3526-0

Serino, A. (2019). Peripersonal space (PPS) as a multisensory interface between the individual and the environment, defining the space of the self. Neurosci. Biobehav. Rev. 99, 138–159. doi: 10.1016/j.neubiorev.2019.01.016

Serino, A., Bassolino, M., Farnè, A., and Làdavas, E. (2007). Extended multisensory space in blind cane users. Psychol. Sci. 18, 642–648. doi: 10.1111/j.1467-9280.2007.01952.x

Shimada, S., Fukuda, K., and Hiraki, K. (2009). Rubber hand illusion under delayed visual feedback. PLoS One 4:e6185. doi: 10.1371/journal.pone.0006185

Slater, M., Perez-Marcos, D., Ehrsson, H. H., and Sanchez-Vives, M. V. (2008). Towards a digital body: the virtual arm illusion. Front. Hum. Neurosci. 2:6. doi: 10.3389/neuro.09.006.2008

Slater, M., Spanlang, B., Sanchez-Vives, M. V., and Blanke, O. (2010). First person experience of body transfer in virtual reality. PLoS One 5:e10564. doi: 10.1371/journal.pone.0010564

Sposito, A., Bolognini, N., Vallar, G., and Maravita, A. (2012). Extension of perceived arm length following tool-use: clues to plasticity of body metrics. Neuropsychologia 50, 2187–2194. doi: 10.1016/j.neuropsychologia.2012.05.022

Stephen, D. G., and Dixon, J. A. (2009). The self-organization of insight: entropy and power laws in problem solving. J. Probl. Solv. 2, 72–101. doi: 10.7771/1932-6246.1043

Tieri, G., Gioia, A., Scandola, M., Pavone, E. F., and Aglioti, S. M. (2017). Visual appearance of a virtual upper limb modulates the temperature of the real hand: a thermal imaging study in Immersive Virtual Reality. Eur. J. Neurosci. 45, 1141–1151. doi: 10.1111/ejn.13545

Tieri, G., Tidoni, E., Pavone, E. F., and Aglioti, S. M. (2015b). Mere observation of body discontinuity affects perceived ownership and vicarious agency over a virtual hand. Exp. Brain Res. 233, 1247–1259. doi: 10.1007/s00221-015-4202-3

Tieri, G., Tidoni, E., Pavone, E. F., and Aglioti, S. M. (2015a). Body visual discontinuity affects feeling of ownership and skin conductance responses. Sci. Rep. 5:17139. doi: 10.1038/srep17139

Tomasino, B., Weiss, P. H., and Fink, G. R. (2012). Imagined tool-use in near and far space modulates the extra-striate body area. Neuropsychologia 50, 2467–2476. doi: 10.1016/j.neuropsychologia.2012.06.018

Trojan, J., Fuchs, X., Speth, S., and Diers, M. (2018). The rubber hand illusion induced by visual-thermal stimulation. Sci. Rep. 8:12417. doi: 10.1038/s41598-018-29860-2

Tsakiris, M. (2010). My body in the brain: a neurocognitive model of body-ownership. Neuropsychologia 48, 703–712. doi: 10.1016/j.neuropsychologia.2009.09.034

Tsakiris, M. (2017). The multisensory basis of the self: from body to identity to others. Q. J. Exp. Psychol. 70, 597–609. doi: 10.1080/17470218.2016.1181768

Tsakiris, M., Carpenter, L., James, D., and Fotopoulou, A. (2009). Hands only illusion: multisensory integration elicits sense of ownership for body parts but not for non-corporeal objects. Exp. Brain Res. 204, 343–352. doi: 10.1007/s00221-009-2039-3

Tsakiris, M., and Haggard, P. (2005). The rubber hand illusion revisited: visuotactile integration and self-attribution. J. Exp. Psychol. 31, 80–91. doi: 10.1037/0096-1523.31.1.80

Tsakiris, M., Prabhu, G., and Haggard, P. (2006). Having a body versus moving your body: how agency structures body-ownership. Conscious. Cogn. 15, 423–432. doi: 10.1016/j.concog.2005.09.004

Turvey, M. T. (2019). Lectures on Perception: An Ecological Perspective. New York, NY: Routledge.

Van der Hoort, B., Guterstam, A., and Ehrsson, H. H. (2011). Being Barbie: the size of one’s own body determines the perceived size of the world. PLoS One 6:e20195. doi: 10.1371/journal.pone.0020195

Van Orden, G. C., Holden, J. G., and Turvey, M. T. (2003). Self-Organization of cognitive performance. J. Exp. Psychol. 132, 331–350.

Van Orden, G. C., Holden, J. G., and Turvey, M. T. (2005). Human cognition and 1/f Scaling. J. Exp. Psychol. 134, 117–123. doi: 10.1037/0096-3445.134.1.117

Varela, F. J., Thompson, E., and Rosch, E. (1991). The Embodied Mind. Cambridge, MA: MIT Press.

Walter, S. (2010b). Locked-in syndrome, BCI, and a confusion about embodied, embedded, extended, and enacted cognition. Neuroethics 3, 61–72. doi: 10.1007/s12152-009-9050-z

Walter, S. (2010a). Cognitive extension: the parity argument, functionalism, and the mark of the cognitive. Synthese 177, 285–300. doi: 10.1007/s11229-010-9844-x

Weiss, P., Marshall, J., Wunderlich, G., Tellmann, L., Halligan, P., Freund, H., et al. (2000). Neural consequences of acting in near versus far space: a physiological basis for clinical dissociations. Brain 123, 2531–2541. doi: 10.1093/brain/123.12.2531

Witt, J. K., Proffitt, D. R., and Epstein, W. (2005). Tool use affects perceived distance, but only when you intend to use it. J. Exp. Psychol. 31, 880–888. doi: 10.1037/0096-1523.31.5.880

Won, A. S., Bailenson, J., Lee, J., and Lanier, J. (2015). Homuncular flexibility in virtual reality. J. Comput. Med. Commun. 20, 241–259. doi: 10.1111/jcc4.12107

Wrigley, P. J., Press, S. R., Gustin, S. M., Macefield, V. G., Gandevia, S. C., Cousins, M. J., et al. (2009). Neuropathic pain and primary somatosensory cortex reorganization following spinal cord injury. Pain 141, 52–59. doi: 10.1016/j.pain.2008.10.007

Yamamoto, S., and Kitazawa, S. (2001). Reversal of subjective temporal order due to arm crossing. Nat. Neurosci. 4, 759–765. doi: 10.1038/89559

Yamamoto, S., Moizumi, S., and Kitazawa, S. (2005). Referral of tactile sensation to the tips of L-shaped sticks. J. Neurophysiol. 93, 2856–2863. doi: 10.1152/jn.01015.2004

Zeller, D., Friston, K. J., and Classen, J. (2016). Dynamic causal modeling of touch-evoked potentials in the rubber hand illusion. Neuroimage 138, 266–273. doi: 10.1016/j.neuroimage.2016.05.065

Zeller, D., Gross, C., Bartsch, A., Johansen-Berg, H., and Classen, J. (2011). Ventral premotor cortex may be required for dynamic changes in the feeling of limb ownership: a lesion study. J. Neurosci. 31, 4852–4857. doi: 10.1523/jneurosci.5154-10.2011

Zeller, D., Litvak, V., Friston, K. J., and Classen, J. (2015). Sensory processing and the rubber hand illusion—an evoked potentials study. J. Cogn. Neurosci. 27, 573–582. doi: 10.1162/jocn_a_00705

Zopf, R., Harris, J. A., and Williams, M. A. (2011a). The influence of body-ownership cues on tactile sensitivity. Cogn. Neurosci. 2, 147–154. doi: 10.1080/17588928.2011.578208

Zopf, R., Truong, S., Finkbeiner, M., Friedman, J., and Williams, M. A. (2011b). Viewing and feeling touch modulates hand position for reaching. Neuropsychologia 49, 1287–1293. doi: 10.1016/j.neuropsychologia.2011.02.012

Zopf, R., Savage, G., and Williams, M. A. (2010). Crossmodal congruency measures of lateral distance effects on the rubber hand illusion. Neuropsychologia 48, 713–725. doi: 10.1016/j.neuropsychologia.2009.10.028

Keywords: embodiment, prosthetics, virtual reality, body schema, body image

Citation: Schettler A, Raja V and Anderson ML (2019) The Embodiment of Objects: Review, Analysis, and Future Directions. Front. Neurosci. 13:1332. doi: 10.3389/fnins.2019.01332

Received: 02 August 2019; Accepted: 26 November 2019;
Published: 13 December 2019.

Edited by:

Mariella Pazzaglia, Sapienza University of Rome, Italy

Reviewed by:

Carlotta Fossataro, University of Turin, Italy
Silvia Serino, Lausanne University Hospital (CHUV), Switzerland

Copyright © 2019 Schettler, Raja and Anderson. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Aubrie Schettler, aschettl@uwo.ca

Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.