- 1Department of Psychological Sciences, Birkbeck University of London, London, United Kingdom
- 2School of Psychology, University of Kent, Canterbury, United Kingdom
- 3Institute of Psychology, Otto von Guericke University Magdeburg, Magdeburg, Germany
- 4Center for Behavioral Brain Sciences, Magdeburg, Germany
- 5Department of Behavioral Neurology, Leibniz Institute for Neurobiology, Magdeburg, Germany
The processing of touch depends on multiple factors, such as the properties of the skin and the type of receptors stimulated, as well as features related to the actual configuration and shape of the body itself. A large body of research has focused on the effect that the nature of the stimuli has on tactile processing. Less research, however, has focused on features beyond the nature of the touch. In this review, we focus on features related to the body that have received attention only more recently and in a more fragmented way. These include the symmetry of the two sides of the body, the postural configuration of the body, and the size and shape of different body parts. We will describe what we consider three key aspects: (1) how and at which stages tactile information is integrated between different parts and sides of the body; (2) how tactile signals are integrated with online and stored postural configurations of the body, regarded as priors; and (3) how tactile signals are integrated with representations of body size and shape. We describe how these body dimensions affect the integration of tactile information and guide motor behavior, bringing them together in a single model of tactile processing. We review a wide range of neuropsychological, neuroimaging, and neurophysiological data and suggest a revised model of tactile integration on the basis of the one previously proposed by Longo et al. (2010).
Introduction
There are multiple factors that determine how tactile stimuli on our body are processed to produce coherent tactile experiences and guide motor behavior. A large body of research over the past decades has focused on the effects that direct changes in the nature of the stimuli, such as texture (Johnson and Hsiao, 1992), inter-stimulus delays (Craig, 1983), duration (Gescheider and Migel, 1995), frequency (Gescheider et al., 2002), and intensity (Craig, 1974), have on the somatosensory response. Less research, however, has focused on body features that critically affect tactile processing beyond the physical parameters of the touch. These features include the size, shape, and spatial configuration of the stimulated body part, as well as the integration of touch across different parts and sides of the body. In this review, we will focus on these features and describe: (1) how and at which stages tactile information is integrated between different parts and sides of the body; (2) how tactile signals are integrated with online and stored postural configurations of the body and/or locations in space; and (3) how tactile signals are integrated with stored models of body size and shape. We will describe how these different body dimensions affect the integration of tactile information to produce a coherent representation of touch and the perception of the body as an integrated whole.
Several years ago, two of us proposed a model of somatosensory information processing (Longo et al., 2010). The central premise of this model was that the processing of tactile information goes beyond primary somatosensation, integrating immediate sensory signals with stored representations of the body. This type of higher order somatosensory processing, or somatoperception, contributes to somatic perceptual constancy, providing a coherent tactile percept on the body and contributing to the formation of the bodily self. In this model, we described how information from the body surface is remapped into an egocentric reference frame, how information about the shape and size of the body interacts with tactile processing, and the roles that exteroceptive perception (i.e., perception of objects in the external world through their contact with the body) and interoceptive perception (i.e., percepts about the nature and state of the body itself) play in tactile perception. As described in the original papers (Longo et al., 2010, 2015b), the model is consistent with a wide range of neuropsychological, neuroimaging, and neurophysiological data.
At the core of this model is the claim that many aspects of higher level perception of somatosensory stimuli require that sensory signals be integrated with stored representations of the body itself. Specifically, Longo et al. (2010) postulated three distinct mental body representations: the superficial schema, the postural schema, and the body model. The superficial and postural schemas were first postulated by Head and Holmes (1911) on the basis of their studies of brain-damaged patients. One group of patients could detect that they had been touched, but could not perceive where on their skin the touch had been applied. Another group of patients could perceive the location of touch, but could not tell where their affected limb was in space when they could not see it. Head and Holmes postulated the existence of the superficial and postural schemas to account for the impairments of these two groups of patients, respectively. In the model of Longo et al. (2010), the superficial schema is described as a mapping between locations within primary somatotopic maps and locations on the skin surface. The postural schema, in contrast, is a more dynamic representation of current body posture (i.e., joint angles), incorporating both afferent proprioceptive signals and efferent copies of motor commands. Finally, Longo et al. (2010) proposed a third representation of the metric properties (i.e., size and shape) of the body, which they called the body model.
In this paper, we consider some further factors that were not addressed by the model of Longo et al. (2010). A first aspect is the fact that the body is bilaterally symmetric, with homologous locations on the right and left sides. A second aspect is the use of prior locations and stored postural configurations of the body when localizing touch. Here, we attempt to integrate laterality and the use of prior information into the model, with the aim of describing how touch is processed given the duality of the body (i.e., left and right sides) and of brain structures (i.e., left and right hemispheres), which must be reconciled with the perception of the body as a single unit. Finally, we review recent advances in understanding the integration of touch with higher level representations of body size and shape, an issue at the core of the model.
Integration of Tactile Information between the Two Sides of the Body
Coordination between the two hemispheres is paramount for perception and motor control of the body. Indeed, early processing of tactile signals occurring on the two sides of the body is critical for performing appropriate goal-directed bimanual motor tasks. This notion seems to clash with the classical view that unilateral tactile stimuli are represented only in the contralateral primary somatosensory cortex (SI) (Penfield and Boldrey, 1937; Nelson and Chen, 2008). Moreover, the somatosensory and motor systems require continuous and rapid switches between lateralized and joint interhemispheric processing, both during the execution of simple actions and during more complex goal-directed motor behaviors. The stage of tactile sensory processing at which the interhemispheric transfer of tactile information occurs is still a matter of debate (Allison et al., 1989; Kanno et al., 2003; Hlushchuk and Hari, 2006; Sutherland, 2006; Tommerdahl et al., 2006; Jung et al., 2012; Tamè et al., 2016). In this section, we describe recent evidence in humans suggesting an early integration of tactile signals between the two hemispheres, possibly serving the execution of appropriate motor behavior.
Behavioral Evidence of Tactile Interhemispheric Communication in Healthy Subjects
The first stage of bilateral integration of tactile information at the cortical level is generally thought to occur in brain areas beyond the primary somatosensory cortex (SI; Eickhoff et al., 2010); however, recent evidence has shown that SI contributes to such processing (Kanno et al., 2004; Tan et al., 2004; Tommerdahl et al., 2006; Tamè et al., 2012). In macaques, bilateral receptive fields have been described as early as somatosensory area 2 (Iwamura et al., 1994, 2002), an area considered to be the homologue of Brodmann area 2 (BA 2) of human primary somatosensory cortex. Furthermore, interhemispheric interactions have been observed for stimuli presented to both paws, even in the core area of SI (area 3b) of owl monkeys (Lipton et al., 2006; Reed et al., 2010, 2011).
In humans, there is growing evidence about how and when this exchange of tactile information between the two hemispheres is likely to occur (Tamè et al., 2016). For instance, Tamè and colleagues developed a paradigm of double simultaneous tactile stimulation (DSS; Tamè et al., 2011, 2013). In these studies, participants were instructed to detect the presence of a tactile stimulus on a target finger. Depending on the condition, the target finger was stimulated in isolation or concurrently with another finger (i.e., the masker finger), which could belong to the same or to the other hand (i.e., index and middle fingers of both hands). In accordance with previous literature, results showed that the presence of a masker produced an interference effect regardless of the stimulated hand. Critically, however, the amount of interference varied as a function of the stimulated finger rather than the hand (i.e., which hemibody was touched; see Figure 1). The same interference was present when the masker was the non-homologous finger with respect to the target, regardless of the hand. By contrast, interference was significantly reduced when the masker was the homologous finger of the other hand. Information from homologous body parts is therefore processed differently from that from non-homologous parts, as if the stimuli were coming from the same side of the body (for similar evidence of interactions between homologous fingers across sides, obtained with a different paradigm, see Rusconi et al., 2014). This somatotopic organization provides indirect evidence that SI is involved in integrating touch across the two sides of the body. Such integration is altered when the spatial relationship between the hands/fingers changes (Tamè et al., 2011). These findings are in agreement with those of Haggard et al. (2006), who showed that identification of the stimulated hand is affected by changes in hand posture, whereas identification of the stimulated finger is not. Specifically, these authors suggested that tactile detection and finger identification occur at a somatotopic representational level, whereas hand identification occurs at a higher level at which postural information is taken into account. The role of the postural configuration in tactile processing is discussed at length in the next section.
Figure 1. Spatial coding of touch at the fingers. Data from Tamè et al. (2011), in which participants performed a speeded go/no-go task to indicate whether the target finger had been stimulated or not. Across conditions, the target finger was stimulated alone or concurrently with a masker (double simultaneous stimulation, DSS) on another finger (i.e., the other finger of the same hand, the homologous finger of the opposite hand, or the non-homologous finger of the opposite hand). Moreover, in different blocks, participants assumed different postures (i.e., hands palm down or palm up). Unfilled circles: stimulation of the target finger; filled black circles: stimulation of the non-target finger. Bar plots show percent errors as a function of stimulation condition and hand posture. Error bars represent the standard error of the mean (±SEM). Adapted from Tamè et al. (2011). © 2011 by Elsevier. Permission for the use of the image has been obtained from Elsevier.
Neuroimaging Evidence of Tactile Interhemispheric Communication in Healthy Subjects
Using functional magnetic resonance imaging (fMRI), Tamè et al. (2012) identified the neural bases of bilateral integration of touch on homologous and non-homologous fingers of the two hands. In particular, they used an fMRI tactile adaptation paradigm in which pairs of vibrotactile stimuli were delivered to the left and right index and middle fingers. The adaptation paradigm relies on the reduced response of neurons that results from the repeated presentation of a specific feature to which these neurons are selective. On this basis, Tamè et al. (2012) hypothesized that if there are neurons with finger-specific selectivity (i.e., for the index or middle finger), greater adaptation should emerge when the index finger is stimulated twice (i.e., same finger) than when different fingers are stimulated (i.e., index and middle fingers). They expected such a pattern to emerge in SI, which is known to hold somatotopic representations. Critically, if SI is also capable of integrating stimuli from the two sides of the body, this pattern should be present regardless of the side of stimulation (i.e., fingers of the left and right hand). Tamè et al. (2012) found that the BOLD response was indeed strongly reduced in SI, as well as in SII, when the same finger was stimulated twice (index-index) compared to when different fingers were stimulated (middle-index), both when stimuli were delivered to the same hand and when they were delivered to different hands. This result shows that SI can integrate tactile stimuli coming from the two sides of the body. The subareas of SI most likely to mediate such processing are BA 1 and BA 2: using the SPM (Statistical Parametric Mapping) Anatomy toolbox, Tamè et al. (2012) localized the origin of their BOLD responses to these areas. This is also compatible with studies in monkeys showing bilateral receptive fields in area 2 (Iwamura et al., 2002). To overcome the limited temporal resolution of fMRI, in a subsequent study Tamè and colleagues used a magnetoencephalography (MEG) adaptation paradigm to investigate whether the integration of bilateral tactile stimuli in SI occurs at early or late stages of tactile processing (Tamè et al., 2015). The results showed that when tactile stimuli were delivered to different hands, neural responses were somatotopically constrained, being smaller for stimulation of homologous than of non-homologous fingers. Importantly, neural responses to tactile stimuli on the two sides of the body interacted in SI at short delays (i.e., 25 ms). Because the temporal integration window is short in SI (Mauguière et al., 1997) and long in SII (Wühle et al., 2011), a selective interaction at short delays is likely to occur within SI, rather than deriving from modulatory effects from higher level brain areas. This pattern of results therefore provides substantial evidence that integration of bilateral tactile stimuli on the hands cannot derive solely from higher stages of tactile processing (i.e., SII and beyond), as previously suggested by other reports (Jung et al., 2012; Chung et al., 2014). The discrepancy between these results and some previous studies can be ascribed to different factors.
A first possibility is that the adaptation approach of Tamè et al. (2015) has greater sensitivity for detecting changes in neural activity in the somatosensory cortex under bilateral stimulation (Tamè et al., 2016). Indeed, this is not a trivial problem, given the overwhelming response generated in the contralateral hemisphere by unilateral tactile stimulation. Another possibility, not mutually exclusive with the first, is that the type and locus of stimulation differed between their study and other works. Tamè et al. (2015) used a mechanical piezoelectric tactile stimulator (i.e., a matrix of 2 × 5 rods, 1 mm in diameter) applied to the first phalanx of the index and middle fingers for 12 ms. By contrast, Chung et al. (2014) stimulated the right index finger using a band-type MR-compatible device that pressed the whole ventral skin surface of the finger for 3 s, a rather long stimulation compared to that of Tamè et al. (2015). Moreover, Jung et al. (2012) used constant-current square-wave pulse stimulation with a very short duration (i.e., 0.2 ms), though they stimulated the median nerve of both hands at the level of the wrist, rather than the fingers as Tamè et al. (2015) did.
Overall, these results suggest that tactile stimuli from the two sides of the body (i.e., the fingers) interact at an early stage of tactile processing in the primary somatosensory cortex, most likely through the transcallosal pathways that connect SI in the two hemispheres (see also the graphical representation of the transcallosal pathway model in Figure 3 of Tamè et al., 2016).
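To make the logic of the adaptation contrast described above concrete, the short sketch below computes a simple adaptation effect from hypothetical region-of-interest responses. It is purely illustrative and is not the analysis pipeline of Tamè et al. (2012); all values and condition labels are invented for demonstration.

```python
# Hypothetical mean BOLD responses (arbitrary units) from an SI region of
# interest, one per pair type in the adaptation design. "same" = repeated
# finger (index-index); "diff" = different fingers (middle-index).
bold = {
    ("same", "within hand"): 1.8,
    ("diff", "within hand"): 2.6,
    ("same", "between hands"): 1.9,
    ("diff", "between hands"): 2.5,
}

# Adaptation effect: suppression of the response when the same finger is
# repeated, relative to when different fingers are stimulated.
for hands in ("within hand", "between hands"):
    adaptation = bold[("diff", hands)] - bold[("same", hands)]
    print(f"{hands}: adaptation effect = {adaptation:.1f}")

# An adaptation effect that survives in the "between hands" condition is
# the signature of finger-specific neurons in SI that respond to touches
# on both sides of the body.
```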
Sensorimotor Interhemispheric Communication in Healthy Subjects
A recent study by Tamè and Longo (2015) provided behavioral evidence of the role of the topographic organization of callosal connections in the integration of tactile-motor signals across the two sides of the body. Using a classical behavioral paradigm to quantify sensorimotor transfer between the hemispheres, the Poffenberger paradigm (Poffenberger, 1912), the study revealed a modulation of sensorimotor interhemispheric integration time as a function of the body part stimulated. The Poffenberger paradigm relies on the logic that sensorimotor information is integrated and processed within a single hemisphere when the motor effector and the sensory signal are on the same side of the body (uncrossed), whereas if the sensory input and motor effector belong to different sides of the body, the information has to be integrated across hemispheres (crossed). Accordingly, people respond faster (lower reaction times, RTs) when sensory stimuli are presented in the hemifield (for visual or auditory stimuli) or hemibody (for tactile stimuli) ipsilateral to the responding hand (i.e., sensory stimulus and motor response processed in the same hemisphere: uncrossed) than contralateral to it (i.e., processed in different hemispheres: crossed). Poffenberger proposed that the time required for signals to transfer between the two cerebral hemispheres is reflected in the crossed-uncrossed difference (CUD; Poffenberger, 1912; Marzi, 1999). Tamè and Longo (2015) showed that the CUD was larger on the finger (2.6 ms) and forearm (1.8 ms) than on the forehead (0.9 ms). The topography of callosal connections and the density of bilateral receptive fields (RFs) are consistent with this temporal difference: regions representing the periphery of the body have less dense callosal connections than regions representing body parts near the midline (Pandya and Vignolo, 1969; Caminiti and Sbriccoli, 1985; Iwamura et al., 2001). This result suggests that the interhemispheric integration of sensorimotor signals, at least in the tactile domain, varies as a function of the strength of the callosal connections of the body parts involved (Tamè and Longo, 2015). Interestingly, the cost paid when processing a stimulus on the side contralateral to the effector can vanish when touch is delivered to a seen hand; that is, the interhemispheric integration of tactile-motor responses can be improved by vision of the body (cf. Tamè et al., 2017a). Which mechanisms might account for this result? A first possibility is that seeing the hand enhances participants' motor performance. Indeed, it has been shown that when participants perform a goal-directed action, seeing the starting point of their own hand enhances performance in the motor task (Prablanc et al., 1979; Rossetti et al., 1994; Blanchard et al., 2013). Similarly, another study showed that manual responses are primed by vision of the participant's own hand (Longo and Haggard, 2009). A second possibility is that attentional mechanisms mediate this effect.
Indeed, when participants see their own hand, a facilitatory effect occurs that improves the selection of spatial tactile information on the body and/or attenuates the conflicting response coding that arises when the stimulus and effector belong to different sides of the body (Pierson et al., 1991). Note that these two accounts may not be mutually exclusive. The neural substrate of this processing is unclear; future studies should provide empirical evidence to delineate these mechanisms. That said, we know that when non-informative vision of the body is available, participants respond faster to touch than when vision of the body is absent, a phenomenon termed “visual enhancement of touch” (VET; Tipper et al., 1998; Kennett et al., 2001). This effect is thought to derive from multisensory modulation by the parietal cortex (Ro et al., 2004), where bimodal neurons (Graziano et al., 1994) may preactivate the somatosensory cortex, improving tactile performance. Alternatively, in the study of Tamè et al. (2017a), the primary somatosensory cortex may have processed such information through coupling with visual areas. Indeed, it has been suggested that “low-level” sensory areas may be multisensory in nature (Ghazanfar and Schroeder, 2006; Macaluso, 2006; Bruno and Pavani, 2018; Convento et al., 2018; Holmes and Tamè, 2018). However, the effect reported by Tamè and colleagues (Tamè et al., 2017a) cannot be explained solely by such a perceptual mechanism, given that vision of the body speeded responses to touch only in the crossed condition (i.e., stimulus and effector on different sides of the body) and not in the uncrossed condition. Further studies are therefore needed to clarify the mechanisms, as well as the neural correlates, of the improvement of interhemispheric integration of tactile-motor responses by vision of the body, possibly by integrating the perceptual and motor perspectives.
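Returning to the Poffenberger paradigm described above, the crossed-uncrossed difference is straightforward to compute from raw reaction times. The sketch below uses invented values purely for illustration; the millisecond-range CUD it yields is the kind of quantity Tamè and Longo (2015) compared across body parts.

```python
from statistics import mean

# Hypothetical reaction times (ms) from a tactile Poffenberger task;
# all values are invented for illustration.
rt_uncrossed = [312, 305, 298, 321, 309]  # stimulus and responding hand on the same side
rt_crossed = [315, 308, 301, 323, 312]    # stimulus and responding hand on opposite sides

# Crossed-uncrossed difference (CUD): the classical behavioral estimate
# of interhemispheric transfer time (Poffenberger, 1912).
cud = mean(rt_crossed) - mean(rt_uncrossed)
print(f"CUD = {cud:.1f} ms")  # ~2.8 ms with these made-up values
```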
Moreover, other research has demonstrated that task demands can modulate tactile perception and processing, as well as the brain areas involved (e.g., Pritchett et al., 2012; Romo et al., 2012; Tamè and Holmes, 2016). Particularly relevant in the present context, finger-specific interactions for tactile stimuli delivered to the two sides of the body are present only when complex tactile tasks (i.e., tactile detection in a go/no-go context, tactile localization, and discrimination) have to be accomplished (e.g., Tamè et al., 2011, 2017c; Dempsey-Jones et al., 2015), but not when simpler tasks (i.e., tactile detection in a two-interval forced-choice design) have to be solved (e.g., Tamè et al., 2014). Indeed, in the latter case, Tamè et al. (2014) showed that tactile interference was the same regardless of which fingers of the two hands were stimulated. Therefore, the topographic organization of the bilateral interaction is modulated by the specific task demands (Tamè et al., 2016).
Neuropsychological Evidence of Tactile and Motor Interhemispheric Communication
Sensory interhemispheric communication has also been studied in brain-damaged patients. A classic neuropsychological example of bilateral interaction comes from patients with tactile extinction. Such individuals are perfectly capable of detecting a single tactile stimulus on either side of the body. However, when two tactile stimuli are delivered simultaneously to the two sides, patients fail to report the stimulus contralateral to the locus of the lesion (Bender, 1945). Other neuropsychological examples are provided by mislocalization and reduplication phenomena. Mislocalization of touch across body sides has been termed allochiria (Obersteiner, 1881), whereas reduplication has been termed synchiria (Jones, 1908). Allochiria has been described in arm amputees and in brain-damaged patients with hemiparesis and hemisensory loss (Bisiach and Berti, 1995), who can report referral of tactile sensations to the contralateral phantom body part (Ramachandran et al., 1995) or to the hand rendered anesthetic by stroke (Sathian, 2000).
Medina and Rapp (2008) described a case of tactile synchiria in which an individual who had suffered left frontoparietal damage experienced bilateral tactile sensations after unilateral stimulation. The authors ascribed this effect primarily to a deficit in the inhibitory mechanisms that, in healthy individuals, naturally suppress the bilateral percept. This interpretation supports the notion that unilateral tactile stimulation is capable of producing signals in both hemispheres.
Other conditions in which tactile referral to other body parts emerges are provided by patients who show mirror movements across homologous body parts. For instance, Farmer et al. (1990) studied a patient with Klippel-Feil syndrome, a skeletal abnormality typically associated with mirror movements of the hand muscles (Bauman, 1932), in which voluntary activation of a muscle is replicated by an identical involuntary movement in the homologous muscle of the opposite hand. Interestingly, the authors found that unilateral electrical stimulation of the index finger produced excitatory responses not only on the stimulated side but on both sides, with approximately equal size and latency, whereas in healthy subjects such a response was present only on the stimulated side (Farmer et al., 1990). Compatible with the idea of similarity between homologous parts of the two sides of the body, a recent study investigating the contribution of proprioceptive signals from the two sides of the body to the control of joint movements suggests the existence of a common control programme that uses proprioceptive information from the same joints on the two sides of the body (Han et al., 2013).
Based on these findings, Tamè et al. (2016) suggested that tactile information is integrated through transcallosal pathways connecting SI of the two hemispheres. Here, we aim to integrate this proposal into the model of somatoperceptual information processing developed by Longo et al. (2010, 2015b). In particular, we suggest that afferent tactile inputs from the two sides of the body reach Brodmann areas (BA) 3a and 3b of the contralateral primary somatosensory cortex and then continue to areas 1 and 2 – which also receive direct inputs from the thalamus – where the signals from the two sides of the body are integrated. At this point, tactile laterality is communicated to other brain areas within (i.e., areas 3a and 3b) and beyond (parietal areas as well as motor and premotor cortices) SI. Such an integration process can have an important advantage: given that the structure of the body is homologous on the two sides of the midline, it would be inefficient to maintain duplicate representations of each body part along the whole tactile processing pathway. Therefore, at higher representational stages – beyond somatosensation, in Longo et al.'s (2010) nomenclature – tactile inputs are processed using a single body model, which does not distinguish between the left and right sides of the body.
The presence of a single body representation for both sides of the body is further suggested by neuropsychological evidence from patients with left parietal lesions. For instance, it has been proposed that the body structural representation (BSR) is a critical component mediating knowledge about the spatial configuration of bodies. This notion relies on the fact that damage to such a representation results in conditions such as autotopagnosia (Ogden, 1985; Sirigu et al., 1991) and finger agnosia (Kinsbourne and Warrington, 1962). Studies of neurological patients (Schwoebel and Coslett, 2005) and healthy adults (Felician et al., 2004; Corradi-Dell'Acqua et al., 2009; Rusconi et al., 2014) provide evidence that the bilateral parietal cortex may mediate the structural representations of the body. A study by Rusconi and colleagues, using a bimanual version of the in-between task (i.e., participants estimate the number of unstimulated fingers between two touched fingers), suggests that the left and right posterior parietal cortices contribute to on-line sensorimotor representations (Pisella et al., 2000). They further suggest that the connections between the left anteromedial inferior parietal lobe (a-mIPL) and the precuneus (PCN) provide the core substrate of an explicit bilateral BSR of the fingers which, when disrupted, can produce the typical symptoms of finger agnosia (Rusconi et al., 2014). This study therefore supports the notion of a single body model, in which a lateralized neural structure provides information about the spatial arrangement of body parts relative to each other that applies to both sides of the body. Similarly, patients suffering from synchiria can no longer distinguish the side from which a tactile input comes, given that they perceive the sensation as occurring on both sides (Jones, 1908).
Furthermore, the study by Han et al. (2013) described above may suggest that a similar integrative flow also occurs for proprioceptive signals, though further evidence is needed to assess this possibility. Indeed, proprioceptive signals for the control of joint movements may be governed by a common programme that is the same for the left and right sides of the body. Such a possibility is compatible with the idea that tactile inputs are processed using a single body model that does not distinguish between the two sides of the body.
Overall, the psychophysical, neurophysiological, neuroimaging, and neuropsychological evidence we have described suggests that integration of tactile signals between the two sides of the body – here, the hands – is likely to occur at early stages of tactile processing within the primary somatosensory cortex, as depicted in Figure 2 (for an extensive review on this topic, see Tamè et al., 2016). The afferent flow of tactile information from the thalamus reaches areas 3a and 3b of SI in the hemisphere contralateral to the locus of stimulation; these areas in turn project to areas 1 and 2, which also receive direct input from the thalamus. We propose that integration across sides occurs in areas 1/2 of SI through transcallosal connections, as shown by the neuroimaging studies in humans described above (Tamè et al., 2012, 2015; for a review, see Tamè et al., 2016). Following this process, information about tactile laterality is communicated to other brain areas within SI (i.e., areas 3a and 3b), to parietal areas, and to the motor and premotor cortices (Sutherland, 2006). We do not have a specific prediction about the nature of this signal (i.e., excitatory or inhibitory), which most likely depends on the specific task demands. Future studies should seek further empirical evidence to support, refine, or reject this hypothesis. A sensitive approach would be to administer a series of tactile tasks of varying complexity that involve bilateral tactile stimulation of the body and that require either side-dependent or side-independent representations of the body. Ideally, such an approach would be combined with state-of-the-art techniques such as fMRI (to establish where in the brain integration occurs), EEG (to establish when it occurs), and TMS.
Figure 2. Side integration model. A graphical model of tactile laterality information processing (i.e., the Side Integration Model), highlighting the role of areas 1 and 2 of the primary somatosensory cortex in the integration of lateralized tactile inputs from the two sides of the body. Red lines depict the primary pathways of information flow from tactile and proprioceptive afference on the left side of the body, whereas green lines depict information coming from the right side. The gradient line depicts the integration of inputs from the two sides of the body through the corpus callosum, whereas the dashed lines depict the flow of information, including body laterality, towards the other areas within the primary somatosensory cortex and beyond. Inputs are depicted as diamonds and cortical brain areas as circles.
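As a compact restatement of the proposed flow (see Figure 2), the sketch below encodes the pathway as a simple directed graph. It is a descriptive summary of the model as stated above, not a computational implementation, and the node labels are our own shorthand.

```python
# Proposed flow of tactile information for one side of the body (Figure 2).
# Nodes are processing stages; edges point from sender to receiver.
# "SI-1/2" is where the model locates transcallosal side integration.
side_integration_model = {
    "thalamus": ["SI-3a/3b", "SI-1/2"],        # direct thalamic inputs
    "SI-3a/3b": ["SI-1/2"],                    # within-SI projections
    "SI-1/2 (other hemisphere)": ["SI-1/2"],   # transcallosal integration
    "SI-1/2": ["SI-3a/3b", "parietal areas", "motor/premotor cortices"],
}

for source, targets in side_integration_model.items():
    print(f"{source} -> {', '.join(targets)}")
```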
Integration of Tactile Information with Posture
The previous section dealt with integration across body sides, deliberately setting aside the role that posture plays in tactile processing. However, even in tasks such as those reported so far, in which the goal is to report the exact finger that has been stimulated, proprioceptive information still plays a fundamental role. This is because localizing touch on the body surface is not by itself sufficient for interacting with the environment (Driver and Spence, 1998). As we move, our bodies and limbs change position, and the location of each touch varies with respect to the body and to other objects in the environment. Because of these countless combinations of tactile and proprioceptive signals, each indicating a different location in external space, the brain needs to take posture into account when processing touch. This integration allows touch to be represented beyond skin space, i.e., in an external reference frame, making it available for goal-directed action (Driver and Spence, 1998; Yamamoto and Kitazawa, 2001). There is now a consensus in the literature that this integrative process of tactile remapping occurs by default, weighting each reference frame according to task demands, even in situations where postural integration is unnecessary (Azañón and Soto-Faraco, 2008a; Azañón et al., 2010a; Badde et al., 2015; Heed et al., 2015).
In the present section, we will focus on this integration and describe evidence suggesting not only integration between touch and online proprioceptive signals but also between touch and a priori information regarding specific locations in space (i.e., spatial priors) and/or canonical postural representations (i.e., prototypical postural configurations). Such prior configurations or locations in space might enable faster motor responses to the spatial locations where touch is most likely to occur, allowing faster integration with other modalities, for instance, to avoid threatening stimuli.
The Role of Vision and Development in Tactile Spatial Perception
Studies of children provide evidence that the process of tactile remapping is acquired during development, probably through active interaction with the environment (Bremner et al., 2008a). Tactile remapping develops with age (Bremner et al., 2008b; Pagel et al., 2009; Begum et al., 2014; Rigato et al., 2014), is not present in infants younger than 6–10 months (Bremner et al., 2008b; Rigato et al., 2014; Begum Ali et al., 2015), and has been associated with the ability to perform the first reaches to objects across the body midline, which suggests a tight relation with experience (Bremner et al., 2008a; Rigato et al., 2014). Furthermore, studies of the congenitally blind provide further support for the role of early visual experience in the processing of tactile stimuli later in life (Röder et al., 2004). For instance, congenitally blind individuals, who have never experienced visual input, do not show a detriment in tactile localization performance when the hands are crossed as compared to uncrossed (Röder et al., 2004; Collignon et al., 2009). This is not the case, however, for sighted participants or for people who became blind later in life, even many years after losing sight: their performance with hands crossed is largely impaired as compared to uncrossed, even in situations where posture is irrelevant (Röder et al., 2004). This suggests that extensive visual experience during the first years of life leads to a default encoding of touch in terms of external space, even in cases where taking posture into account is detrimental. In support of this idea, deprivation of visual input during the first years of life, as in humans with congenital dense bilateral cataracts, hinders the normal development of a default remapping of touch into external space (Ley et al., 2013; Azañón et al., 2018).
Through acting in the world, sighted individuals are exposed to continuous sensorimotor contingencies across signals from the various modalities. Tactile spatial perception might therefore emerge from the repeatedly experienced correlation of specific activity of skin receptors with proprioceptive and visual information about limb position and about the object touching the skin (Heed et al., 2015). This idea comes across clearly in Nissen et al. (1951), in which a chimpanzee was raised from birth with pads covering its arms and legs. These pads allowed the chimpanzee to move but prevented climbing and any manipulative behavior. The lack of opportunity for manipulation and for the association of visual with tactile-kinesthetic sensations compromised, to a large extent, basic tactile orienting responses later in life, such as orienting the head to the location of single touches presented to either hand. This suggests a large degree of impairment in basic tactile spatial processing after sensorimotor deprivation.
Spatial Priors and/or Canonical Postural Representations
Within a framework in which tactile spatial perception emerges through active exploration of the environment, it is plausible that, with experience, initially uncorrelated distributions of locations in space across tactile, proprioceptive, and visual signals become correlated during development. For instance, given the morphology and physical constraints of the arm, touches on the right hand occur more often on the right side of, and around the center of, the body with respect to the body midline. This frequent co-occurrence of sensory signals at particular locations of space might promote the emergence of visual spatial priors, serving as reference points for the localization of tactile events, analogous to the spatial prototypes, or Bayesian priors, used in other forms of spatial representation (Huttenlocher et al., 1991; Körding and Wolpert, 2004). Similarly, the frequent occurrence of touch while adopting particular body configurations might promote the emergence of proprioceptive canonical postures (i.e., prototypical postural configurations).
Note that spatial priors and canonical proprioceptive configurations could produce similar behavioral effects but correspond to two separate concepts. Spatial priors, as defined in this review, do not require stored proprioceptive information, but rather stored representations of the most plausible locations of touch in visual space (e.g., touches on the right hand occur more often on the right side). To our knowledge, this is the first time that the concept of a spatial prior, defined in visual space, has been linked to tactile remapping. The concept of a canonical posture, more widespread than that of a spatial prior in the remapping literature (Yamamoto and Kitazawa, 2001; Azañón and Soto-Faraco, 2008a; Bremner et al., 2008a,b; Longo et al., 2010), assumes the existence of stored proprioceptive representations containing the most plausible body configurations for a given touch (i.e., for a touch on the hand, the canonical configuration assumes uncrossed arms).
The existence of spatial priors is clear in vision. It has been shown that memories of spatial locations are biased towards particular locations of space in a manner that is highly stereotyped across individuals. For instance, when recalling the location of a dot inside a circle, participants' responses are biased towards the centroids of each quadrant (Huttenlocher et al., 1991, 2004). A widespread assumption arising from this type of result is that, by integrating the memory of the actual stimulus with categorical information about where stimuli are expected to be, perceptual accuracy can be increased, though at the expense of introducing systematic bias (Cheng et al., 2007). Similarly, spatial priors in touch might support accurate and faster tactile localization by pulling in nearby stimuli (as shown for visual priors), but also increase errors when large mismatches occur between the spatial prior (defined in visual space) and online tactile-proprioceptive signals. This could explain why crossing the hands produces more tactile localization errors than when the hands are in their anatomical and, therefore, expected locations (see Figure 3D; Yamamoto and Kitazawa, 2001; Shore et al., 2002).
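The intuition that integrating a noisy sensory estimate with a prior improves average precision while introducing bias can be made explicit with the standard Gaussian cue-combination scheme (in the spirit of Körding and Wolpert, 2004). The formulation below is a generic textbook sketch, not a model fitted to any of the data reviewed here:

```latex
\hat{x} \;=\; w\,x_{\mathrm{obs}} \;+\; (1-w)\,\mu_{\mathrm{prior}},
\qquad
w \;=\; \frac{\sigma_{\mathrm{prior}}^{2}}{\sigma_{\mathrm{prior}}^{2}+\sigma_{\mathrm{obs}}^{2}}
```

Here, \(x_{\mathrm{obs}}\) is the sensed location with noise variance \(\sigma_{\mathrm{obs}}^{2}\), and the prior is centered on \(\mu_{\mathrm{prior}}\) with variance \(\sigma_{\mathrm{prior}}^{2}\). Because \(w < 1\), the estimate \(\hat{x}\) has lower variance than the raw observation but is systematically pulled towards \(\mu_{\mathrm{prior}}\). This is precisely the trade-off invoked above: small biases towards expected locations when mismatches are small, and large errors when online signals and the prior strongly conflict, as with crossed hands.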
Figure 3. The use of prior information in tactile spatial localization. (A) Data from Azañón and Soto-Faraco (2008a), in which participants were asked to judge as quickly as possible the position of a light flash in the vertical dimension (top-bottom), irrespective of the side of presentation and of the location of an irrelevant tactile cue. (B) At short cue-target intervals (<60 ms), with arms crossed (red line), responses to targets were faster in opposite cue-target side trials than in same-side trials. The pattern reversed after cue-target intervals of about 200 ms, such that tactile cues facilitated targets presented at the same external location. No differences across intervals were found with uncrossed hands (black line). (C) Data from Overvliet et al. (2011), in which participants were asked to direct saccades to a tactile stimulus on the ring finger of one of the two hands, which could be either crossed or uncrossed. Saccades to tactile stimuli when the hands were crossed (right-most panel) were sometimes initiated in the wrong direction and then corrected in flight, resulting in a turn-around saccade. Adapted from Overvliet et al. (2011). © 2011 by Elsevier. Permission for the use of the image has been obtained from Elsevier. (D) Figure modified from Heed and Azañón (2014). Typical single-participant results for uncrossed and crossed hands in a temporal order judgment task. In this task, two touches are presented at different stimulus onset asynchronies (SOAs), and participants are required to move the finger that was stimulated first, with no time constraint. With uncrossed hands (black line), the psychophysical curve is steeper than with crossed hands (red line), indicating a performance advantage for the uncrossed posture. The inset illustrates the just noticeable difference (JND) for the uncrossed and crossed postures.
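As a concrete illustration of how curves like those in Figure 3D are typically summarized, the sketch below fits a cumulative Gaussian to hypothetical TOJ data and derives the JND. The data points, the 75% criterion, and the starting values are assumptions for demonstration only, not parameters from Heed and Azañón (2014).

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

# Hypothetical TOJ data: stimulus onset asynchronies in ms (negative =
# left hand stimulated first) and the observed proportion of
# "right hand first" responses at each SOA.
soa = np.array([-200, -90, -55, -20, 20, 55, 90, 200])
p_right_first = np.array([0.02, 0.10, 0.25, 0.40, 0.60, 0.78, 0.92, 0.98])

def psychometric(soa, pss, sigma):
    """Cumulative Gaussian: pss = point of subjective simultaneity,
    sigma = spread (a flatter curve means a larger sigma)."""
    return norm.cdf(soa, loc=pss, scale=sigma)

(pss, sigma), _ = curve_fit(psychometric, soa, p_right_first, p0=(0.0, 60.0))

# One common JND definition: the SOA change from the 50% to the 75% point
# of the fitted curve.
jnd = sigma * norm.ppf(0.75)
print(f"PSS = {pss:.1f} ms, JND = {jnd:.1f} ms")
```

On this summary, the flatter crossed-hands curve in Figure 3D corresponds to a larger fitted sigma, and hence a larger JND, than the uncrossed curve.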
In line with the idea that frequent co-occurrence of sensory signals can lead to the establishment of priors, Azañón et al. (2015) recently showed that repetition of touch in the same crossed posture, even if unattended, can lead to an improvement in tactile localization that increases with the number of preceding trials. These results confirm that recent tactile-proprioceptive co-occurrences can influence subsequent tactile perception. However, the authors did not find evidence of a general improvement across the course of the experiment: performance with hands crossed returned to initial levels of impairment every time posture changed (i.e., from crossed to uncrossed and back). This detriment in performance following changes in posture might suggest that the brain initializes a fixed, default localization process with every new crossed posture, assuming that touches are located at the anatomical side. Thus, a few co-occurrences over the course of an experiment cannot override lifelong priors.
A beautiful example of how powerful and long-lasting priors can be in the processing of touch comes from the Aristotle illusion, first recounted by Aristotle (384–322 B.C.) in the essay “On Dreams”. In this illusion, a single object touched with crossed fingers is strikingly perceived as two objects rather than one (Benedetti, 1985). The illusion probably occurs because the brain fails to account for the actual crossed posture of the fingers and processes the sensations arising from the touched object as if the fingers were in their usual uncrossed posture (or, similarly, as if the touch were coming from the anatomical side). Only after months of exposure to this unusual configuration of the fingers does touch take the real posture into account, and the illusion disappears (Benedetti, 1991). Closely related to this, when two taps are applied in sequence to crossed hands at short intervals, many participants systematically report the first stimulus as occurring on the opposite hand (Yamamoto and Kitazawa, 2001; Kóbor et al., 2006; Heed and Azañón, 2014). This can be interpreted as people initially perceiving the touch at the visual side where the hand usually lies in space. For instance, for a right-hand touch, the right side of space, now occupied by the left hand, would serve as the prior spatial location. Evidence for this comes from visuotactile attention paradigms. When a touch is presented on a crossed hand, quickly followed by a light (<60 ms later), participants are faster at responding to the light in opposite-side (i.e., anatomically congruent) trials than in same-side (i.e., spatially congruent) trials. Thus, touches to the left hand, now placed on the right side, facilitate processing of left-hemispace visual events, and vice versa (see Figures 3A,B; Azañón and Soto-Faraco, 2008a,b; Azañón et al., 2010a). In a similar fashion, a proportion of saccades or reaches directed towards a touch on a crossed limb are initially directed towards the opposite limb, as if the limbs were uncrossed, and are then corrected online several hundred milliseconds later (see Figure 3C; Groh and Sparks, 1996; Overvliet et al., 2011; see Brandes and Heed, 2015, for reaching trajectories). Finally, it has been shown that disruption of tactile-proprioceptive integration by transcranial magnetic stimulation (TMS) in humans, over the putative right ventral intraparietal cortex, induced participants to underestimate the height of touches delivered to the arm (Azañón et al., 2010b). In this study, participants placed their left arm upright, close to the face, and discriminated the location of a touch on the arm with respect to a touch on the face. Touches on the arm were perceived as coming from a lower position than their actual location. This could suggest that disruption of tactile-proprioceptive integration by parietal TMS forced touch to rely on an offline proprioceptive representation, in which the arms are represented in their prototypical position, with the hands below the face (Azañón et al., 2010b).
In Longo et al. (2010), we introduced the idea that at early stages of tactile processing, and hence before touch is integrated with an up-to-date proprioceptive representation, the brain assumes for each touch a stored representation of a canonical posture. Later, this a priori information is superseded by the actual proprioceptive information, or simply weighted less. However, the evidence put forward for this claim (and reviewed in the previous paragraphs) does not differentiate between visual spatial priors and canonical postural representations. From a spatial prior perspective, touch is referred in these examples to the location in visual space where the hand normally lies (i.e., the right side of space for the right hand, or below the face in the TMS example of Azañón et al., 2010b), with no need to assume a particular proprioceptive configuration. From a canonical posture perspective, however, the effect would be driven by a stored representation of the prototypical layout of the limbs (i.e., a default proprioceptive configuration that assumes that the hands are uncrossed and placed below the face; see, for instance, Yamamoto and Kitazawa, 2001).
Regardless of whether these effects are driven by purely visual priors, purely proprioceptive priors, or a combination of the two, definitive and direct evidence for the existence of priors in touch is needed. Note that some direct hypotheses arise from the previous discussion: (1) if tactile stimuli are processed taking prior information into account (in particular, a priori spatial locations), one might expect tactile localization biases to occur; (2) if the same skin area is stimulated under different postures, localization biases for that skin area should converge on particular regions of space, so it should be possible to track these priors experimentally by touching the same body areas across changes in posture; and (3) if tactile stimuli are first processed using a priori information that is subsequently adjusted on the basis of the actual spatial location of the body parts, then larger biases should be found at early stages of tactile processing than at later ones. With regard to possible neural substrates, multimodal neurons in the posterior parietal cortex with “intermediate” receptive fields, whose activity is gain-modulated by the position of the eyes in the orbit, the hand, or the head (Pouget et al., 2002; Avillac et al., 2005; Chang and Snyder, 2010), might be able to encode visual priors. Similarly, area PE in the superior parietal lobule (equivalent to BA 5 in the human brain) might be involved in the processing of proprioceptive priors. Some PE neurons in the monkey react to complex body postures involving several joints (Sakata et al., 1973), and some also respond to tactile stimuli, but only if the limbs and joints are placed in certain positions. Indeed, Sakata and co-workers suggested that such neurons may encode the spatial position of a touching object relative to the body axis (Sakata et al., 1973).
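Hypothesis (3) can be illustrated with a toy simulation in which the weight given to online sensory evidence grows as processing unfolds. All numbers below (locations, variances, time points) are arbitrary assumptions chosen only to show the predicted shrinking of the localization bias over time:

```python
# Toy illustration of hypothesis (3): if touch is first localized using a
# prior and then corrected by online signals, bias should be larger early
# in processing. All values are arbitrary.
mu_prior = -10.0    # prior location (e.g., the side where the hand usually is)
var_prior = 100.0   # uncertainty (variance) of the prior
x_true = 10.0       # actual touch location (e.g., on the crossed hand)

# Assume sensory variance shrinks as evidence accumulates over time.
for t_ms, var_obs in [(50, 900.0), (150, 100.0), (300, 10.0)]:
    w = var_prior / (var_prior + var_obs)  # weight on the online signal
    estimate = w * x_true + (1.0 - w) * mu_prior
    print(f"t = {t_ms:3d} ms: estimate = {estimate:+6.1f}, "
          f"bias = {estimate - x_true:+6.1f}")
```

With these made-up values, the estimate starts near the prior location and converges on the true location, so the bias is largest at the earliest time point, which is the signature the hypothesis predicts.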
It is worth noting that the idea of canonical representations of the body is not new. Already in the 1970s, Bromage and Melzack observed that during reversible deafferentation of the upper and lower limbs, via brachial plexus and epidural anesthesia, participants reported highly stereotyped phantom postures, with the arms and legs on their anatomical sides and the joints approximately midway through their range of flexion (above the abdomen or lower chest for the arms, and with the legs semiflexed at the hips and knees; Melzack and Bromage, 1973; Bromage and Melzack, 1974; see also Gross et al., 1974; Gross and Melzack, 1978). More recent studies have shown that a fully extended finger, wrist, and elbow become a flexed phantom after ischemic anesthesia, though some aspects of the induced phantom sensation change according to the posture held at the time of anesthesia (Inui et al., 2011, 2012a,b). Even though Bromage and Melzack considered these canonical representations outside the frame of tactile processing, the type of proprioceptive priors proposed here might be fundamentally equivalent. Indeed, the authors assumed that this postural archetype may arise from activity in neural cell assemblies developed through earlier sensorimotor activities encountered over a lifetime, therefore including touch (Melzack and Bromage, 1973). Similarly, a recent study has shown preferential associations between the thumb and the index finger and the relative spatial positions of “top” and “bottom,” suggesting that body parts and spatial locations are stably associated (Romano et al., 2017). In this study, participants were exposed to touches on either the thumb or the index finger. Both hands were placed in front of the body, one on top of the other, with the four stimulated fingers forming the vertices of an imaginary square and with homologous fingers (index and thumb) facing each other without touching. In this way, the thumb could be in a relatively top position or in a bottom position, and vice versa for the index finger. Participants received a single tactile stimulus at one of the four possible locations and were asked to discriminate as quickly as possible whether the top or bottom finger had been touched. The authors found consistent preferential associations between the index finger and the top position and between the thumb and the bottom position, both with and without vision. They speculated that a canonical postural representation might contribute to somatosensory spatial processing, linking this representation to the fact that, in many common grasping actions, the index finger occupies a relatively higher location than the thumb (Romano et al., 2017). This is in agreement with the idea that long-term sensorimotor experience, such as grasping, can create specific functional categories in the brain, which can modulate early stages of somatosensory processing (Shen et al., 2018).
Examples of Integration of Touch and Online Proprioceptive Information
The idea put forward in this section is that at early stages of tactile processing, possibly before the brain has had time to incorporate an online representation of the current posture, touch is integrated with (or influenced by) stored representations. This idea is, however, independent of two facts: touch necessarily relies on up-to-date proprioceptive information to generate locations in external space, and the localization of body parts is tightly linked to visual processing (Limanowski and Blankenburg, 2016). Thus, integration between touch and proprioception for tactile localization often co-occurs with vision (other forms of interaction, e.g., with motor commands, are omitted for the sake of brevity; Hermosillo et al., 2011).
The fact that tactile localization is affected by changes in posture (such as hand crossing) is evidence of the integration of touch with online proprioceptive information (Yamamoto and Kitazawa, 2001). There are many other examples in the literature showing effects of posture on somatosensory processing, even when these are visually induced (highlighting the role of vision in body part localization; Gallace and Spence, 2005; Azañón and Soto-Faraco, 2007; Folegatti et al., 2009). For example, localizing the order of two touches, applied one to each uncrossed hand, becomes easier when the horizontal distance between the two hands increases (Shore et al., 2005). This improvement is observed even if the separation is not physical but is introduced visually by mirror reflection (Soto-Faraco et al., 2004; Gallace and Spence, 2005). The same holds for tactile localization with crossed hands (Roberts et al., 2003), which also improves when the separation spans other spatial dimensions (vertical and depth; Azañón et al., 2016a).
Studies on tactile spatial attention further demonstrate the strong interconnection between online postural information and touch (Lakatos and Shepard, 1997; Aglioti et al., 1999; Heed and Röder, 2010). For example, tactile attention to one hand in healthy individuals improves when the arms are moved apart (e.g., Driver and Grossenbacher, 1996; Soto-Faraco et al., 2004). When the task requires switching attention from one hand to the other, participants' performance instead improves when the distance between the arms is reduced (Lakatos and Shepard, 1997). Furthermore, when participants discriminate the elevation of a tactile target applied to the index finger or thumb of one hand, there is facilitation from a simultaneous touch on the unattended hand when it is presented at a congruent (e.g., both up) rather than an incongruent elevation, regardless of the orientation of the hand and therefore of the actual finger stimulated (e.g., whether both index fingers are placed on top of the thumbs, or one hand is rotated so that the thumb is on top of the index finger; Soto-Faraco et al., 2004). Altogether, these results suggest that tactile attention is affected by the posture of the touched body part, given that performance is modulated by the distance and orientation of the body parts even though the somatotopic relationship between the involved skin sites remains constant in the brain (see also Rinker and Craig, 1994; though see Evans and Craig, 1991; Evans et al., 1992; Röder et al., 2002; Haggard et al., 2006; and Kuroki et al., 2010 for evidence of a somatotopic dominance in tactile localization).
Research on patients provides further evidence of the influence of posture on tactile processing. This is the case, for instance, in tactile extinction, defined in the previous section, or in tactile hemineglect, in which tactile stimulation of the contralesional limb (usually the left) is neglected (Vallar, 1997; Driver and Vuilleumier, 2001). The strength of tactile inattention is modulated by the location of the affected body part in space. Thus, some patients show improved tactile detection at the contralesional hand when it crosses the midline into the ipsilesional side (Smania and Aglioti, 1995; Moro et al., 2004), or even within the same hemispace when the affected hand crosses the other hand (Aglioti et al., 1999; Moro et al., 2004). Further support comes from patients whose extinction is anchored to particular body parts. These patients extinguish touches presented at the left-most region, in external space, of the stimulated body part (the limb, the hand, or the finger, defined with respect to its long axis), regardless of the spatial orientation of that body part (e.g., palm up or down; Moscovitch and Behrmann, 1994; Tinazzi et al., 2000; see Medina and Rapp, 2008 for an example in other neurological patients).
Overall, these studies show the impact of postural information on tactile localization. It is important to stress, however, that postural information arises not only from proprioception but, in many instances, also from vision. The role of vision in body part localization is evident when a conflict between proprioception and vision is introduced (Rossetti et al., 1995). For instance, in a recent study, Lohmann and Butz (2017) introduced a virtual dissociation between proprioceptive and visual hand position information by combining immersive virtual reality with online motion capture. They showed that participants unknowingly shifted their hands to compensate for the visual shift. Perhaps the most classical approach to inducing visuo-proprioceptive conflict, however, is the rubber hand illusion (RHI; Botvinick and Cohen, 1998). In this classical illusion, participants observe a fake hand being stroked while their real (unseen) hand is synchronously touched. After several seconds of simultaneous stroking, participants tend to perceive the felt tactile sensation as originating from the rubber hand. This usually results in a feeling of ownership and a relocation of the perceived position of the real hand towards the rubber hand (Botvinick and Cohen, 1998; see also Tsakiris and Haggard, 2005). By combining the rubber hand illusion with temporal order judgments with hands crossed, Azañón and Soto-Faraco (2007) found that observing a pair of uncrossed rubber hands reduces the deficit in localizing touches when the hands are crossed. Interestingly, this modulation was mostly observed when visual information about the rubber hands could be attributed to one’s own actions (i.e., when movements of the real hand were mirrored by movements of the rubber hand, in an anatomical fashion), highlighting the role not only of visual information in tactile remapping but also of motor information and the sense of agency.
In summary, we have shown the profound effect that postural information has on tactile processing. However, we have also shown that this is not always the case. Early in development, and in individuals deprived of vision, touch is unaffected by the configuration of the limbs (Röder et al., 2004; Bremner et al., 2008b). Thus, active interaction with the environment and the presence of visual inputs seem to modify the way we process and localize touch later in life. As a result of this same interaction, some postural configurations and spatial locations might become associated with particular touches over time, producing what we have called canonical postural and spatial priors. We argued that these priors could serve as reference points for the localization of tactile events, producing more accurate and faster tactile responses, although biased towards the prior location or proprioceptive configuration. The hypothesis that canonical priors influence tactile processing is still speculative; however, a growing body of results, some of which have been reviewed here, provides increasing evidence of biases in tactile localization that fit well with the existence of such priors.
Integration of Tactile Information with Representations of Body Size and Shape
The final form of integration we will discuss is that of immediate tactile signals with stored representations of body size and shape. Several forms of perception involve referencing sensory signals to models of the body itself. For example, the use of convergence angles for visual depth perception requires that the distance between the two eyes be known (Banks, 1988), while the use of temporal differences between sounds reaching the two ears for auditory localization requires that head width be known (Aslin et al., 1983). Other studies have shown, for example, that the representation of eye-height affects perception of the passability of doorways (Warren and Whang, 1987; Leyrer et al., 2015), hand size affects visual size perception (Linkenauger et al., 2010, 2014), and arm length affects the size of peripersonal space (Longo and Lourenco, 2007; Lourenco et al., 2011) and the perception of visual distance (Linkenauger et al., 2015). These issues are especially acute in touch, given that the primary receptor surface (i.e., the skin) is physically co-extensive with the body itself.
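As a simple geometric illustration of why these body metrics must be known (our simplification, not a derivation taken from the cited studies), both computations contain the relevant metric as an explicit term:

\[
d \;\approx\; \frac{\mathrm{IPD}}{2\,\tan(\theta/2)}, \qquad \mathrm{ITD} \;\approx\; \frac{w\,\sin\varphi}{c},
\]

where the viewing distance \(d\) recovered from the vergence angle \(\theta\) depends on the interpupillary distance (IPD), and the interaural time difference for a sound at azimuth \(\varphi\) depends on head width \(w\) (with \(c\) the speed of sound). On these simplified formulations, a misrepresented IPD or head width biases the resulting percept even when the sensory signals themselves are accurate.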
The Role of a Body Model in Tactile Distance Perception
A central part of the model of somatoperceptual information processing proposed by Longo et al. (2010) was therefore a stored representation of body size and shape, what they called the body model. Stimulation of even single mechanoreceptive afferent fibers in the human median nerve can produce clearly localized tactile sensations (Schady et al., 1983). Imagine, however, that two distinct points on the hand are touched. There is nothing in either of the two resulting signals, or in their combination, that specifies how far apart the two stimuli are. Perceiving the distance between two stimulus locations on opposite sides of the hand thus effectively reduces to the problem of knowing how big one’s hand is. Longo et al. (2010) proposed that this is achieved by combining the location of touch within primary somatotopic maps in somatosensory cortex with the body model.
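One way to formalize this combination (a minimal sketch of our own; this is not notation used by Longo et al., 2010) is as a rescaling of somatotopic coordinates by a region-specific size factor supplied by the body model:

\[
\hat{d} \;=\; s_{r}\,\lVert x_{1} - x_{2} \rVert,
\]

where \(x_{1}\) and \(x_{2}\) are the locations of the two touches in the somatotopic map, \(\lVert x_{1} - x_{2} \rVert\) is their map distance, and \(s_{r}\) converts map units into physical extent for skin region \(r\). On this formulation, errors in the stored size of a body part (\(s_{r}\)) translate directly into errors in perceived tactile distance (\(\hat{d}\)), which is precisely what the illusion studies reviewed next exploit.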
Evidence in support of this interpretation comes from studies showing that illusions which alter the perceived size or shape of the body produce corresponding changes in the perception of tactile distance. Taylor-Clarke et al. (2004), for example, showed participants a magnified video image of their forearm alongside a minimized image of their hand. Subsequently, the relative perceived distance between two touches was expanded on the forearm and compressed on the hand. Similarly, de Vignemont et al. (2005) explored this issue using the so-called vibrotactile illusion. In the vibrotactile illusion, vibration applied to a muscle tendon produces an illusion of muscle lengthening and a corresponding illusion of proprioceptive limb displacement (Goodwin et al., 1972). Lackner (1988) showed that when this illusion was generated while the affected limb was in continuous contact with another part of the body, illusory changes of experienced body part size could be produced (i.e., the “Pinocchio illusion”). De Vignemont et al. (2005) used this method to produce the illusion that the index finger was longer or shorter than its actual size and showed that such changes affected the perceived distance between touches on the finger, compared to a control skin location (the forehead). Similar results have also been reported in other studies (Bruno and Bertamini, 2010; Tajadura-Jiménez et al., 2012).
Further evidence that higher level representation of the body shapes the perception of tactile distance comes from studies showing that the segmentation of the body into discrete parts produces categorical perception effects, with perceived tactile distances being expanded across joint boundaries (de Vignemont et al., 2009; Le Cornu Knight et al., 2014, 2017; Shen et al., 2018). Similarly, tool use, which can be interpreted as a functional extension of the body (e.g., Maravita and Iriki, 2004), has recently been shown to produce systematic changes in the perception of tactile distance on the arm wielding the tool (Canzoneri et al., 2013; Miller et al., 2014, 2017a,b). Moreover, the nature of these effects is determined by the relation between the tool and the body: a long stick altered touch on the forearm but not the hand, whereas a hand-shaped tool altered touch on the hand but not the forearm (Miller et al., 2014).
Baseline Distortions of Tactile Distance Perception and the Pixel Model
Intriguingly, even at baseline, there are large misperceptions of tactile distance, which have been investigated since the 19th century. In his classic work, Weber (1996) noticed that as he moved the two points of a compass across his skin, it felt as if the distance between them increased as they moved from a region of relatively low sensitivity (e.g., the forearm) to a region of higher sensitivity (e.g., the palm of the hand). Subsequent research has replicated these results and found that the perceived distance between touches on the skin is systematically related to the relative sensitivity of different skin regions (Goudge, 1918; Cholewiak, 1999; Taylor-Clarke et al., 2004; Anema et al., 2008; Miller et al., 2016), an effect now known as Weber’s illusion.
Interestingly, similar results have also been found when comparing the perceived distance between points aligned in different orientations on a single skin surface. For example, Longo and Haggard (2011) found that the perceived distance between touches on the hand dorsum was about 40% larger when the touches were oriented across the width of the hand than when they were aligned along its length. Other studies have reported similar results (Longo and Sadibolova, 2013; Calzolari et al., 2017; Longo and Golubova, 2017; Longo, 2017b; Tamè et al., 2017b), and similar anisotropies have been described for a number of skin regions, including the forearm (Green, 1982; Le Cornu Knight et al., 2014), the thigh (Green, 1982), the shin (Stone et al., 2018), and the forehead (Longo et al., 2015a; Fiori and Longo, 2018). Intriguingly, the direction of this effect appears to be the same on all skin regions where anisotropy has been reported, with distances aligned with body width overestimated relative to those aligned with body length or height. However, the magnitude of the anisotropy appears to differ systematically across the skin, suggesting that it arises from factors specific to each skin surface rather than from a more general perceptual or cognitive bias.
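To make the size of this effect concrete (an illustrative calculation based on the ~40% figure above, not a reported datum): if perceived distance across the hand is scaled by a factor of roughly 1.4 relative to distances along it, then a 30 mm separation oriented across the hand dorsum should feel approximately equivalent to a \(1.4 \times 30 = 42\) mm separation oriented along it.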
In previous work, we have suggested that such effects may arise from the geometry of the receptive fields (RFs) of neurons in somatosensory cortex, based on what we called the “pixel model” (Longo and Haggard, 2011; Longo, 2017a). The central idea of this model is that tactile RFs in a somatotopic map are treated like the pixels of a two-dimensional spatial image of the body, with distances calculated by counting the number of unstimulated RFs between two activation peaks. Because the RFs representing sensitive skin regions are smaller than those representing less-sensitive regions (Powell and Mountcastle, 1959; Sur et al., 1980), any given stimulus will have more unstimulated RFs between peaks if applied on a sensitive than a less-sensitive surface, potentially accounting for the classic form of Weber’s illusion. Similarly, the RFs of neurons representing the hairy skin of the limbs are generally oval-shaped (rather than circular), with the long axis of the oval aligned with the proximo-distal limb axis (Powell and Mountcastle, 1959; Brooks et al., 1961; Alloway et al., 1989). This anisotropy of RF geometry can potentially account for the perceptual anisotropies described above, given that the spacing between the RFs of adjacent neurons in somatotopic maps is known to be a constant proportion of RF size (Sur et al., 1980). Recent results have been consistent with this model in showing that tactile distance anisotropies can be well characterized by geometrically simple deformations (e.g., stretches) of tactile space (Longo and Golubova, 2017; Fiori and Longo, 2018).
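The counting operation at the heart of the pixel model is simple enough to state explicitly. The toy sketch below is our illustration only: the regular grid layout and the specific RF sizes are hypothetical values, not parameters reported by Longo and Haggard (2011). It expresses perceived distance as the number of RF-sized steps spanned between two stimulated points, which reproduces both the classic Weber’s illusion and the across/along anisotropy in miniature:

```python
# Toy sketch of the "pixel model" (after Longo and Haggard, 2011; Longo, 2017a).
# Illustrative assumptions: RFs tile the skin on a regular grid whose spacing is
# proportional to RF size, and perceived distance is proportional to the number
# of RF-sized steps spanned between the two activation peaks.

def rf_count_distance(p1, p2, rf_size):
    """Distance in 'RF units': physical separation (mm) along each axis divided
    by the RF extent (mm) along that axis, combined Euclideanly."""
    dx = abs(p2[0] - p1[0]) / rf_size[0]  # steps along the medio-lateral axis
    dy = abs(p2[1] - p1[1]) / rf_size[1]  # steps along the proximo-distal axis
    return (dx ** 2 + dy ** 2) ** 0.5

# Weber's illusion: an identical 30 mm separation spans more (smaller) RFs on
# the palm than on the forearm, so it feels larger. RF sizes are hypothetical.
palm, forearm = (5.0, 5.0), (15.0, 15.0)            # RF extent in mm
print(rf_count_distance((0, 0), (30, 0), palm))      # 6.0 RF units
print(rf_count_distance((0, 0), (30, 0), forearm))   # 2.0 RF units

# Anisotropy: oval RFs elongated along the proximo-distal axis make an
# across-width separation span more RFs than an equal along-length one.
hairy = (6.0, 10.0)  # medio-lateral x proximo-distal RF extent (hypothetical)
print(rf_count_distance((0, 0), (30, 0), hairy))     # across: 5.0 RF units
print(rf_count_distance((0, 0), (0, 30), hairy))     # along:  3.0 RF units
```

Note that, in this sketch, the across/along ratio is fixed entirely by the RF aspect ratio, consistent with the proposal that anisotropy magnitude tracks RF geometry on each skin surface.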
Tactile Distance Perception and Clinical Disorders of Body Image
A number of recent studies have reported disruptions of tactile distance perception in clinical disorders (e.g., Keizer et al., 2011, 2012; Scarpina et al., 2014; Spitoni et al., 2015; Mölbert et al., 2016; Engel and Keizer, 2017). For example, Keizer and colleagues (Keizer et al., 2011, 2012) found that, in comparison with healthy controls, patients with anorexia nervosa overestimated tactile distances on both the belly and the hand. In a subsequent study, Spitoni et al. (2015) compared tactile distances on the belly and the sternum. Patients with anorexia overestimated distances on the belly relative to the sternum, but only when stimuli were aligned with the width of the body, not when they were aligned with body length. This effect is intriguing because the distortions of tactile distance perception shown by the patients are specific in a way that mirrors their subjective body image (i.e., the fact that they experience their body as fatter than it actually is). This result thus provides further evidence for a deep relation between the experience of tactile distance and higher level representations of the body (cf. Longo, 2015).
There is also some evidence that the illusions of tactile distance perception we have described above mirror distortions of body perception in other domains (for reviews, see Azañón et al., 2016b; Longo, 2017a). For example, studies investigating the body representations underlying proprioceptive position sense have reported similar distortions, with overestimation of hand width relative to length (Longo and Haggard, 2010, 2012a; Ganea and Longo, 2017). Similarly, studies of the explicit body image have also revealed overestimation of body width, using a range of measures including visual comparison (Shontz, 1969; Longo and Haggard, 2012b), the image marking procedure (Meermann, 1983), the moving caliper procedure (Halmi et al., 1977; Dolan et al., 1987), the adjustable light beam apparatus (Thompson and Thompson, 1986; Dolce et al., 1987), and several others (Bianchi et al., 2008; Fuentes et al., 2013; D’Amour and Harris, 2017). The distortions of tactile distance perception described above thus appear to be just one reflection of a broader perceptual bias to overestimate body width, a bias that appears in many types of task.
Discussion
In this review, we have explored two aspects of tactile processing that were not considered in the model proposed by Longo et al. (2010): the integration of touch across the two sides of the body, and the use of stored proprioceptive information about the location of touch in space. In addition, we have reviewed results on the integration of tactile signals with representations of body size and shape that have emerged since we developed the model.
Regarding the integration of touch across body sides, a large body of evidence discussed in the first section suggests that tactile signals from the two sides of the body are likely to be integrated at early stages of tactile processing, i.e., within the primary somatosensory cortex (SI). This line of evidence challenges the textbook account on which SI supports only unilateral tactile representations of the contralateral side of the body, whereas structures beyond SI, in particular SII, support bilateral tactile representations. In the construction of the somatic percept, therefore, the interhemispheric transfer of tactile information occurs very early in time and depends on the spatial and temporal characteristics of the stimuli (Tamè et al., 2012, 2015), the type of task (Tamè et al., 2011, 2014, 2016), and the relative position of the parts of the two sides of the body in space (Tamè et al., 2011, 2017c). We propose that such integration occurs in areas 1 and 2 of SI through transcallosal connections, as shown by the neuroimaging studies we described (e.g., Tamè et al., 2012, 2015). Following this integrative process, information is then sent to other areas within SI (i.e., areas 3a and 3b), to parietal areas, and to the motor and premotor cortices.
In our previous model (Longo et al., 2010), we proposed that three different types of body representation were required to process touch: the superficial schema, mediating localization of somatic sensations on the body surface; the model of body size and shape, discussed in the previous section of this review; and the postural schema, an online, up-to-date proprioceptive representation of the limbs in space. Nonetheless, several considerations converge to support the idea that the processing of touch also involves an offline representation of the most plausible spatial locations for a given touch (Azañón and Soto-Faraco, 2008a; Overvliet et al., 2011) or of the most probable configurations of the body in space (Yamamoto and Kitazawa, 2001; Romano et al., 2017). We suggest that this stored information is tightly linked to the postural schema, especially in the case of canonical proprioceptive priors. Minor deviations from this template are maximally informative for estimating current body posture and, in this way, for dynamically retrieving an up-to-date body schema. In this hypothetical framework, online sensory information about a tactile stimulus on a body part in a given posture (postural schema) would be combined with this offline proprioceptive standard every time a touch is presented. Consequently, when online information is accurate, the two schemata combine to increase the accuracy and speed of tactile processing; the prior can be seen as the statistical mean of all co-occurrences between touch and that particular body configuration encoded over a lifetime (see the sketch below).
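If the canonical prior and the online postural estimate are both treated as Gaussian, this combination can be pictured as standard precision-weighted cue fusion, in the spirit of Bayesian accounts of sensorimotor integration (Körding and Wolpert, 2004). The sketch below is a hypothetical illustration of this idea only; the Gaussian form, the elbow-angle example, and all numerical values are our assumptions, not parameters from any of the studies reviewed:

```python
# Hypothetical sketch: combining a canonical postural prior with online
# proprioception via precision-weighted (Bayesian) cue fusion.

def combine(prior_mean, prior_var, online_mean, online_var):
    """Fuse a stored canonical posture (prior) with the current proprioceptive
    estimate (likelihood), both treated as Gaussian; returns the posterior."""
    w_prior = (1 / prior_var) / (1 / prior_var + 1 / online_var)
    post_mean = w_prior * prior_mean + (1 - w_prior) * online_mean
    post_var = 1 / (1 / prior_var + 1 / online_var)  # always below either input
    return post_mean, post_var

# Elbow angle (degrees): canonical semi-flexed prior at 90; online signal at 120.
print(combine(90.0, 400.0, 120.0, 25.0))   # precise online input: ~118.2 deg
print(combine(90.0, 400.0, 120.0, 900.0))  # noisy online input: ~99.2 deg
```

On this picture, a precise online estimate dominates the fused percept, whereas a noisy one is drawn toward the canonical posture, and the posterior variance is always lower than that of either input alone, one way of capturing the fast-but-biased localization pattern described above.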
Figure 4 shows an updated depiction of Longo et al.’s (2010) model where we have included the notion that touch is necessarily integrated across the two sides of the body. In Figure 4, we suggest that touch is integrated between the two sides of the body before the processing that constructs percepts and experiences of somatic objects and events and of one’s own body (i.e., somatoperception). We have also included a fourth body representation, a canonical prior, to denote the use of priors in the localization of touch. This prior would interact mostly with the postural schema to produce a fast and accurate, though sometimes biased, localization of touch in space.
Figure 4. Revised model of somatosensory processing. Addition of the concepts “side integration” and “postural priors” (in blue) to the model initially proposed by Longo et al. (2010). Adapted from Longo et al. (2010). © 2010 by Elsevier. Permission for the use of the image has been obtained from Elsevier.
Taken together, with the inclusion of the concepts of body laterality and prior information, this review provides a more comprehensive conceptualization of tactile processing than our previous model (Longo et al., 2010, 2015a,b). Furthermore, through this review of a wide range of recent neuropsychological, neuroimaging, and neurophysiological data, we have shown that the claims we made 8 years ago remain up-to-date.
Author Contributions
The three authors contributed equally.
Funding
This paper was supported by a European Research Council grant (ERC-2013-StG-336050) under FP7 to MRL.
Conflict of Interest Statement
The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
References
Aglioti, S., Smania, N., and Peru, A. (1999). Frames of reference for mapping tactile stimuli in brain-damaged patients. J. Cogn. Neurosci. 11, 67–79. doi: 10.1162/089892999563256
Allison, T., McCarthy, G., Wood, C. C., Williamson, P. D., and Spencer, D. D. (1989). Human cortical potentials evoked by stimulation of the median nerve. II. Cytoarchitectonic areas generating long-latency activity. J. Neurophysiol. 62, 711–722. doi: 10.1152/jn.1989.62.3.711
Alloway, K. D., Rosenthal, P., and Burton, H. (1989). Quantitative measurements of receptive field changes during antagonism of GABAergic transmission in primary somatosensory cortex of cats. Exp. Brain Res. 78, 514–532. doi: 10.1007/BF00230239
Anema, H. A., Wolswijk, V. W. J., Ruis, C., and Dijkerman, H. C. (2008). Grasping Weber’s illusion: the effect of receptor density differences on grasping and matching. Cogn. Neuropsychol. 25, 951–967. doi: 10.1080/02643290802041323
Aslin, R. N., Pisoni, D. B., and Jusczyk, P. W. (1983). “Auditory development and speech perception in infancy” in Handbook of child psychology: Infancy and developmental psychobiology. eds. M. M. Haith, J. J. Campos, and P. H. Mussen, Vol. 2 (New York: Wiley), 573–687.
Avillac, M., Denève, S., Olivier, E., Pouget, A., and Duhamel, J. -R. (2005). Reference frames for representing visual and tactile locations in parietal cortex. Nat. Neurosci. 8, 941–949. doi: 10.1038/nn1480
Azañón, E., Camacho, K., Morales, M., and Longo, M. R. (2018). The sensitive period for tactile remapping does not include early infancy. Child Dev. 4, 1394–1404. doi: 10.1111/cdev.12813
Azañón, E., Camacho, K., and Soto-Faraco, S. (2010a). Tactile remapping beyond space. Eur. J. Neurosci. 31, 1858–1867. doi: 10.1111/j.1460-9568.2010.07233.x
Azañón, E., Longo, M. R., Soto-Faraco, S., and Haggard, P. (2010b). The posterior parietal cortex remaps touch into external space. Curr. Biol. 20, 1304–1309. doi: 10.1016/j.cub.2010.05.063
Azañón, E., Mihaljevic, K., and Longo, M. R. (2016a). A three-dimensional spatial characterization of the crossed-hands deficit. Cognition 157, 289–295. doi: 10.1016/j.cognition.2016.09.007
Azañón, E., and Soto-Faraco, S. (2007). Alleviating the ‘crossed-hands’ deficit by seeing uncrossed rubber hands. Exp. Brain Res. 182, 537–548. doi: 10.1007/s00221-007-1011-3
Azañón, E., and Soto-Faraco, S. (2008a). Changing reference frames during the encoding of tactile events. Curr. Biol. 18, 1044–1049. doi: 10.1016/j.cub.2008.06.045
Azañón, E., and Soto-Faraco, S. (2008b). Spatial remapping of tactile events: assessing the effects of frequent posture changes. Commun. Integrat. Biol. 1, 45–46. doi: 10.4161/cib.1.1.6724
Azañón, E., Stenner, M. -P., Cardini, F., and Haggard, P. (2015). Dynamic tuning of tactile localization to body posture. Curr. Biol. 25, 512–517. doi: 10.1016/j.cub.2014.12.038
Azañón, E., Tamè, L., Ferrè, E. R., Maravita, A., Tajadura-Jiménez, A., Linkenauger, S. A., et al. (2016b). Multimodal contributions to body representation. Multisens. Res. 29, 635–661. doi: 10.1163/22134808-00002531
Badde, S., Röder, B., and Heed, T. (2015). Flexibly weighted integration of tactile reference frames. Neuropsychologia 70, 367–374. doi: 10.1016/j.neuropsychologia.2014.10.001
Banks, M. S. (1988). “Visual recalibration and the development of contrast and optical flow perception” in The Minnesota symposia on child psychology. ed. A. Yonas (Hillsdale, NJ: Erlbaum), 145–196.
Bauman, G. I. (1932). Absence of the cervical spine: Klippel-Feil syndrome. J. Am. Med. Assoc. 98, 129–132. doi: 10.1001/jama.1932.02730280037009
Begum Ali, J., Cowie, D., and Bremner, A. J. (2014). Effects of posture on tactile localization by 4 years of age are modulated by sight of the hands: evidence for an early acquired external spatial frame of reference for touch. Dev. Sci. 17, 935–943. doi: 10.1111/desc.12184
Begum Ali, J., Spence, C., and Bremner, A. J. (2015). Human infants’ ability to perceive touch in external space develops postnatally. Curr. Biol. 25, R978–R979. doi: 10.1016/j.cub.2015.08.055
Bender, M. B. (1945). Extinction and precipitation of cutaneous sensations. Arch. Neurol. Psychiatr. 54, 1. doi: 10.1001/archneurpsyc.1945.02300070011001
Benedetti, F. (1985). Processing of tactile spatial information with crossed fingers. J. Exp. Psychol. Hum. Percept. Perform. 11, 517–525. doi: 10.1037/0096-1523.11.4.517
Benedetti, F. (1991). Perceptual learning following a long-lasting tactile reversal. J. Exp. Psychol. Hum. Percept. Perform. 17, 267–277. doi: 10.1037/0096-1523.17.1.267
Bianchi, I., Savardi, U., and Bertamini, M. (2008). Estimation and representation of head size (people overestimate the size of their head - evidence starting from the 15th century). Br. J. Psychol. 99, 513–531. doi: 10.1348/000712608X304469
Bisiach, E., and Berti, A. (1995). “Consciousness in dyschiria” in The cognitive neurosciences. ed. M. S. Gazzaniga (Cambridge, MA: MIT Press), 1331–1340.
Blanchard, C., Roll, R., Roll, J. -P., and Kavounoudias, A. (2013). Differential contributions of vision, touch and muscle proprioception to the coding of hand movements. PLoS One 8:e62475. doi: 10.1371/journal.pone.0062475
Botvinick, M., and Cohen, J. (1998). Rubber hands “feel” touch that eyes see. Nature 391:756. doi: 10.1038/35784
Brandes, J., and Heed, T. (2015). Reach trajectories characterize tactile localization for sensorimotor decision making. J. Neurosci. 35, 13648–13658. doi: 10.1523/JNEUROSCI.1873-14.2015
Bremner, A. J., Holmes, N. P., and Spence, C. (2008a). Infants lost in (peripersonal) space? Trends Cogn. Sci. 12, 298–305. doi: 10.1016/j.tics.2008.05.003
Bremner, A. J., Mareschal, D., Lloyd-Fox, S., and Spence, C. (2008b). Spatial localization of touch in the first year of life: early influence of a visual spatial code and the development of remapping across changes in limb position. J. Exp. Psychol. Gen. 137, 149–162. doi: 10.1037/0096-3445.137.1.149
Bromage, P. R., and Melzack, R. (1974). Phantom limbs and the body schema. Can. Anaesth. Soc. J. 21, 267–274. doi: 10.1007/BF03005731
Brooks, V. B., Rudomin, P., and Slayman, C. L. (1961). Peripheral receptive fields of neurons in the cat’s cerebral cortex. J. Neurophysiol. 24, 302–325. doi: 10.1152/jn.1961.24.3.302
Bruno, N., and Bertamini, M. (2010). Haptic perception after a change in hand size. Neuropsychologia 48, 1853–1856. doi: 10.1016/j.neuropsychologia.2010.01.006
Bruno, N., and Pavani, F. (2018). Perception: a multisensory perspective, Vol. 1: (Oxford, UK: Oxford University Press).
Calzolari, E., Azañón, E., Danvers, M., Vallar, G., and Longo, M. R. (2017). Adaptation aftereffects reveal that tactile distance is a basic somatosensory feature. Proc. Natl. Acad. Sci. U. S. A. 114, 4555–4560. doi: 10.1073/pnas.1614979114
Caminiti, R., and Sbriccoli, A. (1985). The callosal system of the superior parietal lobule in the monkey. J. Comp. Neurol. 237, 85–99. doi: 10.1002/cne.902370107
Canzoneri, E., Ubaldi, S., Rastelli, V., Finisguerra, A., Bassolino, M., and Serino, A. (2013). Tool-use reshapes the boundaries of body and peripersonal space representations. Exp. Brain Res. 228, 25–42. doi: 10.1007/s00221-013-3532-2
Chang, S. W. C., and Snyder, L. H. (2010). Idiosyncratic and systematic aspects of spatial representations in the macaque parietal cortex. Proc. Natl. Acad. Sci. U. S. A. 107, 7951–7956. doi: 10.1073/pnas.0913209107
Cheng, K., Shettleworth, S. J., Huttenlocher, J., and Rieser, J. J. (2007). Bayesian integration of spatial information. Psychol. Bull. 133, 625–637. doi: 10.1037/0033-2909.133.4.625
Cholewiak, R. W. (1999). The perception of tactile distance: influences of body site, space, and time. Perception 28, 851–46.
Chung, Y. G., Han, S. W., Kim, H. -S., Chung, S. -C., Park, J. -Y., Wallraven, C., et al. (2014). Intra- and inter-hemispheric effective connectivity in the human somatosensory cortex during pressure stimulation. BMC Neurosci. 15:43. doi: 10.1186/1471-2202-15-43
Collignon, O., Charbonneau, G., Lassonde, M., and Lepore, F. (2009). Early visual deprivation alters multisensory processing in peripersonal space. Neuropsychologia 47, 3236–3243. doi: 10.1016/j.neuropsychologia.2009.07.025
Convento, S., Rahman, M. S., and Yau, J. M. (2018). Selective attention gates the interactive crossmodal coupling between perceptual systems. Curr. Biol. 28, 746–752:e5. doi: 10.1016/j.cub.2018.01.021
Corradi-Dell’Acqua, C., Tomasino, B., and Fink, G. R. (2009). What is the position of an arm relative to the body? Neural correlates of body schema and body structural description. J. Neurosci. 29, 4162–4171. doi: 10.1523/JNEUROSCI.4861-08.2009
Craig, J. C. (1974). Vibrotactile difference thresholds for intensity and the effect of a masking stimulus. Percept. Psychophys. 15, 123–127. doi: 10.3758/BF03205839
Craig, J. C. (1983). The role of onset in the perception of sequentially presented vibrotactile patterns. Percept. Psychophys. 34, 421–432. doi: 10.3758/BF03203057
D’Amour, S., and Harris, L. R. (2017). Perceived face size in healthy adults. PLoS One 12:e0177349. doi: 10.1371/journal.pone.0177349
de Vignemont, F., Ehrsson, H. H., and Haggard, P. (2005). Bodily illusions modulate tactile perception. Curr. Biol. 15, 1286–1290. doi: 10.1016/j.cub.2005.06.067
de Vignemont, F., Majid, A., Jola, C., and Haggard, P. (2009). Segmenting the body into parts: evidence from biases in tactile perception. Q. J. Exp. Psychol. 62, 500–512. doi: 10.1080/17470210802000802
Dempsey-Jones, H. E., Harrar, V., Oliver, J., Johansen-Berg, H., Spence, C., and Makin, T. R. (2015). Transfer of tactile perceptual learning to untrained neighbouring fingers reflects natural use relationships. J. Neurophysiol. 115, 1088–1097. doi: 10.1152/jn.00181.2015
Dolan, B. M., Birtchnell, S. A., and Lacey, J. H. (1987). Body image distortion in non-eating disordered women and men. J. Psychosom. Res. 31, 513–520. doi: 10.1016/0022-3999(87)90009-2
Dolce, J. J., Thompson, J. K., Register, A., and Spana, R. E. (1987). Generalization of body size distortion. Int. J. Eat. Disord. 6, 401–408. doi: 10.1002/1098-108X(198705)6:3<401::AID-EAT2260060310>3.0.CO;2-Z
Driver, J., and Grossenbacher, P. G. (1996). “Multimodal spatial constraints on tactile selective attention” in Attention and performance 16: information integration in perception and communication. eds. T. Inui and J. L. McClelland (Cambridge, MA: The MIT Press), 209–235.
Driver, J., and Spence, C. (1998). Cross-modal links in spatial attention. Philos. Trans. R. Soc. B 353, 1319–1331. doi: 10.1098/rstb.1998.0286
Driver, J., and Vuilleumier, P. (2001). Perceptual awareness and its loss in unilateral neglect and extinction. Cognition 79, 39–88. doi: 10.1016/S0010-0277(00)00124-4
Eickhoff, S. B., Jbabdi, S., Caspers, S., Laird, A. R., Fox, P. T., Zilles, K., et al. (2010). Anatomical and functional connectivity of cytoarchitectonic areas within the human parietal operculum. J. Neurosci. 30, 6409–6421. doi: 10.1523/JNEUROSCI.5664-09.2010
Engel, M. M., and Keizer, A. (2017). Body representation disturbances in visual perception and affordance perception persist in eating disorder patients after completing treatment. Sci. Rep. 7:16184. doi: 10.1038/s41598-017-16362-w
Evans, P. M., and Craig, J. C. (1991). Tactile attention and the perception of moving tactile stimuli. Percept. Psychophys. 49, 355–364. doi: 10.3758/BF03205993
Evans, P. M., Craig, J. C., and Rinker, M. A. (1992). Perceptual processing of adjacent and nonadjacent tactile nontargets. Percept. Psychophys. 52, 571–581. Retrieved from http://www.ncbi.nlm.nih.gov/pubmed/1437490
Farmer, S. F., Ingram, D. A., and Stephens, J. A. (1990). Mirror movements studied in a patient with Klippel-Feil syndrome. J. Physiol. 428, 467–484. doi: 10.1113/jphysiol.1990.sp018222
Felician, O., Romaiguère, P., Anton, J.-L., Nazarian, B., Roth, M., Poncet, M., et al. (2004). The role of human left superior parietal lobule in body part localization. Ann. Neurol. 55, 749–751. doi: 10.1002/ana.20109
Fiori, F., and Longo, M. R. (2018). Tactile distance illusions reflect a coherent stretch of tactile space. Proc. Natl. Acad. Sci. U. S. A. 115, 1238–1243. doi: 10.1073/pnas.1715123115
Folegatti, A., de Vignemont, F., Pavani, F., Rossetti, Y., and Farnè, A. (2009). Losing one’s hand: visual-proprioceptive conflict affects touch perception. PLoS One 4:e6920. doi: 10.1371/journal.pone.0006920
Fuentes, C. T., Longo, M. R., and Haggard, P. (2013). Body image distortions in healthy adults. Acta Psychol. 144, 344–351. doi: 10.1016/j.actpsy.2013.06.012
Gallace, A., and Spence, C. (2005). Visual capture of apparent limb position influences tactile temporal order judgments. Neurosci. Lett. 379, 63–68. doi: 10.1016/j.neulet.2004.12.052
Ganea, N., and Longo, M. R. (2017). Projecting the self outside the body: body representations underlying proprioceptive imagery. Cognition 162, 41–47. doi: 10.1016/j.cognition.2017.01.021
Gescheider, G. A., Bolanowski, S. J., Pope, J. V., and Verrillo, R. T. (2002). A four-channel analysis of the tactile sensitivity of the fingertip: frequency selectivity, spatial summation, and temporal summation. Somatosens. Mot. Res. 19, 114–124. doi: 10.1080/08990220220131505
Gescheider, G. A., and Migel, N. (1995). Some temporal parameters in vibrotactile forward masking. J. Acoust. Soc. Am. 98, 3195–3199. doi: 10.1121/1.413809
Ghazanfar, A. A., and Schroeder, C. E. (2006). Is neocortex essentially multisensory? Trends Cogn. Sci. 10, 278–285. doi: 10.1016/j.tics.2006.04.008
Goodwin, G. M., McCloskey, D. I., and Matthews, P. B. (1972). Proprioceptive illusions induced by muscle vibration: contribution by muscle spindles to perception? Science 175, 1382–1384. doi: 10.1126/science.175.4028.1382
Goudge, M. E. (1918). A qualitative and quantitative study of Weber’s illusion. Am. J. Psychol. 29:81. doi: 10.2307/1414107
Graziano, M. S., Yap, G. S., and Gross, C. G. (1994). Coding of visual space by premotor neurons. Science 266, 1054–1057. doi: 10.1126/science.7973661
Green, B. G. (1982). The perception of distance and location for dual tactile pressures. Percept. Psychophys. 31, 315–323. doi: 10.3758/BF03202654
Groh, J. M., and Sparks, D. L. (1996). Saccades to somatosensory targets. I. behavioral characteristics. J. Neurophysiol. 75, 412–427. doi: 10.1152/jn.1996.75.1.412
Gross, Y., and Melzack, R. (1978). Body image: dissociation of real and perceived limbs by pressure-cuff ischemia. Exp. Neurol. 61, 680–688. doi: 10.1016/0014-4886(78)90032-8
Gross, Y., Webb, R., and Melzack, R. (1974). Central and peripheral contributions to localization of body parts: evidence for a central body schema. Exp. Neurol. 44, 346–362. doi: 10.1016/0014-4886(74)90201-5
Haggard, P., Kitadono, K., Press, C., and Taylor-Clarke, M. (2006). The brain’s fingers and hands. Exp. Brain Res. 172, 94–102. doi: 10.1007/s00221-005-0311-8
Halmi, K. A., Goldberg, S. C., and Cunningham, S. (1977). Perceptual distortion of body image in adolescent girls: distortion of body image in adolescence. Psychol. Med. 7, 253–257. doi: 10.1017/S0033291700029330
Han, J., Anson, J., Waddington, G., and Adams, R. (2013). Proprioceptive performance of bilateral upper and lower limb joints: side-general and site-specific effects. Exp. Brain Res. 226, 313–323. doi: 10.1007/s00221-013-3437-0
Head, H., and Holmes, G. (1911). Sensory disturbances from cerebral lesions. Brain 34, 102–254. doi: 10.1093/brain/34.2-3.102
Heed, T., and Azañón, E. (2014). Using time to investigate space: a review of tactile temporal order judgments as a window onto spatial processing in touch. Front. Psychol. 5:76. doi: 10.3389/fpsyg.2014.00076
Heed, T., Buchholz, V. N., Engel, A. K., and Röder, B. (2015). Tactile remapping: from coordinate transformation to integration in sensorimotor processing. Trends Cogn. Sci. 19, 251–258. doi: 10.1016/j.tics.2015.03.001
Heed, T., and Röder, B. (2010). Common anatomical and external coding for hands and feet in tactile attention: evidence from event-related potentials. J. Cogn. Neurosci. 22, 184–202. doi: 10.1162/jocn.2008.21168
Hermosillo, R., Ritterband-Rosenbaum, A., and van Donkelaar, P. (2011). Predicting future sensorimotor states influences current temporal decision making. J. Neurosci. 31, 10019–10022. doi: 10.1523/JNEUROSCI.0037-11.2011
Hlushchuk, Y., and Hari, R. (2006). Transient suppression of ipsilateral primary somatosensory cortex during tactile finger stimulation. J. Neurosci. 26, 5819–5824. doi: 10.1523/JNEUROSCI.5536-05.2006
Holmes, N. P., and Tamè, L. (2018). Multisensory perception: magnetic disruption of attention in human parietal lobe. Curr. Biol. 28, R259–R261. doi: 10.1016/j.cub.2018.01.078
Huttenlocher, J., Hedges, L. V., Corrigan, B., and Crawford, L. E. (2004). Spatial categories and the estimation of location. Cognition 93, 75–97. doi: 10.1016/j.cognition.2003.10.006
Huttenlocher, J., Hedges, L. V., and Duncan, S. (1991). Categories and particulars: prototype effects in estimating spatial location. Psychol. Rev. 98, 352–376. doi: 10.1037/0033-295X.98.3.352
Inui, N., Masumoto, J., Beppu, T., Shiokawa, Y., and Akitsu, H. (2012a). Loss of large-diameter nerve sensory input changes perceived posture. Exp. Brain Res. 221, 369–375. doi: 10.1007/s00221-012-3181-x
Inui, N., Masumoto, J., Ueda, Y., and Ide, K. (2012b). Systematic changes in the perceived posture of the wrist and elbow during formation of a phantom hand and arm. Exp. Brain Res. 218, 487–494. doi: 10.1007/s00221-012-3040-9
Inui, N., Walsh, L. D., Taylor, J. L., and Gandevia, S. C. (2011). Dynamic changes in the perceived posture of the hand during ischaemic anaesthesia of the arm. J. Physiol. 589, 5775–5784. doi: 10.1113/jphysiol.2011.219949
Iwamura, Y., Iriki, A., and Tanaka, M. (1994). Bilateral hand representation in the postcentral somatosensory cortex. Nature 369, 554–556. doi: 10.1038/369554a0
Iwamura, Y., Tanaka, M., Iriki, A., Taoka, M., and Toda, T. (2002). Processing of tactile and kinesthetic signals from bilateral sides of the body in the postcentral gyrus of awake monkeys. Behav. Brain Res. 135, 185–190. doi: 10.1016/S0166-4328(02)00164-X
Iwamura, Y., Taoka, M., and Iriki, A. (2001). Bilateral activity and callosal connections in the somatosensory cortex. Neuroscientist 7, 419–46. doi: 10.1177/107385840100700511
Johnson, K. O., and Hsiao, S. S. (1992). Neural mechanisms of tactual form and texture perception. Annu. Rev. Neurosci. 15, 227–250. doi: 10.1146/annurev.ne.15.030192.001303
Jones, E. (1908). The precise diagnostic value of allochiria. Brain 30, 490–532. doi: 10.1093/brain/30.4.490
Jung, P., Klein, J. C., Wibral, M., Hoechstetter, K., Bliem, B., Lu, M. -K., et al. (2012). Spatiotemporal dynamics of bimanual integration in human somatosensory cortex and their relevance to bimanual object manipulation. J. Neurosci. 32, 5667–5677. doi: 10.1523/JNEUROSCI.5957-11.2012
Kanno, A., Nakasato, N., Hatanaka, K., and Yoshimoto, T. (2003). Ipsilateral area 3b responses to median nerve somatosensory stimulation. NeuroImage 18, 169–177. doi: 10.1006/nimg.2002.1283
Kanno, A., Nakasato, N., Nagamine, Y., and Tominaga, T. (2004). Non-transcallosal ipsilateral area 3b responses to median nerve stimulus. J. Clin. Neurosci. 11, 868–871. doi: 10.1016/j.jocn.2004.01.007
Keizer, A., Smeets, M. A. M., Dijkerman, H. C., van den Hout, M., Klugkist, I., van Elburg, A., et al. (2011). Tactile body image disturbance in anorexia nervosa. Psychiatry Res. 190, 115–120. doi: 10.1016/j.psychres.2011.04.031
Keizer, A., Smeets, M. A. M., Dijkerman, H. C., van Elburg, A., and Postma, A. (2012). Aberrant somatosensory perception in anorexia nervosa. Psychiatry Res. 200, 530–537. doi: 10.1016/j.psychres.2012.05.001
Kennett, S., Taylor-Clarke, M., and Haggard, P. (2001). Noninformative vision improves the spatial resolution of touch in humans. Curr. Biol. 11, 1188–1191. doi: 10.1016/S0960-9822(01)00327-X
Kinsbourne, M., and Warrington, E. K. (1962). A study of finger agnosia. Brain 85, 47–66. doi: 10.1093/brain/85.1.47
Kóbor, I., Füredi, L., Kovács, G., Spence, C., and Vidnyánszky, Z. (2006). Back-to-front: improved tactile discrimination performance in the space you cannot see. Neurosci. Lett. 400, 163–167. doi: 10.1016/j.neulet.2006.02.037
Körding, K. P., and Wolpert, D. M. (2004). Bayesian integration in sensorimotor learning. Nature 427, 244–247. doi: 10.1038/nature02169
Kuroki, S., Watanabe, J., Kawakami, N., Tachi, S., and Nishida, S. (2010). Somatotopic dominance in tactile temporal processing. Exp. Brain Res. 203, 51–62. doi: 10.1007/s00221-010-2212-8
Lackner, J. R. (1988). Some proprioceptive influences on the perceptual representation of body shape and orientation. Brain 111, 281–297. doi: 10.1093/brain/111.2.281
Lakatos, S., and Shepard, R. N. (1997). Time-distance relations in shifting attention between locations on one’s body. Percept. Psychophys. 59, 557–566. doi: 10.3758/BF03211864
Le Cornu Knight, F., Cowie, D., and Bremner, A. J. (2017). Part-based representations of the body in early childhood: evidence from perceived distortions of tactile space across limb boundaries. Dev. Sci. 20:e12439 doi: 10.1111/desc.12439
Le Cornu Knight, F., Longo, M. R., and Bremner, A. J. (2014). Categorical perception of tactile distance. Cognition 131, 254–262. doi: 10.1016/j.cognition.2014.01.005
Ley, P., Bottari, D., Shenoy, B. H., Kekunnaya, R., and Röder, B. (2013). Partial recovery of visual-spatial remapping of touch after restoring vision in a congenitally blind man. Neuropsychologia 51, 1119–1123. doi: 10.1016/j.neuropsychologia.2013.03.004
Leyrer, M., Linkenauger, S. A., Bülthoff, H. H., and Mohler, B. J. (2015). The importance of postural cues for determining eye height in immersive virtual reality. PLoS One 10:e0127000. doi: 10.1371/journal.pone.0127000
Limanowski, J., and Blankenburg, F. (2016). Integration of visual and proprioceptive limb position information in human posterior parietal, premotor, and extrastriate cortex. J. Neurosci. 36, 2582–2589. doi: 10.1523/JNEUROSCI.3987-15.2016
Linkenauger, S. A., Bülthoff, H. H., and Mohler, B. J. (2015). Virtual arm’s reach influences perceived distances but only after experience reaching. Neuropsychologia 70, 393–401. doi: 10.1016/j.neuropsychologia.2014.10.034
Linkenauger, S. A., Ramenzoni, V., and Proffitt, D. R. (2010). Illusory shrinkage and growth: body-based rescaling affects the perception of size. Psychol. Sci. 21, 1318–1325. doi: 10.1177/0956797610380700
Linkenauger, S. A., Wong, H. Y., Geuss, M., Stefanucci, J. K., McCulloch, K. C., Bülthoff, H. H., et al. (2014). The perceptual homunculus: the perception of the relative proportions of the human body. J. Exp. Psychol. Gen. 144, 103–113. doi: 10.1037/xge0000028
Lipton, M. L., Fu, K.-M. G., Branch, C. A., and Schroeder, C. E. (2006). Ipsilateral hand input to area 3b revealed by converging hemodynamic and electrophysiological analyses in macaque monkeys. J. Neurosci. 26, 180–185. doi: 10.1523/JNEUROSCI.1073-05.2006
Lohmann, J., and Butz, M. V. (2017). Lost in space: multisensory conflict yields adaptation in spatial representations across frames of reference. Cogn. Process. 18, 211–228. doi: 10.1007/s10339-017-0798-5
Longo, M. R. (2015). Implicit and explicit body representations. Eur. Psychol. 20, 6–15. doi: 10.1027/1016-9040/a000198
Longo, M. R. (2017a). Distorted body representations in healthy cognition. Q. J. Exp. Psychol. 70, 378–388. doi: 10.1080/17470218.2016.1143956
Longo, M. R. (2017b). Hand posture modulates perceived tactile distance. Sci. Rep. 7:9665. doi: 10.1038/s41598-017-08797-y
Longo, M. R., Azañón, E., and Haggard, P. (2010). More than skin deep: body representation beyond primary somatosensory cortex. Neuropsychologia 48, 655–668. doi: 10.1016/j.neuropsychologia.2009.08.022
Longo, M. R., Ghosh, A., and Yahya, T. (2015a). Bilateral symmetry of distortions of tactile size perception. Perception 44, 1251–1262. doi: 10.1177/0301006615594949
Longo, M. R., and Golubova, O. (2017). Mapping the internal geometry of tactile space. J. Exp. Psychol. Hum. Percept. Perform. 43, 1815–1827. doi: 10.1037/xhp0000434
Longo, M. R., and Haggard, P. (2009). Sense of agency primes manual motor responses. Perception 38, 69–78. doi: 10.1068/p6045
Longo, M. R., and Haggard, P. (2010). An implicit body representation underlying human position sense. Proc. Natl. Acad. Sci. U. S. A. 107, 11727–11732. doi: 10.1073/pnas.1003483107
Longo, M. R., and Haggard, P. (2011). Weber’s illusion and body shape: anisotropy of tactile size perception on the hand. J. Exp. Psychol. Hum. Percep. Perfor. 37, 720–726. doi: 10.1037/a0021921
Longo, M. R., and Haggard, P. (2012a). A 2.5-D representation of the human hand. J. Exp. Psychol. Hum. Percept. Perform. 38, 9–46. doi: 10.1037/a0025428
Longo, M. R., and Haggard, P. (2012b). Implicit body representations and the conscious body image. Acta Psychol. 141, 164–46. doi: 10.1016/j.actpsy.2012.07.015
Longo, M. R., and Lourenco, S. F. (2007). Space perception and body morphology: extent of near space scales with arm length. Exp. Brain Res. 177, 285–290. doi: 10.1007/s00221-007-0855-x
Longo, M. R., Mancini, F., and Haggard, P. (2015b). Implicit body representations and tactile spatial remapping. Acta Psychol. 160, 77–87. doi: 10.1016/j.actpsy.2015.07.002
Longo, M. R., and Sadibolova, R. (2013). Seeing the body distorts tactile size perception. Cognition 126, 475–481. doi: 10.1016/j.cognition.2012.11.013
Lourenco, S. F., Longo, M. R., and Pathman, T. (2011). Near space and its relation to claustrophobic fear. Cognition 119, 448–453. doi: 10.1016/j.cognition.2011.02.009
Macaluso, E. (2006). Multisensory processing in sensory-specific cortical areas. Neuroscientist 12, 327–338. doi: 10.1177/1073858406287908
Maravita, A., and Iriki, A. (2004). Tools for the body (schema). Trends Cogn. Sci. 8, 79–86. doi: 10.1016/j.tics.2003.12.008
Marzi, C. A. (1999). The Poffenberger paradigm: a first, simple, behavioural tool to study interhemispheric transmission in humans. Brain Res. Bull. 50, 421–422. doi: 10.1016/S0361-9230(99)00174-4
Mauguière, F., Merlet, I., Forss, N., Vanni, S., Jousmäki, V., Adeleine, P., et al. (1997). Activation of a distributed somatosensory cortical network in the human brain: a dipole modelling study of magnetic fields evoked by median nerve stimulation. Part II: effects of stimulus rate, attention and stimulus detection. Electroencephalogr. Clin. Neurophysiol. 104, 290–295. doi: 10.1016/S0013-4694(97)00018-7
Medina, J., and Rapp, B. (2008). Phantom tactile sensations modulated by body position. Curr. Biol. 18, 1937–1942. doi: 10.1016/j.cub.2008.10.068
Meermann, R. (1983). Experimental investigation of disturbances in body image estimation in anorexia nervosa patients, and ballet and gymnastics pupils. Int. J. Eat. Disord. 2, 91–100. doi: 10.1002/1098-108X(198322)2:4<91::AID-EAT2260020416>3.0.CO;2-Z
Melzack, R., and Bromage, P. R. (1973). Experimental phantom limbs. Exp. Neurol. 39, 261–269. doi: 10.1016/0014-4886(73)90228-8
Miller, L. E., Cawley-Bennett, A., Longo, M. R., and Saygin, A. P. (2017a). The recalibration of tactile perception during tool use is body-part specific. Exp. Brain Res. 235, 2917–2926. doi: 10.1007/s00221-017-5028-y
Miller, L. E., Longo, M. R., and Saygin, A. P. (2014). Tool morphology constrains the effects of tool use on body representations. J. Exp. Psychol. Hum. Percept. Perform. 40, 2143–2153. doi: 10.1037/a0037777
Miller, L. E., Longo, M. R., and Saygin, A. P. (2016). Mental body representations retain homuncular shape distortions: evidence from Weber’s illusion. Conscious. Cogn. 40, 17–25. doi: 10.1016/j.concog.2015.12.008
Miller, L. E., Longo, M. R., and Saygin, A. P. (2017b). Visual illusion of tool use recalibrates tactile perception. Cognition 162, 32–40. doi: 10.1016/j.cognition.2017.01.022
Mölbert, S. C., Sauer, H., Dammann, D., Zipfel, S., Teufel, M., Junne, F., et al. (2016). Multimodal body representation of obese children and adolescents before and after weight-loss treatment in comparison to normal-weight children. PLoS One 11:e0166826. doi: 10.1371/journal.pone.0166826
Moro, V., Zampini, M., and Aglioti, S. M. (2004). Changes in spatial position of hands modify tactile extinction but not disownership of contralesional hand in two right brain-damaged patients. Neurocase 10, 437–443. doi: 10.1080/13554790490894020
Moscovitch, M., and Behrmann, M. (1994). Coding of spatial information in the somatosensory system: evidence from patients with neglect following parietal lobe damage. J. Cogn. Neurosci. 6, 151–155. doi: 10.1162/jocn.1994.6.2.151
Nelson, A. J., and Chen, R. (2008). Digit somatotopy within cortical areas of the postcentral gyrus in humans. Cereb. Cortex 18, 2341–2351. doi: 10.1093/cercor/bhm257
Nissen, H. W., Chow, K. L., and Semmes, J. (1951). Effects of restricted opportunity for tactual, kinesthetic, and manipulative experience on the behavior of a chimpanzee. Am. J. Psychol. 64, 485–507. doi: 10.2307/1418189
Obersteiner, H. (1881). On allochiria: a peculiar sensory disorder. Brain 4, 153–163. doi: 10.1093/brain/4.2.153
Ogden, J. A. (1985). Autotopagnosia. Occurrence in a patient without nominal aphasia and with an intact ability to point to parts of animals and objects. Brain 108, 1009–1022. doi: 10.1093/brain/108.4.1009
Overvliet, K. E., Azañón, E., and Soto-Faraco, S. (2011). Somatosensory saccades reveal the timing of tactile spatial remapping. Neuropsychologia 49, 3046–3052. doi: 10.1016/j.neuropsychologia.2011.07.005
Pagel, B., Heed, T., and Röder, B. (2009). Change of reference frame for tactile localization during child development. Dev. Sci. 12, 929–937. doi: 10.1111/j.1467-7687.2009.00845.x
Pandya, D. N., and Vignolo, L. A. (1969). Interhemispheric projections of the parietal lobe in the rhesus monkey. Brain Res. 15, 49–65. doi: 10.1016/0006-8993(69)90309-6
Penfield, W., and Boldrey, E. (1937). Somatic motor and sensory representation in the cerebral cortex of man as studied by electrical stimulation. Brain 60, 389–443. doi: 10.1093/brain/60.4.389
Pierson, J. M., Bradshaw, J. L., Meyer, T. F., Howard, M. J., and Bradshaw, J. A. (1991). Direction of gaze during vibrotactile choice reaction time tasks. Neuropsychologia 29, 925–928. doi: 10.1016/0028-3932(91)90056-E
Pisella, L., Gréa, H., Tilikete, C., Vighetto, A., Desmurget, M., Rode, G., et al. (2000). An “automatic pilot” for the hand in human posterior parietal cortex: toward reinterpreting optic ataxia. Nat. Neurosci. 3, 729–736. doi: 10.1038/76694
Poffenberger, A. T. (1912). Reaction time to retinal stimulation with special reference to the time lost in conduction through nerve centers. Arch. Psychol. 23, 1–73.
Pouget, A., Deneve, S., and Duhamel, J.-R. (2002). A computational perspective on the neural basis of multisensory spatial representations. Nat. Rev. Neurosci. 3, 741–747. doi: 10.1038/nrn914
Powell, T. P., and Mountcastle, V. B. (1959). Some aspects of the functional organization of the cortex of the postcentral gyrus of the monkey: a correlation of findings obtained in a single unit analysis with cytoarchitecture. Bull. Johns Hopkins Hosp. 105, 133–162.
Prablanc, C., Echallier, J. E., Jeannerod, M., and Komilis, E. (1979). Optimal response of eye and hand motor systems in pointing at a visual target. II. Static and dynamic visual cues in the control of hand movement. Biol. Cybern. 35, 183–187. doi: 10.1007/BF00337063
Pritchett, L. M., Carnevale, M. J., and Harris, L. R. (2012). Reference frames for coding touch location depend on the task. Exp. Brain Res. 222, 437–445. doi: 10.1007/s00221-012-3231-4
Ramachandran, V. S., Rogers-Ramachandran, D., and Cobb, S. (1995). Touching the phantom limb. Nature 377, 489–490. doi: 10.1038/377489a0
Reed, J. L., Qi, H.-X., and Kaas, J. H. (2011). Spatiotemporal properties of neuron response suppression in owl monkey primary somatosensory cortex when stimuli are presented to both hands. J. Neurosci. 31, 3589–3601. doi: 10.1523/JNEUROSCI.4310-10.2011
Reed, J. L., Qi, H.-X., Zhou, Z., Bernard, M. R., Burish, M. J., Bonds, A. B., et al. (2010). Response properties of neurons in primary somatosensory cortex of owl monkeys reflect widespread spatiotemporal integration. J. Neurophysiol. 103, 2139–2157. doi: 10.1152/jn.00709.2009
Rigato, S., Begum Ali, J., van Velzen, J., and Bremner, A. J. (2014). The neural basis of somatosensory remapping develops in human infancy. Curr. Biol. 24, 1222–1226. doi: 10.1016/j.cub.2014.04.004
Rinker, M. A., and Craig, J. C. (1994). The effect of spatial orientation on the perception of moving tactile stimuli. Percept. Psychophys. 56, 356–362. doi: 10.3758/BF03209769
Ro, T., Wallace, R., Hagedorn, J., Farnè, A., and Pienkos, E. (2004). Visual enhancing of tactile perception in the posterior parietal cortex. J. Cogn. Neurosci. 16, 24–30. doi: 10.1162/089892904322755520
Roberts, R. D., Wing, A. M., Durkin, J., and Humphreys, G. W. (2003). Effects of posture on tactile temporal order judgements. Eurohaptics 300–46.
Röder, B., Rösler, F., and Spence, C. (2004). Early vision impairs tactile perception in the blind. Curr. Biol. 14, 121–124. doi: 10.1016/j.cub.2003.12.054
Röder, B., Spence, C., and Rösler, F. (2002). Assessing the effect of posture change on tactile inhibition-of-return. Exp. Brain Res. 143, 453–462. doi: 10.1007/s00221-002-1019-7
Romano, D., Marini, F., and Maravita, A. (2017). Standard body-space relationships: fingers hold spatial information. Cognition 165, 105–112. doi: 10.1016/j.cognition.2017.05.014
Romo, R., Lemus, L., and de Lafuente, V. (2012). Sense, memory, and decision-making in the somatosensory cortical network. Curr. Opin. Neurobiol. 22, 914–919. doi: 10.1016/j.conb.2012.08.002
Rossetti, Y., Desmurget, M., and Prablanc, C. (1995). Vectorial coding of movement: vision, proprioception, or both? J. Neurophysiol. 74, 457–463. doi: 10.1152/jn.1995.74.1.457
Rossetti, Y., Stelmach, G., Desmurget, M., Prablanc, C., and Jeannerod, M. (1994). The effect of viewing the static hand prior to movement onset on pointing kinematics and variability. Exp. Brain Res. 101, 323–330. doi: 10.1007/BF00228753
Rusconi, E., Tamè, L., Furlan, M., Haggard, P., Demarchi, G., Adriani, M., et al. (2014). Neural correlates of finger gnosis. J. Neurosci. 34, 9012–9023. doi: 10.1523/JNEUROSCI.3119-13.2014
Sakata, H., Takaoka, Y., Kawarasaki, A., and Shibutani, H. (1973). Somatosensory properties of neurons in the superior parietal cortex (area 5) of the rhesus monkey. Brain Res. 64, 85–102. doi: 10.1016/0006-8993(73)90172-8
Sathian, K. (2000). Intermanual referral of sensation to anesthetic hands. Neurology 54, 1866–1868. doi: 10.1212/WNL.54.9.1866
Scarpina, F., Castelnuovo, G., and Molinari, E. (2014). Tactile mental body parts representation in obesity. Psychiatry Res. 220, 960–969. doi: 10.1016/j.psychres.2014.08.020
Schady, W. J., Torebjörk, H. E., and Ochoa, J. L. (1983). Cerebral localisation function from the input of single mechanoreceptive units in man. Acta Physiol. Scand. 119, 277–285. doi: 10.1111/j.1748-1716.1983.tb07338.x
Schwoebel, J., and Coslett, H. B. (2005). Evidence for multiple, distinct representations of the human body. J. Cogn. Neurosci. 17, 543–553. doi: 10.1162/0898929053467587
Shen, G., Smyk, N. J., Meltzoff, A. N., and Marshall, P. J. (2018). Neuropsychology of human body parts: exploring categorical boundaries of tactile perception using somatosensory mismatch responses. J. Cogn. Neurosci. 30, 1858–1869. doi: 10.1162/jocn_a_01313
Shontz, F. C. (1969). Perceptual and cognitive aspects of body experience. San Diego, CA: Academic Press.
Shore, D. I., Gray, K., Spry, E., and Spence, C. (2005). Spatial modulation of tactile temporal-order judgments. Perception 34, 1251–46. doi: 10.1068/p3313
Shore, D. I., Spry, E., and Spence, C. (2002). Confusing the mind by crossing the hands. Cogn. Brain Res. 14, 153–163. doi: 10.1016/S0926-6410(02)00070-8
Sirigu, A., Grafman, J., Bressler, K., and Sunderland, T. (1991). Multiple representations contribute to body knowledge processing. Evidence from a case of autotopagnosia. Brain 114, 629–642. doi: 10.1093/brain/114.1.629
Smania, N., and Aglioti, S. (1995). Sensory and spatial components of somaesthetic deficits following right brain damage. Neurology 45, 1725–1730. doi: 10.1212/WNL.45.9.1725
Soto-Faraco, S., Ronald, A., and Spence, C. (2004). Tactile selective attention and body posture: assessing the multisensory contributions of vision and proprioception. Percept. Psychophys. 66, 1077–1094. doi: 10.3758/BF03196837
Spitoni, G. F., Serino, A., Cotugno, A., Mancini, F., Antonucci, G., and Pizzamiglio, L. (2015). The two dimensions of the body representation in women suffering from anorexia nervosa. Psychiatry Res. 230, 181–188. doi: 10.1016/j.psychres.2015.08.036
Stone, K. D., Keizer, A., and Dijkerman, H. C. (2018). The influence of vision, touch, and proprioception on body representation of the lower limbs. Acta Psychol. 185, 22–32. doi: 10.1016/j.actpsy.2018.01.007
Sur, M., Merzenich, M. M., and Kaas, J. H. (1980). Magnification, receptive-field area, and “hypercolumn” size in areas 3b and 1 of somatosensory cortex in owl monkeys. J. Neurophysiol. 44, 295–311. doi: 10.1152/jn.1980.44.2.295
Sutherland, M. T. (2006). The hand and the ipsilateral primary somatosensory cortex. J. Neurosci. 26, 8217–8218. doi: 10.1523/JNEUROSCI.2698-06.2006
Tajadura-Jiménez, A., Väljamäe, A., Toshima, I., Kimura, T., Tsakiris, M., and Kitagawa, N. (2012). Action sounds recalibrate perceived tactile distance. Curr. Biol. 22, R516–R517. doi: 10.1016/j.cub.2012.04.028
Tamè, L., Braun, C., Holmes, N. P., Farnè, A., and Pavani, F. (2016). Bilateral representations of touch in the primary somatosensory cortex. Cogn. Neuropsychol. 33, 1–19. doi: 10.1080/02643294.2016.1159547
Tamè, L., Braun, C., Lingnau, A., Schwarzbach, J., Demarchi, G., Li Hegner, Y., et al. (2012). The contribution of primary and secondary somatosensory cortices to the representation of body parts and body sides: an fMRI adaptation study. J. Cogn. Neurosci. 24, 2306–2320. doi: 10.1162/jocn_a_00272
Tamè, L., Carr, A., and Longo, M. R. (2017a). Vision of the body improves inter-hemispheric integration of tactile-motor responses. Acta Psychol. 175, 21–27. doi: 10.1016/j.actpsy.2017.02.007
Tamè, L., Dransfield, E., Quettier, T., and Longo, M. R. (2017b). Finger posture modulates structural body representations. Sci. Rep. 7:43019. doi: 10.1038/srep43019
Tamè, L., Farnè, A., and Pavani, F. (2011). Spatial coding of touch at the fingers: insights from double simultaneous stimulation within and between hands. Neurosci. Lett. 487, 78–82. doi: 10.1016/j.neulet.2010.09.078
Tamè, L., Farnè, A., and Pavani, F. (2013). Vision of the body and the differentiation of perceived body side in touch. Cortex 49, 1340–1351. doi: 10.1016/j.cortex.2012.03.016
Tamè, L., and Holmes, N. P. (2016). Involvement of human primary somatosensory cortex in vibrotactile detection depends on task demand. NeuroImage 138, 184–196. doi: 10.1016/j.neuroimage.2016.05.056
Tamè, L., and Longo, M. R. (2015). Inter-hemispheric integration of tactile-motor responses across body parts. Front. Hum. Neurosci. 9:345. doi: 10.3389/fnhum.2015.00345
Tamè, L., Moles, A., and Holmes, N. P. (2014). Within, but not between hands interactions in vibrotactile detection thresholds reflect somatosensory receptive field organization. Front. Psychol. 5:174. doi: 10.3389/fpsyg.2014.00174
Tamè, L., Pavani, F., Papadelis, C., Farnè, A., and Braun, C. (2015). Early integration of bilateral touch in the primary somatosensory cortex. Hum. Brain Mapp. 36, 1506–1523. doi: 10.1002/hbm.22719
Tamè, L., Wühle, A., Petri, C. D., Pavani, F., and Braun, C. (2017c). Concurrent use of somatotopic and external reference frames in a tactile mislocalization task. Brain Cogn. 111, 25–33. doi: 10.1016/j.bandc.2016.10.005
Tan, H.-R. M., Wühle, A., and Braun, C. (2004). Unilaterally applied stimuli in a frequency discrimination task are represented bilaterally in primary somatosensory cortex. Neurol. Clin. Neurophysiol. 2004:83.
Taylor-Clarke, M., Jacobsen, P., and Haggard, P. (2004). Keeping the world a constant size: object constancy in human touch. Nat. Neurosci. 7, 219–220. doi: 10.1038/nn1199
Thompson, J. K., and Thompson, C. M. (1986). Body size distortion and self-esteem in asymptomatic, normal weight males and females. Int. J. Eat. Disord. 5, 1061–1068. doi: 10.1002/1098-108X(198609)5:6<1061::AID-EAT2260050609>3.0.CO;2-C
Tinazzi, M., Ferrari, G., Zampini, M., and Aglioti, S. M. (2000). Neuropsychological evidence that somatic stimuli are spatially coded according to multiple frames of reference in a stroke patient with tactile extinction. Neurosci. Lett. 287, 133–136. doi: 10.1016/S0304-3940(00)01157-5
Tipper, S. P., Lloyd, D., Shorland, B., Dancer, C., Howard, L. A., and McGlone, F. (1998). Vision influences tactile perception without proprioceptive orienting. Neuroreport 9, 1741–1744. doi: 10.1097/00001756-199806010-00013
Tommerdahl, M., Simons, S. B., Chiu, J. S., Favorov, O., and Whitsel, B. L. (2006). Ipsilateral input modifies the primary somatosensory cortex response to contralateral skin flutter. J. Neurosci. 26, 5970–5977. doi: 10.1523/JNEUROSCI.5270-05.2006
Tsakiris, M., and Haggard, P. (2005). The rubber hand illusion revisited: visuotactile integration and self-attribution. J. Exp. Psychol. Hum. Percept. Perform. 31, 80–91. doi: 10.1037/0096-1523.31.1.80
Vallar, G. (1997). Spatial frames of reference and somatosensory processing: a neuropsychological perspective. Philos. Trans. R. Soc. B 352, 1401–1409. doi: 10.1098/rstb.1997.0126
Warren, W. H., and Whang, S. (1987). Visual guidance of walking through apertures: body-scaled information for affordances. J. Exp. Psychol. Hum. Percept. Perform. 13, 371–383. doi: 10.1037/0096-1523.13.3.371
Weber, E. H. (1996). De subtilitate tactus. In E. H. Weber on the tactile senses. eds. H. E. Ross and D. J. Murray, 2nd Edn. London: Academic Press, 21–128. (Original work published 1834).
Wühle, A., Preissl, H., and Braun, C. (2011). Cortical processing of near-threshold tactile stimuli in a paired-stimulus paradigm–an MEG study. Eur. J. Neurosci. 34, 641–651. doi: 10.1111/j.1460-9568.2011.07770.x
Keywords: somatosensory processing, space, body representation, laterality, body shape
Citation: Tamè L, Azañón E and Longo MR (2019) A Conceptual Model of Tactile Processing Across Body Features of Size, Shape, Side, and Spatial Location. Front. Psychol. 10:291. doi: 10.3389/fpsyg.2019.00291
Edited by:
Matej Hoffmann, Czech Technical University in Prague, Czechia
Reviewed by:
Elisa Magosso, University of Bologna, Italy
Ashley R. Drew, University of Washington, United States
Copyright © 2019 Tamè, Azañón and Longo. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
*Correspondence: Luigi Tamè, luigi.tame@gmail.com
Elena Azañón, eazanyon@gmail.com
†These authors have contributed equally to this work and share first authorship