- 1mindSPACE Laboratory, Departments of Cognitive Sciences and Language Science (by Courtesy), Center for Hearing Research, University of California, Irvine, Irvine, CA, United States
- 2mindSPACE Laboratory, Department of Cognitive Sciences, University of California, Irvine, Irvine, CA, United States
Cortical processing pathways for sensory information in the mammalian brain tend to be organized into topographical representations that encode various fundamental sensory dimensions. Numerous laboratories have now shown that these representations are organized into cortical field maps (CFMs) across visual and auditory cortex, with each CFM supporting a specialized computation or set of computations that underlie the associated perceptual behaviors. An individual CFM is defined by two orthogonal topographical gradients that reflect two essential aspects of feature space for that sense. Multiple adjacent CFMs are then organized across visual and auditory cortex into macrostructural patterns termed cloverleaf clusters. CFMs within cloverleaf clusters are thought to share properties such as receptive field distribution, cortical magnification, and processing specialization. Recent measurements point to the likely existence of CFMs in the other senses as well, with topographical representations of at least one sensory dimension demonstrated in somatosensory, gustatory, and possibly olfactory cortical pathways. Here we discuss the evidence for CFM and cloverleaf cluster organization across human sensory cortex as well as the approaches used to identify such organizational patterns. Knowledge of how these topographical representations are organized across cortex provides us with insight into how our conscious perceptions are created from our basic sensory inputs. In addition, studying how these representations change during development, trauma, and disease serves as an important tool for developing improvements in clinical therapies and rehabilitation for sensory deficits.
1 Introduction
Topographical representations of sensory information are emerging as a fundamental organizational pattern for perceptual processing across sensory cortex in numerous mammalian species (Kaas, 1997; Wandell et al., 2005; Krubitzer, 2007; Sanchez-Panchuelo et al., 2010; Barton et al., 2012; Prinster et al., 2017; Yushu Chen et al., 2021). Organized topographies within sensory pathways are thought to support the comparison and combination of the information carried by the various specialized neuronal populations. To enhance the brain’s ability to discriminate among different stimuli, sensory neurons that respond to similar features are frequently organized into distinct clusters or columns, and their response characteristics exhibit smooth transitions across the cortical surface. The orderly connectivity arising from such organization is likely important for increasing the efficiency of such local processes as lateral inhibition and gain control and may provide a framework for sensory processing across the sensory hierarchy (Mitchison, 1991; Van Essen, 2003; Chklovskii and Koulakov, 2004; Shapley et al., 2007; Moradi and Heeger, 2009).
In human, the historically most-studied sensory topography is the representation of visual space in the visual system (Van Essen, 2003). Visual cortex contains multiple regions in which neurons are organized with respect to the neural arrangement of the retina, where neighboring photoreceptors respond to neighboring regions of visual space (Wandell et al., 2007). This organization serves as a map of visual space, also known as a visual field map (VFM), which repeats as an organizational pattern from the retina into higher-order visual processing (Engel et al., 1994; Sereno et al., 1995; DeYoe et al., 1996; Engel et al., 1997). Representing the fundamental visual dimensions of eccentricity (i.e., center-to-periphery) and polar angle (i.e., around-the-clock), a VFM is one form of a sensory cortical field map (CFM), a region which encodes at least two primary sensory dimensions (Figure 1; Engel et al., 1997; Wandell et al., 2007). More recent studies have revealed complete CFMs in human auditory cortex, with auditory field maps (AFMs) tiling human primary auditory core and belt regions, and partial CFM topographies in somatosensation (touch/pain) and gustation (taste), suggesting that CFMs serve as the building blocks of sensory processing (Murthy, 2011; Barton et al., 2012; Ma et al., 2012; Mancini et al., 2012; Brewer and Barton, 2016b; Prinster et al., 2017; Sanchez Panchuelo et al., 2018; Saadon-Grosman et al., 2020b; Willoughby et al., 2020). Understanding the characteristics of these CFMs, together with knowledge of the stimulus selectivity of the neurons within them, provides the foundation for understanding the specific computations carried out in particular sensory systems.
Figure 1. Definition of cortical field maps. (A) Schematics depict the two orthogonal dimensions that are required to define a cortical field map. (i) The graph of one sensory dimension (e.g., eccentricity; tonotopy) demonstrates measurements of three stimulus values—1: low (L, red); 2: medium (M, green); 3: high (H, blue). (ii) The graph of a second sensory dimension (e.g., polar angle; periodotopy) demonstrates measurements of three stimulus values—1: low (L, orange); 2: medium (M, cyan); 3: high (H, purple). (B) (i) Schematic depicts a single set of orthogonal gradients composing one CFM—one for each dimension in (A). (ii) Schematic here demonstrates how a reversal in the dimension-2 gradient representations (right) divides up the single representation of the dimension-1 gradient (left) into two CFMs. Gray dotted lines show the boundary defined by the dimension-2 gradient reversal, and arrows denote the low-to-high gradients. (C) (i) In order for each voxel/portion of the CFM to represent a unique combination of dimension 1 and dimension 2 values, the two gradients composing a CFM must be orthogonal. In this case, measurements along the cortical representation of a single value (e.g., green, “M”) of dimension 1 span all values of dimension 2 (right), and vice versa (left). (ii) Diagram demonstrates how vectors drawn from centers of low-stimulus-value regions of interest (ROIs) to high-stimulus-value ROIs for each dimension should have an offset of approximately 90° in a CFM.
2 Cortical field map overview
2.1 CFM characteristics
Accurate identification of individual CFMs is essential for parsing the individual computational stages of sensory processing. Several characteristics are necessary to establish that a particular cortical representation is a CFM and define its borders. First, the topographical representations of each sensory dimension should be organized as an orderly gradient covering a contiguous range of that dimension (Figures 1A,B; Brewer and Barton, 2018). Such a topographical gradient typically represents one aspect of either a peripheral sensing organ (i.e., visual eccentricity across the retina or auditory tonotopy along the basilar membrane of the inner ear) or another important dimension of sensory features (i.e., periodicity in audition). While care must be taken to correctly identify these gradients in fMRI measurements, such organized responses are exceedingly unlikely to emerge in fMRI measurements by chance (Figure 2; for further discussion, see Barton and Brewer, 2017).
Figure 2. Organized orthogonal gradients of sensory representations are unlikely to occur by chance. (A) The square schematic represents a 7 × 7 matrix of voxels, in which each row represents one sensory gradient (red on the left through the rainbow to magenta on the right) evenly distributed across a piece of cortical surface (left). Each color represents a stimulus value spanning 1/7 of the full stimulus space for one dimension (e.g., eccentricity in vision or tonotopy in audition), with the lowest value of stimulus dimension 1 coded as red, and the highest value coded as magenta. For an example in the visual domain, the red squares would then represent voxels with a preference for eccentricity from fixation of 0.00°–1.57° of visual angle for an 11°-radius visual stimulus, green would be 4.71°–6.28°, and brown would be 9.43°–11.00°. No random noise has been added to this matrix; the colored squares represent a perfectly organized topographic map in cortex (note that this level of perfection does not exist in biological systems). The schematic square now represents the second, orthogonal dimension of the same sensory space (e.g., polar angle in vision or periodicity in audition; right). Organized, orthogonal gradients like these must be present for at least two sensory dimensions to define a cortical field map. Note that the regular gradient is still present with no noise, running from the low value of dimension 2 in checkered red (bottom of square) to checkered magenta (top of square). (B) Schematics of the same sensory gradients are now depicted more naturally with some noise added. The random noise has been set so that if a voxel should represent a particular 1/7th of the stimulus range in the gradient [as seen in (A)], it can with equal probability represent an adjacent color. In other words, the measured value falls somewhere within 3/7th of the stimulus range, centered on the correct ("true") value: if a voxel should be yellow in a perfect representation, the noise level would allow it to be orange, yellow, or green with equal probability. The overall direction of each orthogonal gradient is still mostly visible despite the noise. (C) Now the colors have been randomized so that each voxel can with equal probability represent 5/7th of the stimulus range, centered on the correct value (e.g., if a voxel should be yellow, it could be red, orange, yellow, green, or blue, with equal probability). The low-to-high directions of the two gradients are not very apparent, but there is a loose grouping of lower and higher stimulus values on each side. (D) Each voxel has now been randomly assigned to any of the 7 colors with equal probability. No stimulus gradient structure is present. Adapted from Barton and Brewer (2017), licensed under CC BY.
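The noise levels described in Figure 2 can be reproduced with a few lines of code. The sketch below (Python/NumPy; array names and the correlation read-out are illustrative choices, not taken from the original study) builds the idealized 7 × 7 gradient for dimension 1 and perturbs it by ±1 bin, ±2 bins, or full randomization, mirroring panels (A–D); the orthogonal dimension-2 gradient would simply be the transpose of the ideal matrix.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 7  # 7 x 7 grid of simulated voxels, 7 stimulus bins per dimension

# Dimension 1: perfect left-to-right gradient (bin 0 = lowest stimulus value, e.g., red).
dim1 = np.tile(np.arange(n), (n, 1))

def add_jitter(grid, max_offset, rng):
    """Shift each voxel's preferred bin by up to +/- max_offset bins with equal
    probability, clipping at the ends of the stimulus range (cf. panels B and C)."""
    jitter = rng.integers(-max_offset, max_offset + 1, size=grid.shape)
    return np.clip(grid + jitter, 0, n - 1)

panels = {
    "A": dim1,                                  # no noise
    "B": add_jitter(dim1, 1, rng),              # +/- 1 bin (3/7 of the range)
    "C": add_jitter(dim1, 2, rng),              # +/- 2 bins (5/7 of the range)
    "D": rng.integers(0, n, size=(n, n)),       # fully random: no gradient structure
}

for label, grid in panels.items():
    # Correlation with the ideal gradient is a rough index of how much
    # low-to-high structure survives each noise level.
    r = np.corrcoef(grid.ravel(), dim1.ravel())[0, 1]
    print(f"Panel {label}: correlation with ideal gradient = {r:.2f}")
```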
The representation of an individual sensory dimension can often appear as a wide swath of topographical responses across a region of sensory cortex. For example, the representation of visual eccentricity along the occipital pole spreads across the region as a contiguous, apparently unified gradient. Without other markers, it is impossible to determine how such a representation would be divided into the individual CFMs that tile this region and contribute to specific sensory processing steps. Thus, it is important to distinguish between a single topographical gradient and a complete CFM. Simply guessing at how to divide up a single topographical gradient based on factors like anatomical location, data averaged across subjects, or diagrams from homologous monkey data to complete the perpendicular boundaries, as has frequently been done for tonotopic measurements in human auditory cortex, is not sufficient in most cases (for detailed review, see Brewer and Barton, 2016b; for example exceptions, see Formisano et al., 2003; Hinds et al., 2008; Benson et al., 2012). A CFM must be defined by the presence of two overlapping topographical gradients that each represent a different, orthogonal sensory dimension – e.g., visual eccentricity and polar angle or auditory tonotopy and periodotopy (Figure 1; Brewer et al., 2005; Wandell et al., 2007; Barton et al., 2012; Brewer and Barton, 2012). Figure 1A demonstrates a schematic of two orthogonal representations that form a single CFM (Figure 1Bi), while Figure 1Bii shows how a matching representation of dimension 1 can be divided into two CFMs based on the reversal of dimension 2 at the dotted gray line. The measurement of a single gradient across a region of cortex could thus denote a single CFM or many CFMs. As the number of overlapping orthogonal gradients increases, the determination of the CFM organization grows increasingly complex. Cortical regions may even be composed of topographical gradients representing several sensory dimensions, such as the representations of spatial frequency and orientation selectivity that are present in primary visual cortex along with the retinotopic representations of visual space. Thus, at minimum, any two overlapping representations of orthogonal sensory dimensions can be used to define a CFM.
In addition to its two-dimensional (2D) orthogonality (Figure 1C), the internal topography of each CFM should be non-repeating; the computation for a particular CFM should be performed across a single region of the sensory domain (Sereno et al., 1995; DeYoe et al., 1996; Press et al., 2001; Wandell et al., 2007; Barton et al., 2012). Similarly, each CFM should represent a considerable portion of the sensory dimensions, although increases in the magnification of specific parts of sensory space, such as that seen for the central fovea in the visual system, and limitations in measurement resolution may both reduce the measurable range. Finally, while some variation is expected across individuals, the basic overall layout and composition of CFMs should be reliably consistent. CFMs even in low-order visual and auditory cortex can differ substantially in size and anatomical location, but the overall arrangement of adjacent CFMs should be maintained across individuals (Galaburda and Sanides, 1980; Rademacher et al., 1993; Morosan et al., 2001; Rademacher et al., 2001; Schonwiesner et al., 2002; Dougherty et al., 2003; Brewer et al., 2005; Wandell et al., 2007; Clarke and Morosan, 2012).
2.2 CFM boundaries
Using these characteristics to accurately define the boundaries of specific CFMs is key for isolating individual stages of sensory processing and for localizing matching regions across individual subjects that can then be examined more accurately on a group level. The boundaries of repeating, adjacent gradients of one sensory dimension can be determined along the points where the gradient reverses its representation of sensory space (Figure 3). At a gradient reversal, stimulus values represented along the cortical surface increase from low to high (or vice versa) across one CFM to the boundary and then reverse back from high to low (or vice versa) in the next CFM. The boundaries between CFMs are typically drawn to evenly divide the reversals between the two maps, unless additional functional data suggest an alternative approach (e.g., data from a different localizer measurement, such as visual motion or face localizers; Press et al., 2001; Brewer et al., 2005; Larsson et al., 2006; Arcaro et al., 2009; Larsson et al., 2010). CFM data are classically visualized as colors overlaid on the cortical sheet that are matched to corresponding stimulus values (Sereno et al., 1995; DeYoe et al., 1996; Engel et al., 1997). Underlying the specific colors are numerical values that are each associated with a specific stimulus, so manual and automatic approaches to CFM border definitions do not rely on perceived changes in color hue alone but verify these color changes against the underlying data values (Wandell et al., 2005). If a sensory representation exists in isolation—set apart from other contiguous sensory representations that have already been measured—then the borders may be the edges of the overlapping sensory gradients. In this case, there will likely be some blurring or spreading of the representation along the edges, so special care must be taken in these measurements not to overestimate the extent of the isolated CFM (Engel et al., 1994, 1997; Brewer and Barton, 2012). The definition of the very edge of a CFM may therefore have some inaccuracies, but the affected region should only involve the voxels just along the border. As a result, many studies of sensory CFMs remove the voxels along the border from analyses of the internal CFM organization and functional responses to avoid accidentally incorporating voxels from a neighboring CFM in the analysis (Baseler et al., 2002; Brewer et al., 2005; Baseler et al., 2011; Binda et al., 2013).
Figure 3. Cortical field map boundary definitions. (A) Diagram represents the organization of a series of gradients of one sensory dimension (e.g., polar angle—vision; periodotopy—auditory) along a flattened cortical surface. Black arrows denote the gradient directions—low (orange) to medium (cyan) to high (purple). Dashed yellow lines mark gradient “reversals” that are used to define the boundaries between individual cortical field maps. (B) The schematic illustrates how gradient boundaries for one dimension of a CFM are defined at sections where the gradient reverses direction. Hypothetical measurement points along the cortical surface of a region of interest (ROI) are shown as black dots. Black arrows demonstrate the low-to-high direction of each gradient, and dotted yellow lines mark the reversals that separate the data points into four separate gradients (G1, G2, G3, G4).
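Algorithmically, the reversal detection sketched in Figure 3B can be approximated by looking for sign changes in the local slope of the preferred stimulus values sampled along a line on the flattened cortical surface. Below is a minimal sketch under that assumption (Python/NumPy); the function name, the smoothing choice, and the toy data are illustrative rather than part of any published pipeline.

```python
import numpy as np

def find_reversals(preferred_values, smooth=3):
    """Return indices along a cortical path where a topographic gradient
    reverses direction (candidate CFM boundaries, cf. Figure 3B).

    preferred_values : 1-D array of preferred stimulus values (e.g., polar
        angle or period preference) sampled at successive points along a
        line on the flattened cortical surface.
    smooth : width of a moving-average filter applied before looking for
        sign changes in the local slope, to suppress voxel-level noise.
    """
    v = np.asarray(preferred_values, dtype=float)
    if smooth > 1:
        v = np.convolve(v, np.ones(smooth) / smooth, mode="same")
    sign = np.sign(np.diff(v))
    for i in range(1, len(sign)):      # carry the last nonzero sign through flat spots
        if sign[i] == 0:
            sign[i] = sign[i - 1]
    return np.where(sign[:-1] != sign[1:])[0] + 1

# Toy path: four abutting gradients (low-high / high-low / low-high / high-low),
# as in the G1-G4 example of Figure 3B.
path = np.concatenate([np.linspace(0, 1, 10), np.linspace(1, 0, 10),
                       np.linspace(0, 1, 10), np.linspace(1, 0, 10)])
print(find_reversals(path))            # approximate boundary locations
```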
The identification of the exact boundaries between CFMs has historically relied mainly on manual determination of the gradient reversals by experts in those specific sensory measurements. When measured by researchers with extensive practice and attention to the stimulus values along the gradient reversal, expert manual definitions have been very reliable across studies in the visual system, the sensory system with the most research devoted to these measurements in the human brain (Sereno et al., 1995; DeYoe et al., 1996; Dougherty et al., 2003; Wandell et al., 2005, 2007). Manual border definitions can also more easily adapt to individual differences in CFM organization and the general "biological noise" observed among CFM measurements across individuals (e.g., CFM size differences, map rotations; Winawer et al., 2010; Brewer and Barton, 2012; Barton and Brewer, 2017).
The incorporation of more objective approaches for identifying CFM borders is still highly desired, so many groups studying the visual system combine manual border definitions for VFM data with various automated algorithms to aid in their final VFM border determinations (Sereno et al., 1995; Dougherty et al., 2003; Brewer et al., 2005; Larsson and Heeger, 2006). Current algorithms for VFMs are typically applied to both orthogonal eccentricity and polar-angle dimensions simultaneously and can often set up estimates of internal map organization (e.g., iso-eccentricity and iso-angle lines in VFMs). They utilize such approaches as determining the visual-field sign of adjacent VFMs (i.e., mirror vs. non-mirror image representations) or minimizing the error between an expected visual map and the observed data (i.e., atlas-fitting; Sereno et al., 1995; Dougherty et al., 2003; Brewer et al., 2005). The former is best applied to the well-established concentric VFMs of the early visual areas (e.g., V1–V3) and similar maps that abut each other at a polar-angle reversal, while the latter requires careful manual positioning of the initial atlas within the measured data. All of the algorithms for VFM definitions that we are aware of to date perform best with some level of prior knowledge about the expected pattern of CFM organization, which makes it very difficult to apply automated algorithms, without manual guidance, to the measurement of novel VFMs or new CFMs in the other sensory systems. This is an area of research ripe for future expansion.
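As a concrete illustration of the first of these approaches, the sketch below (Python/NumPy; a toy example, not the published algorithm) computes a field-sign map as the sign of the cross product of the local eccentricity and polar-angle gradients on a flattened patch, which flips between mirror-image and non-mirror-image maps such as V1 and V2.

```python
import numpy as np

def visual_field_sign(ecc_map, angle_map):
    """Field sign over a flattened cortical patch: the sign of the z-component
    of the cross product of the local eccentricity and polar-angle gradients.
    It flips between mirror-image and non-mirror-image maps (e.g., V1 vs. V2)."""
    d_ecc_y, d_ecc_x = np.gradient(ecc_map)
    d_ang_y, d_ang_x = np.gradient(angle_map)
    return np.sign(d_ecc_x * d_ang_y - d_ecc_y * d_ang_x)

# Toy patch holding two abutting maps: eccentricity rises along x in both maps,
# while the polar-angle gradient reverses at the shared border (row 20),
# so the two maps should come out with opposite field signs.
y, x = np.mgrid[0:40, 0:20]
ecc = x.astype(float)
ang = np.where(y < 20, y, 2 * 19 - y).astype(float)
fs = visual_field_sign(ecc, ang)
print(fs[5:15, :].mean(), fs[25:35, :].mean())   # opposite signs (+1 and -1)
```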
Within the defined CFM boundaries, the orthogonality of the topographical representations should also be assessed, as this is critical for creating a topography that uniquely represents sensory feature space. If the topographical gradients for each of these dimensions were parallel instead of perpendicular, the representation of visual space that they would form would be only a spiraling sliver rather than the complete coverage of the visual field that the orthogonal orientation provides (Tyler and Wade, 2005). The orthogonality of the two dimensions can be verified by showing that measurements along the cortical representation of a single value of the first dimension span all values of the second dimension, and vice versa (Figure 1Ci). Each gradient is identified as a series of adjacent vectors that share a trajectory from low to high stimulus values along the cortical surface, though they differ between CFMs in overall size as well as in magnification of ranges of stimulus representation (Baseler et al., 2002; Larsson and Heeger, 2006; Barton et al., 2012; Brewer and Barton, 2016b). Thus, orthogonality can be estimated by measuring the direction of each gradient using a series of vectors drawn either manually or automatically from low-to-high stimulus values and then measuring the angle between the vectors for each gradient (Figure 1Cii; Kolster et al., 2009, 2010; Barton et al., 2012; Brewer and Barton, 2016b; Barton and Brewer, 2017). Due to noise factors within the biological system and measurement limitations of fMRI, some deviation from 90° can be tolerated, but the two dimensions should still never be parallel (e.g., Larsson and Heeger, 2006). More complex measurements of the fidelity of each gradient and their orthogonality within a CFM have been attempted; for example, researchers in the visual system have used atlas-fitting algorithms to compare the "goodness of fit" of expected representations for a given VFM to the data (Brewer et al., 2005; Dougherty et al., 2005). However, such an approach requires the manual identification of many CFMs in many subjects to achieve the necessary statistical power and is not currently feasible for individual CFMs in individual subjects.
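The vector-based orthogonality check described here can be sketched as follows (Python/NumPy; the least-squares fit and all names are illustrative assumptions rather than a specific published implementation): the low-to-high direction of each gradient is estimated from the voxel positions and preferred values within a candidate CFM, and the angle between the two directions should fall near 90°.

```python
import numpy as np

def gradient_direction(coords, values):
    """Least-squares estimate of the low-to-high direction of one gradient.
    coords: (n_voxels, 2) positions on the flattened surface; values:
    (n_voxels,) preferred stimulus values for that dimension."""
    X = np.column_stack([coords, np.ones(len(coords))])     # value ~ a*x + b*y + c
    (a, b, _), *_ = np.linalg.lstsq(X, values, rcond=None)
    direction = np.array([a, b])
    return direction / np.linalg.norm(direction)

def gradient_angle_deg(coords, dim1_values, dim2_values):
    """Angle between the two gradient directions of a candidate CFM; values
    near 90 deg support orthogonality (Figure 1Cii), while values near 0 or
    180 deg indicate parallel gradients and argue against a CFM."""
    v1 = gradient_direction(coords, dim1_values)
    v2 = gradient_direction(coords, dim2_values)
    return np.degrees(np.arccos(np.clip(np.dot(v1, v2), -1.0, 1.0)))

# Toy CFM: dimension 1 increases along x, dimension 2 along y, plus noise.
rng = np.random.default_rng(1)
coords = rng.uniform(0, 10, size=(200, 2))
dim1 = coords[:, 0] + rng.normal(0, 0.5, 200)    # e.g., eccentricity / tonotopy
dim2 = coords[:, 1] + rng.normal(0, 0.5, 200)    # e.g., polar angle / periodotopy
print(f"{gradient_angle_deg(coords, dim1, dim2):.1f} deg")   # close to 90
```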
2.3 Cloverleaf clusters of CFMs
On a larger scale, CFMs across the majority of human visual and auditory cortex studied to date are organized into patterns called cloverleaf clusters (Brewer et al., 2005; Wandell et al., 2005, 2007; Kolster et al., 2010; Barton et al., 2012; Brewer and Barton, 2012; Barton and Brewer, 2017). This macrostructural pattern has also been observed in the visual system of the macaque (Kolster et al., 2009). Figure 4A depicts the arrangement of the two topographical gradients that compose the CFMs of a single cloverleaf cluster. The term "cloverleaf" comes from the arrangement of the CFMs within a cluster like the leaves of a clover plant. Dimension 1 is organized such that the sensory topography moves from low to high along concentric, circular bands (e.g., visual eccentricity and auditory tonotopy), with an orthogonal dimension 2 then arranged as repeating gradients running in radial bands from the center to the periphery of the representation of dimension 1 like spokes on a wheel (e.g., visual polar angle and auditory periodotopy; Brewer et al., 2005; Wandell et al., 2007; Barton et al., 2012; Brewer and Barton, 2016b). Reversals in dimension 2 divide a single cluster into individual CFMs, while reversals in dimension 1 serve as boundaries among cloverleaf clusters (Figure 4B). This macrostructural pattern is now described as being radially orthogonal (Brewer and Barton, 2012).
Figure 4. Cloverleaf cluster organization. (A) Diagram depicts the representation of one sensory dimension (e.g., eccentricity—vision; tonotopy—auditory) across a flattened region of the cortical surface, with low (red) to medium (green) to high (blue) stimulus values represented in concentric circles (left). Diagram depicts the representation of a second sensory dimension (e.g., polar angle—vision; periodotopy—auditory) across the same region of the cortical surface, with low (orange) to medium (cyan) to high (purple) stimulus values represented in wedges running “around the clock” (right). Four cortical field maps are defined by these orthogonal gradients and arranged in a cloverleaf cluster (Kolster et al., 2009, 2010; Barton et al., 2012; Brewer and Barton, 2012). Dotted lines denote the boundaries defined by gradient reversals (black/white circle: dimension 1 edge; yellow line: dimension 2 reversal). (B) Diagram shows how three cloverleaf clusters each composed of four CFMs can be organized across a region of cortex. Gradient representations for dimension 1 (left) and dimension 2 (right) are shown on what would be the same overlapping section of the cortical sheet. Dotted black/white circles mark the edge of each cluster (C1–3: Cluster 1–3). Neighboring clusters meet at gradient reversals of high (blue) dimension-1 representations (dotted white/black circles). Boundaries between individual CFMs within each cluster are again marked with dotted yellow lines.
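For readers who want to build intuition for this radially orthogonal layout, the following toy construction (Python/NumPy; purely illustrative and not drawn from the cited studies) generates the two overlapping gradients of a single schematic cloverleaf cluster: concentric bands for dimension 1 and repeated radial wedges for dimension 2 whose reversals mark the CFM boundaries.

```python
import numpy as np

def cloverleaf_cluster(size=64, n_maps=4):
    """Toy model of one cloverleaf cluster on a flattened cortical patch.

    Returns two arrays with values in [0, 1]:
      dim1 - concentric representation (e.g., eccentricity or tonotopy),
             low at the cluster center and high at its rim.
      dim2 - radial representation (e.g., polar angle or periodotopy),
             repeated as n_maps wedge-shaped gradients whose reversals mark
             the boundaries between the CFMs of the cluster (cf. Figure 4A).
    """
    y, x = np.mgrid[0:size, 0:size]
    cx = cy = (size - 1) / 2.0
    radius = np.hypot(x - cx, y - cy)
    theta = np.arctan2(y - cy, x - cx) % (2 * np.pi)   # angle around the cluster center
    dim1 = np.clip(radius / radius.max(), 0, 1)        # concentric low-to-high bands
    wedge = theta * n_maps / (2 * np.pi)               # which of the n_maps wedges
    frac = wedge % 1.0                                 # position within the wedge
    # Alternate the direction of successive wedges so that neighboring CFMs
    # meet at dimension-2 reversals (the yellow dotted lines in Figure 4).
    dim2 = np.where(np.floor(wedge) % 2 == 0, frac, 1.0 - frac)
    return dim1, dim2

dim1, dim2 = cloverleaf_cluster()
# dim1 and dim2 can be displayed as pseudocolor overlays analogous to the left
# and right panels of Figure 4A, respectively.
```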
The spatial organization of cloverleaf clusters is reminiscent of the organization of orientation pinwheels at a smaller spatial scale, with both consisting of smoothly changing representations that appear to blend together across swaths of cortex (Grinvald et al., 1986; Bonhoeffer and Grinvald, 1991; Maldonado et al., 1997; Ohki et al., 2006). Grouping together neurons with similar selectivity in this way is likely to not only help minimize axonal connectivity to optimize energetic efficiency, but also to influence synaptic integration and coordinate neural computations (Schummers et al., 2002; Chklovskii and Koulakov, 2004; Shapley et al., 2007; Moradi and Heeger, 2009). It is thus thought that neurons within each cluster share common computational resources, such as short-term information storage, or coordinate neural timing across the sensory hierarchy (Press et al., 2001; Brewer et al., 2005; Wandell et al., 2005; Barton et al., 2012; Barton and Brewer, 2017; Landi et al., 2021; Qasim et al., 2021). Perceptual specializations, such as visual processing of color or motion, similarly appear to be mostly organized by clusters of CFMs rather than individual CFMs (Zeki and Bartels, 1999; Bartels and Zeki, 2000; Brewer et al., 2005; Brewer and Barton, 2018). The MT cluster with homologous organizations in human (TO or hMT+) and macaque (MT+) is an excellent illustration of this cluster-based perceptual processing (Wandell et al., 2007; Amano et al., 2009; Kolster et al., 2009, 2010). The MT+ cluster in macaque is composed of four VFMs—MT, MST, FST, and V4t, all of which contribute to unique stages of visual motion perception (Kolster et al., 2009). The hMT+ cluster in human similarly contains 4 VFMs involved in visual motion, although the specific homologies to the macaque VFMs are still under study (Huk et al., 2002; Kolster et al., 2010; Brewer and Barton, 2012). Such cloverleaf cluster organization of CFMs likely reflects how multiple stages in a sensory processing pathway might arise through evolution.
It will be interesting for future research to determine how widespread the cloverleaf cluster organization is across the senses. VFMs in the frontal lobe such as the frontal eye fields (FEF) and the regions in the dorsolateral prefrontal cortex (DLPFC) appear in currently published data to be isolated retinotopic hemifield representations that are not organized into cloverleaf clusters, but emerging reports from preliminary data suggest that additional maps are present in these regions and may be organized into clusters (Hagler and Sereno, 2006; Saygin and Sereno, 2008; Silver and Kastner, 2009). The few AFMs that have been measured in human auditory core and belt do appear to be organized into cloverleaf clusters, but we know little yet of the topographical representations of auditory dimensions that likely extend along the lateral fissure (Barton et al., 2012; Brewer and Barton, 2016b). While cloverleaf clusters have not yet been observed in our current measurements of the somatosensory system or the chemical senses, we have only limited measurements of the associated topographical representations for each in the human brain. A more complete understanding of the extent of cloverleaf-cluster organization will be important for insight into how such topographical representations evolved across the senses and among species (Krubitzer, 2007; Wandell and Smirnakis, 2009).
2.4 CFM comparisons across cortex and species
As our study of CFMs expands across cortex and species, it is useful to keep in mind the possible ways that these representations may be changing under evolutionary pressures. Evolution is ongoing, continually molding organisms as their environments change. The organization and functional specialization of CFMs and cloverleaf clusters are unlikely to have reached an evolutionary endpoint, so the cortical sensory representations that we are measuring may not be perfectly organized or may show specific types of variations across individuals or species (for detailed discussion, see Krubitzer, 2007). Consideration of these types of changes can help to improve our localization of specific CFMs across individuals, our identification of new CFMs in various sensory systems, and our recognition of the homologies across species.
Figures 5–7 demonstrate several changes that CFMs may be undergoing across individuals, species or sensory domains (Krubitzer and Seelke, 2012). The overall size of a CFM may vary across individuals or species (Figures 5A,B), or there may be changes in the cortical magnification of specific parts of the internal topography of a CFM that correlate with differences in sensory experiences or perceptual needs (Figure 5C). In the human visual system, for example, primary visual cortex (V1) can vary by at least a factor of three in surface area, independent of overall brain size (Dougherty et al., 2003). Research is still exploring how these differences in V1 size correlate with differences in visual behavior and sensory sensitivity. Along these lines, Schwarzkopf and Rees (2013) found that illusory size perception can be influenced by differences in the cortical magnification of the central foveal representation in V1. More complex changes in internal topography can also arise among individuals or species, such as the emergence of small modules or sub-maps within a section of a CFM (Figure 5D). The appearance of such sub-topographies may reflect adaptations driven by early developmental differences or experience in particular individuals or may be the result of mutations that could eventually lead to the emergence of new cortical maps within a particular sensory system.
Figure 5. Potential changes within cortical field maps over evolution. Schematic diagrams depict several ways cortical field maps can change over the course of evolution, important for consideration of potential homology of CFMs among species, individuals, and sensory cortices (see Krubitzer and Seelke, 2012 for extended discussion). Each schematic shows two pictures of the same CFM, one for each orthogonal dimension (e.g., dimension 1: visual eccentricity; dimension 2: visual polar angle). (A) Example of the baseline CFM with 3 colors coded for representations of the low (L), middle (M), and high (H) sensory values for each orthogonal dimension. Subsequent schematics show changes with respect to this initial CFM. (B) Overall size of CFM may be reduced. (C) The magnification of a particular part of the internal representations [e.g., middle value (M)] may increase for dimension 1 (i) and/or dimension 2 (ii). (D) New representations may be in the process of emerging or combining within a complete CFM. (i) Additional segments of high-value (H; blue) representations of dimension 1 are present within the medium-value (M; green) representations. (ii) Additional segments of medium-value (M; cyan) representations of dimension 2 are present within the high-value (H; purple) and low-value (L; orange) representations. (iii) A smaller complete CFM exists within the larger CFM. Other details are as in Figure 4.
Figure 6. Potential changes in cloverleaf clusters over evolution. (A) (i) Schematic depicts the two orthogonal gradients for a cloverleaf cluster composed of two CFMs. (ii) Evolution of additional CFMs within the cluster could alter the internal structure of the cluster to now be composed of 4 CFMs. (B) New clusters of CFMs may emerge in adjoining regions, with the cortical-sheet territory around one cluster (i) expanding to include more distinct clusters (ii). Such expansions of CFMs within a cluster or of clusters themselves may correlate with expansions in the related sensory behaviors. M1–4 = map 1–4; C1–4 = cluster 1–4. Other details are as in Figures 4, 5.
Figure 7. Potential large-scale brain changes over evolution. (A) The amount of cortical territory devoted to one sensory system may cede territory to another in conjunction with the expansion or reduction of their related behaviors. Cartoons of a left hemisphere are shown with colored overlays representing hypothetical regions of cortex devoted to auditory processing (blue) and visual processing (red). In (i), more cortex is devoted to visual than auditory processing, while the opposite would be true in (ii). CS, central sulcus (purple); LS, lateral sulcus (green); STS, superior temporal sulcus (orange); OP, occipital pole (*). (B) 3-D inflated renderings of a human brain (i) and macaque brain (ii) are scaled for approximate relative size and shown at the white-gray boundary to demonstrate the changes in the size of the cortical sheet over 25 million years of evolution between the two species (Hedges and Kumar, 2003). An increase in overall cortical sheet size could accommodate expansions of CFMs, such as those depicted in Figures 5, 6, and lead to correlated increases in the complexity of associated behaviors. Anatomical-directions legend: S, superior; I, inferior; P, posterior; A, anterior.
On a larger scale, there may be changes in the number of CFMs and/or cloverleaf clusters within a region of the cortical sheet or within a particular perceptual processing pathway (Figure 6). An expansion in the number of CFMs may underlie an expansion in perceptual abilities, with new CFMs supporting new aspects of behavior. For example, visual object recognition in human arises from a large swath of cortex that contains numerous VFMs that are thought to support various aspects of visual object processing: e.g., hV4; VO-1, VO-2; PHC-1, PHC-2; LO-1, LO-2 (Wade et al., 2002; Brewer et al., 2005; Wandell et al., 2005; Larsson and Heeger, 2006; Larsson et al., 2006; Montaser-Kouhsari et al., 2007; Arcaro et al., 2009). In comparison, the homologous regions in macaque monkey are much smaller, comprising such areas as TEO and V4 (Desimone and Schein, 1987; Gattass et al., 1988; Boussaoud et al., 1991; Tanaka et al., 1991; Nakamura et al., 1993; Brewer et al., 2002). With ~25 million years of evolution separating humans and macaques, it is not surprising that we see differences in the complexity of visual object processing that are likely associated with similar differences in the complexity of object use (Hedges and Kumar, 2003; for additional discussion, see DiCarlo et al., 2012). One can imagine genetic duplications reminiscent of the homeobox genes involved in body-structure patterning or the eph/ephrin pairs driving topographical connectivity that could underlie the expansion of cloverleaf clusters of CFMs and their associated behaviors; additional CFM clusters may emerge through genetic duplications and thus provide an increase in cortical territory available to support a more complex range of behaviors in a particular sensory processing stream (Crawford, 2003; Kmita and Duboule, 2003; Holland and Takahashi, 2005; Lappin et al., 2006). Such expansions in cortical territory and associated behavioral complexity are indeed observed across sensory systems and among many species (Figure 7; Krubitzer, 2007; Krubitzer and Seelke, 2012).
3 Measurement techniques for cortical field maps
3.1 Phase-encoded fMRI: using traveling-wave stimulation to measure CFMs
One of the gold standards for measuring CFMs in human using fMRI is a phase-encoded paradigm that relies on a stimulus sequentially activating regions across sensory space (Figure 8; Engel et al., 1994; Sereno et al., 1995; DeYoe et al., 1996; Sanchez-Panchuelo et al., 2010; Barton et al., 2012; Mancini et al., 2012; Kolasinski et al., 2016a). “Phase-encoded” refers to the tie between the cortical activation and the periodic sensory stimulus; as the stimulus moves through sensory space, neural activity increases within the corresponding cortical sensory representations. With repetitions of the stimulus movement, the neural activity within the associated cortical representations is modulated in sync with the stimulus repetition. The cortical response is matched to its sensory topography through its correlation to the timing, or phase, of the stimulus presentation.
Figure 8. Phase-encoded neuroimaging paradigms for cortical field mapping. (A) Schematic presents an example paradigm for measuring visual field maps using phase-encoded fMRI. (i) Typical stimuli used for visual field mapping are composed of black and white moving checkerboard patterns on a neutral gray background, as shown for the expanding ring stimulus. A 2-second presentation of this example visual stimulus, stimulating one central position in visual space (i.e., one stimulus phase), is represented by the striped orange bar. For visual stimuli, scanner acquisition occurs simultaneously with the stimulus presentation (Engel et al., 1994; Wandell et al., 2007; Brewer and Barton, 2012). (ii) One full stimulus cycle consists of several blocks of the visual stimulus stepping through visual space. Each phase of the expanding ring stimulus is displayed above the blocks; one block thus represents one stimulus position in the 'phase-encoded' sequence. The six striped-orange blocks together compose one stimulus cycle (cyan bar). The term 'traveling wave' is also used to describe this type of stimulus presentation, as the stimuli produce a sequential activation of representations across a topographically organized cortical region. (iii) A full, single scan to measure VFMs is then composed of a number of cycles of the stimulus moving through visual space (e.g., 6 cycles shown in cyan). (B) Schematic presents an example paradigm for measuring auditory field maps using phase-encoded fMRI. (i) The top diagram shows how the auditory stimulus presentation (striped orange bar) is separated from the noise of the scanner acquisition (solid orange bar) in a phase-encoded, sparse-sampling fMRI paradigm (Petkov et al., 2009). The delayed timing of the acquisition collects the peak cortical response to the auditory stimulus, in accordance with the approximate hemodynamic delay. (ii) Typical stimuli used for auditory field mapping consist of a series of tones, frequencies, or noise bands (e.g., narrow-band noise for tonotopy and broad-band noise for periodotopy), as shown in the gray table. Each stimulus block is composed of a single tone or noise band and the scanner acquisition period. The diagram shows 6 blocks (striped orange + solid orange) of consecutive frequency ranges grouped together into one stimulus cycle (cyan bar). (iii) The diagram again shows a full, single scan comprising 6 cycles. (C) Schematic presents an example paradigm for measuring somatosensory field maps using phase-encoded fMRI. (i) Stimuli used for somatosensory field mapping to date have been composed of sequential stimulation of the fingertips by piezo-electric stimulators for vibrotactile sensation, air puffs for light touch, or radiant-heat lasers for pain. A stimulus block consists of the stimulation (orange striped bar) and a null period. (ii) Phase-encoded measurements again step through the sensory space (e.g., each fingertip) over one stimulus cycle (cyan bar). (iii) As for the other sensory modalities, a single scan consists of multiple cycles, e.g., five in this example. Note color legend in inset.
To measure one sensory dimension with this paradigm, such as eccentricity in vision or tonotopy in audition, a set of stimulus values is presented in an orderly sequence across a range of interest. In the visual system, commonly used stimuli include expanding rings and rotating wedges that are used to measure the dimensions of visual eccentricity and polar angle, respectively. Such retinotopic stimuli are typically composed of a moving checkerboard pattern, which is designed to maximize the response of primary visual cortex (V1; Engel et al., 1994; Sereno et al., 1995; DeYoe et al., 1996; Wandell et al., 2007). To measure visual eccentricity, for example, the expanding ring stimulus would start as a small disc in the center of the field of view (i.e., at the fixation point) and would sequentially step out as a narrow annulus from the center out to the visual periphery (Engel et al., 1997). This range would constitute one stimulus cycle (Figure 8A). Over a single scan, the expanding ring would repeat this movement several times to increase the power of the measurement. For tonotopic measurements, a set of frequencies would be presented in order from low to high for one stimulus cycle, for example, and this cycle would again repeat several times during one scan (Figure 8B; Talavage et al., 2004; Humphries et al., 2010; Barton et al., 2012; Brewer and Barton, 2016b). With auditory stimuli, the MR scanner noise must be taken into account as a possible source of contamination of the auditory signal. A sparse-sampling approach avoids this contamination by separating the auditory stimulus and the scanner acquisition in time (Figure 8Bi; Bandettini et al., 1998; Scarff et al., 2004; Gaab et al., 2007; Petkov et al., 2009; Joly et al., 2014). Somatotopic measurements would ideally arise similarly from sequential activation over the entire skin or dermal zone of interest. Due to the complexity of such stimulation across such a large organ as the skin and within the MR environment, somatotopic tactile and pain measurements have so far been restricted to more selective sampling across a relatively restricted region, such as the fingertips or selected points on the head, finger/hand, body, and foot/leg (Figure 8C; Ruben et al., 2001; Sanchez-Panchuelo et al., 2010; Mancini et al., 2012; Kolasinski et al., 2016a; Sanchez Panchuelo et al., 2018; Schellekens et al., 2018; Willoughby et al., 2020). For all of these approaches, the value of the stimulus that most effectively drives each cortical location—e.g., specific degrees of visual eccentricity, auditory frequency, or location on the skin—is then estimated from the pattern of neural responses.
With this phase-encoded experimental paradigm, only cortical regions that show a modulation of activity in sync with the stimulus modulation are included in the CFM analysis (Figures 9A,B; for extended discussions, see Wandell et al., 2005; Brewer and Barton, 2016b). Regions that are active at other, non-stimulus frequencies are not included in the measurement. So, for example, if a region responds to the presence of any visual stimulus anywhere in the visual field, that region will remain active throughout the visual stimulus presentation, rather than being active only when the stimulus moves through its spatially restricted zone in the visual field. Only those regions organized around a sensory topography will show phase-encoded activity in response to the traveling-wave stimulus.
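In the standard traveling-wave analysis, this selection criterion corresponds to evaluating each voxel's coherence and phase at the stimulus frequency. A minimal sketch of that computation is shown below (Python/NumPy); the threshold value, the omission of the hemodynamic-delay correction, and the variable names are illustrative assumptions rather than a specific published implementation.

```python
import numpy as np

def phase_encoded_analysis(bold, n_cycles, coherence_threshold=0.3):
    """Classic traveling-wave analysis of phase-encoded fMRI time series.

    bold : (n_voxels, n_timepoints) percent-signal-change time series.
    n_cycles : number of stimulus cycles per scan (e.g., 6 in Figure 8).
    Returns (phase, coherence, keep):
      phase     - response phase at the stimulus frequency, in [0, 2*pi);
                  after removing the hemodynamic delay, this maps onto the
                  preferred stimulus value (eccentricity, tone frequency, ...).
      coherence - amplitude at the stimulus frequency relative to the root
                  sum of squared amplitudes at all non-DC frequencies.
      keep      - boolean mask of voxels modulated in sync with the stimulus.
    """
    spectra = np.fft.rfft(bold, axis=1)
    amps = np.abs(spectra)
    coherence = amps[:, n_cycles] / np.sqrt((amps[:, 1:] ** 2).sum(axis=1))
    phase = np.angle(spectra[:, n_cycles]) % (2 * np.pi)
    return phase, coherence, coherence > coherence_threshold

# Toy voxel: 6 stimulus cycles over 120 time points plus noise.
rng = np.random.default_rng(2)
t = np.arange(120)
voxel = np.sin(2 * np.pi * 6 * t / 120 + 1.0) + rng.normal(0, 0.5, 120)
phase, coh, keep = phase_encoded_analysis(voxel[None, :], n_cycles=6)
print(f"phase = {phase[0]:.2f} rad, coherence = {coh[0]:.2f}, keep = {keep[0]}")
```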
Figure 9. Cortical field mapping analysis. (A) (i) Schematic measurements are shown for two different phases of an expanding-ring visual stimulus (orange = earlier phase; purple = later phase). Although there are on the order of ~1 million neurons within a typical voxel (ii) measured with 3 T MRI for cortical field mapping (Hoffmann et al., 2007; Brewer and Barton, 2016b), such neighboring neurons in topographically organized sensory cortex each have similarly tuned receptive fields (iii) (orange and purple circles with black outlines) with similar preferred centers of maximal response (black dots). Note how the overlapping receptive fields concentrate coverage in one region of sensory space corresponding to the average receptive field of the group. Phase-encoded measurements rely on this organization to estimate the average preferred center for the population of neurons in a given voxel. (B) Diagram displays two example phase-encoded time series with different stimulus responses arising from the orange and purple visual stimuli, respectively. Each plot shows the time series of a single 6-cycle scan of one type of experimental stimulus (e.g., expanding rings) for a single voxel. Note that, in phase-encoded paradigms, only BOLD responses that match the stimulus frequency in terms of cycles per scan are considered as data (Engel et al., 1994; Dumoulin and Wandell, 2008; Brewer and Barton, 2012). Simulated raw data points of percent blood-oxygen-level-dependent (BOLD) modulation (i.e., response amplitude) are indicated by the black dots, while the orange and purple dotted lines denote the sinusoidal fits for two example simulated datasets. Red lines indicate the peak activations per stimulus cycle for these two simulated voxel activations. The horizontal offset of the red lines between the orange and purple sinusoids indicates differences in stimulus selectivity for the populations of neurons in each voxel, as each example voxel is responding to a different stimulus phase. These activations that are encoded to the phase (timing/position) of the stimulus (hence, the term phase-encoded fMRI paradigm) are then represented by different colors in the pseudocolor overlays representing cortical field maps (see VFM schematic shown in Figure 10). Adapted from Brewer and Barton (2012). (C) Population receptive field (pRF) modeling was developed for visual field mapping in order to improve measurements in higher-order visual cortex. This additional analysis allows for not only the measurement of the peak activation (i.e., preferred RF center) for a particular voxel as described in (A,B), but also for the measurement of the average pRF size for the population of neurons in a given voxel (Dumoulin and Wandell, 2008). The parameter estimation procedure for the pRF model is shown as a flow chart. pRF modeling has now also been adapted for measuring tonotopic gradients (Saenz and Langers, 2014; Thomas et al., 2015; Lage-Castellanos et al., 2023) and discrete somatotopic and motor topographical locations (Schellekens et al., 2018). Based on Figure 2 in Dumoulin and Wandell (2008).
3.2 Specialized approaches: population receptive field modeling
A more specialized, model-based approach has been developed to measure VFMs in human cortex using a range of visual stimuli that periodically move through visual space, including the traditional traveling-wave/phase-encoded measurements. This method can collect additional information about VFMs by modeling the population receptive field (pRF) of each voxel within a VFM (Figure 9C; for complete pRF-modeling details, see Dumoulin and Wandell, 2008; for examples of pRF-modeling applications, see Baseler et al., 2011; Haak et al., 2012; Barton and Brewer, 2015, 2017). Within an organized sensory topography, receptive fields (RFs) in each small voxel typically have such similar representations of visual space that the combined, average RF across the population of neurons within each voxel can be estimated as a single, 2D Gaussian. The pRF-modeling method thus provides an assessment of not only the preferred center for the pRF of each voxel, as is measured with phase-encoded mapping alone, but also its size. Although there is some variability in the neural RFs of each voxel in terms of their preferred centers and sizes, termed RF scatter, the pRF analysis provides a good, if slightly larger, estimate of the individual visual RFs in the voxel. Research is currently underway to develop similar pRF models for auditory and somatosensory field maps. To date, pRF modeling has been adapted to measure tonotopic responses—one dimension of AFMs—in human primary auditory cortex (Saenz and Fine, 2010; Thomas et al., 2015; Lage-Castellanos et al., 2023) as well as to model somatotopic and motor responses at discrete points across the fingers and body (Schellekens et al., 2018).
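A highly simplified sketch of the pRF grid search summarized in Figure 9C is shown below (Python/NumPy). It assumes the binary stimulus-aperture frames and candidate parameter grids have already been constructed, uses a crude single-gamma HRF placeholder rather than the standard double-gamma, and is meant only to illustrate the logic of the procedure described by Dumoulin and Wandell (2008), not to reproduce their implementation.

```python
import numpy as np

def fit_prf(bold, apertures, xs, ys, sigmas, hrf):
    """Grid-search estimate of one voxel's population receptive field (pRF).

    bold      : (n_timepoints,) measured BOLD time series for the voxel.
    apertures : (n_timepoints, n_pix, n_pix) binary stimulus masks, one per TR
                (e.g., the expanding-ring / rotating-wedge frames).
    xs, ys, sigmas : candidate pRF centers and sizes in degrees of visual angle.
    hrf       : hemodynamic response function sampled at the TR.
    Returns ((x, y, sigma), r) for the best-correlating candidate.
    """
    n_pix = apertures.shape[-1]
    grid = np.linspace(-10, 10, n_pix)           # assumed +/- 10 deg field of view
    gx, gy = np.meshgrid(grid, grid)
    best = (None, -np.inf)
    for x0 in xs:
        for y0 in ys:
            for sigma in sigmas:
                prf = np.exp(-((gx - x0) ** 2 + (gy - y0) ** 2) / (2 * sigma ** 2))
                drive = apertures.reshape(len(apertures), -1) @ prf.ravel()
                pred = np.convolve(drive, hrf)[:len(drive)]
                r = np.corrcoef(pred, bold)[0, 1]
                if r > best[1]:
                    best = ((x0, y0, sigma), r)
    return best

# Crude single-gamma HRF placeholder (a double-gamma HRF is standard in practice).
t = np.arange(0, 20, 1.0)                         # assuming TR = 1 s
hrf = t ** 5 * np.exp(-t)
hrf /= hrf.sum()

# Tiny synthetic test: an expanding-ring aperture and a voxel with a known pRF.
n_pix, T = 41, 48
grid = np.linspace(-10, 10, n_pix)
gx, gy = np.meshgrid(grid, grid)
radii = np.linspace(1, 10, T)
apertures = np.array([(np.hypot(gx, gy) < r) & (np.hypot(gx, gy) > r - 2)
                      for r in radii], dtype=float)
true_prf = np.exp(-((gx - 3) ** 2 + (gy + 2) ** 2) / (2 * 2.0 ** 2))
bold = np.convolve(apertures.reshape(T, -1) @ true_prf.ravel(), hrf)[:T]
params, r = fit_prf(bold, apertures, xs=np.arange(-8, 9, 1.0),
                    ys=np.arange(-8, 9, 1.0), sigmas=[1.0, 2.0, 4.0], hrf=hrf)
print(params, round(r, 3))                        # approximately (3, -2, 2), r near 1
```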
3.3 Considerations for data acquisition and analysis of CFM measurements
Obtaining a high-quality measurement of topographic cortical representations is a vital step in the accurate definition of CFMs and relies in part on the selection of appropriate phase-encoded stimuli. First, the sampling density of the stimulus values across sensory feature space heavily influences the precision of the CFM measurement. If, for example, a visual stimulus only activates the far periphery of the visual field, then the resulting VFM measurement will be skewed from the actual map (Wandell et al., 2005, 2007). While such a restricted field of view is a rather unlikely approach for visual field mapping, the issue becomes a much more pressing problem when we do not have as clear an understanding of the sensory feature space as we do for visual space. If stimulus values are chosen that are not included within the topographic gradient in a particular CFM, then the attempted measurements will fail to reveal an organized topographical representation within the purported area of the CFM. Similarly, if the sampling density of the stimulus values is too coarse, the precision of the CFM measurements will be poor, because the gradients can only be estimated from the interpolation of just a few sampled responses. When only a few stimulus values are tested across a wide range of sensory space, e.g., only 0° and 90° of visual angle or 400 Hz and 64,000 Hz for auditory stimulation, many parts of the associated cortical representations will be only weakly activated, because no stimulus falls within their preferred stimulus selectivity (Barton et al., 2012; Brewer and Barton, 2016a). Consequently, the fMRI measurements at those cortical locations will be inaccurate, as they would be determined mainly by signals that spread from activity in the surrounding cortex that contains neurons with different stimulus preferences. The estimated stimulus preference for these regions will also be contingent to a much greater degree on the spatial spread of the blood-oxygenation-level-dependent (BOLD) signal that underlies the fMRI measurement (Engel et al., 1994, 1997). This spreading process effectively blurs the data and is subject to other, variable characteristics of the brain, such as vascular density, that can add additional noise into the CFM measurements at that cortical location (for review, see Logothetis and Wandell, 2004; Winawer et al., 2010).
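The consequence of coarse stimulus sampling can be illustrated with a toy simulation (Python/NumPy; all numbers are arbitrary assumptions, not measurements): voxels with Gaussian tuning along one sensory dimension are probed with either a dense or a two-value stimulus set, and the preferred value is read out as a response-weighted average. With only the two extremes sampled, the estimates collapse toward the sampled values and the error grows accordingly.

```python
import numpy as np

def estimate_preference(true_centers, stimulus_values, tuning_width=5.0):
    """Read out each simulated voxel's preferred value as the response-weighted
    mean over the sampled stimuli (a simple center-of-mass estimate)."""
    responses = np.exp(-(true_centers[:, None] - stimulus_values[None, :]) ** 2
                       / (2 * tuning_width ** 2))
    return responses @ stimulus_values / responses.sum(axis=1)

true_centers = np.linspace(0, 90, 10)      # e.g., polar-angle preferences in degrees
dense = np.linspace(0, 90, 19)             # fine sampling of the dimension
coarse = np.array([0.0, 90.0])             # only the two extremes sampled
for label, stims in [("dense", dense), ("coarse", coarse)]:
    err = np.abs(estimate_preference(true_centers, stims) - true_centers)
    print(f"{label:>6} sampling: mean absolute error = {err.mean():.1f} deg")
```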
Similar issues can arise from distortions, signal dropouts, and other artifacts in the fMRI data that can be introduced through interactions between these non-invasive BOLD measurements and the adjacent anatomy and tissues of the head and neck (Logothetis, 2002; Logothetis and Wandell, 2004; Yu et al., 2023). For example, definitions of VFMs in the ventral visual pathway were controversial for many years due to inconsistencies in the measurements across individual subjects until it was shown that differences in the vascular pattern in the region could cause a venous eclipse in the data that erased the measurement of certain ventral VFMs in some subjects (e.g., hV4; Brewer et al., 2005; Winawer et al., 2010). In an ideal world, researchers could compare an image of the vascular system and BOLD data for each individual brain to account for such signal loss, but that is not currently feasible. Larger-scale structures such as dural venous sinuses and air cavities have hindered the collection of certain sensory data across the majority of subjects, leading to the general misinterpretation of sensory processing in the adjacent cortical regions (Zeki, 2003; Brewer et al., 2005; Du et al., 2007; Wandell et al., 2007). Measurements of human auditory cortex are far behind those of visual cortex in part due to distortions and signal loss introduced by the air pockets of the ear canal into fMRI data collected along the lateral fissure (Peelle, 2014; Talavage et al., 2014; Brewer and Barton, 2016b). Improvements in acquisition protocols and increased spatial resolution have now helped measurements overcome this issue for the most part. Regions near orbitofrontal cortex are similarly affected by signal loss caused by neighboring air cavities, thus limiting the measurements of higher-order regions of olfactory and gustatory processing (Hutton et al., 2002; Zelano and Sobel, 2005; Du et al., 2007). The brain anatomy itself can produce limits on the spatial resolution that can be obtained in certain regions, such as the closely abutting gyri of primary somatosensory cortex (S1) and primary motor cortex (M1) across the central sulcus (Penfield and Boldrey, 1937; Woolsey et al., 1979). Partial-voluming effects from single voxels combining data from gray matter on both sides of the sulcus have led to a comparable delay in our ability to properly define human S1 and M1 topographic organization with fMRI (Gonzalez Ballester et al., 2002; Duncan and Boynton, 2007; Besle et al., 2013; Sánchez-Panchuelo et al., 2014; Schellekens et al., 2018; Willoughby et al., 2020). The majority of these artifacts also differ across magnetic field strength, adding an additional layer of complexity (Maldjian et al., 1999; Benson et al., 2018; Morgan and Schwarzkopf, 2019). Choice of the appropriate distortion correction during MRI data collection and post-acquisition processing is therefore invaluable for these measurements.
Specific analysis approaches can also affect the ability to measure CFMs across sensory systems. Methods that reduce the spatial resolution are particularly prone to destroying or altering the topographical measurements composing a CFM. For example, smoothing phase-encoded measurements with a Gaussian kernel can destroy important internal topographical features within a larger CFM or miss a smaller CFM entirely (Brewer et al., 2005; Winawer et al., 2010). Issues with anatomical image analysis can similarly obliterate CFM measurements. Segmentation of white matter from gray, commonly needed for individual-subject data analysis in particular, requires not only high-quality automated segmentation algorithms, but also careful researcher review and hand-editing to ensure that the cortical sheet is properly defined, especially along the peaks of the gyri and the depths of the sulci (Nestares and Heeger, 2000; Brewer et al., 2002). Otherwise, the topographical data will inappropriately be missing regions that fall at these anatomical locations or will blur responses together across two gyri, as in early measurements of primary somatosensory (S1) and motor (M1) cortices that often blended together responses within single voxels crossing the central sulcus (i.e., partial-volume effects; for discussion, see Gonzalez Ballester et al., 2002; Wandell et al., 2007; Sanchez-Panchuelo et al., 2010). Because such measurement and analysis choices can eliminate real topographic structure, the positive measurement of a CFM is much more meaningful than a failure to find a map, especially when a particular CFM is reliably found across most observers (Brewer et al., 2005; Wandell et al., 2005, 2007; Winawer et al., 2010; Brewer and Barton, 2012).
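A one-dimensional toy example (Python/NumPy with SciPy; the map sizes and kernel widths are arbitrary assumptions) illustrates how Gaussian smoothing can erase a small map: two abutting gradients meet at a reversal, and widening the smoothing kernel progressively flattens the smaller map's gradient until it is no longer measurable.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

# Two abutting gradients along a cortical path: a large map (40 samples) and a
# small neighbor (8 samples), meeting at a reversal (cf. Figure 3).
path = np.concatenate([np.linspace(0, 1, 40), np.linspace(1, 0, 8)])

for sigma in [0, 2, 6]:
    smoothed = path if sigma == 0 else gaussian_filter1d(path, sigma)
    # Range of values still represented within the small map: as the kernel
    # widens, the reversal flattens and the small map's gradient is erased.
    small_map_range = smoothed[40:].max() - smoothed[40:].min()
    print(f"sigma = {sigma}: range represented by the small map = {small_map_range:.2f}")
```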
Furthermore, the accurate definition of CFM boundaries relies on the analysis of sensory measurements from individual subjects. Averaging topographical measurements across a group, especially by aligning the data to an average brain through such atlases as Talairach space (Talairach and Tournoux, 1988) or Montreal Neurological Institute (MNI) coordinates (Collins et al., 1994), typically introduces significant blurring into the data (Brewer et al., 2005; Wandell et al., 2005). The relationship between cortical anatomy and CFM functional responses is variable enough across individuals that such group-averaging is likely to misalign the appropriate topographies with other CFMs or unrelated cortical regions (Dougherty et al., 2003). As a result, the gradients composing the CFMs may be inaccurate or even missing (Sereno et al., 1995; DeYoe et al., 1996; Engel et al., 1997; Wandell et al., 2007; Barton et al., 2012; Brewer and Barton, 2012; Baumann et al., 2015). As we expand CFM measurements across the senses, such factors need to be taken into careful consideration.
4 Topographical representations in human sensory cortex
Over the last century, extensive research has been dedicated to unraveling the intricate mechanisms that underlie sensory perception and their associated cortical topographies. Within the visual, auditory, and somatosensory systems, researchers have made significant strides in understanding how specialized receptors in peripheral sense organs transduce and analyze crucial physical properties of external stimuli and ultimately how this sensory information is organized across sensory cortex (e.g., Wandell et al., 2007; Barton et al., 2012; Brewer and Barton, 2012, 2016a; Sanchez Panchuelo et al., 2018; Willoughby et al., 2020). We can now reliably measure a number of cortical field maps or organized topographies within these systems, as described in the following sections.
In contrast, the chemical senses of taste and olfaction present unique challenges when it comes to representing stimulus features in the brain (Imai et al., 2010; Murthy, 2011). Unlike measurable dimensions such as the spatial positions across the visual field and skin surface or the spatiotemporal frequencies within sound waves, which all more naturally lend themselves to spatial organization in the cortex, the molecules relevant to the chemical sense organs do not possess such continuous physical properties, except for their magnitude or intensity (Chaudhari and Roper, 2010; Ma et al., 2012). Instead, the quality of a chemical stimulus is determined by its chemical composition, which lacks variation along a common physical dimension across different substances. As a result, our understanding of the cortical representations of smell and taste remains substantially more limited than that of the other senses. Even so, organized topographies within the chemical senses are emerging as well (Chen et al., 2011, 2021, 2022; Prinster et al., 2017; Lodovichi, 2020).
4.1 Visual field maps
The spatial arrangement of a visual image is a critical aspect of our ability to recognize elements of our environments (Sereno et al., 1995; DeYoe et al., 1996; Engel et al., 1997; Wandell et al., 2005, 2007). While an image may still be identifiable despite alterations of such properties as its color, motion, contrast, or rotation, scrambling its spatial arrangement typically destroys our ability to identify or reconstruct the original image. This visual field spatial arrangement is encoded by the circuitry of the retina and then preserved and repeated through visual cortex to produce a unifying matrix of visuospatial organization throughout the visual processing hierarchy, despite the diverse computations being performed across regions (e.g., Van Essen, 2003; Wandell et al., 2007; Brewer and Barton, 2012). As cortex interprets different aspects of the visual image—such as its motion or orientation—the cortical circuitry is organized using receptive fields arranged within VFMs to preserve the critical spatial image information.
Lower-level VFMs encode precise measurements of low-level visual features at particular retinal locations, which are built up into more complex localized representations as they are processed through the cortical hierarchy. Despite their large receptive fields, higher-order visual areas may still maintain visuospatial organization by preserving just enough dispersion of receptive field centers to allow for slightly different preferred tuning of responses to visual space (Lehky and Sereno, 2011). The presence of organized representations of visual space in higher-order regions can still allow for the stimulus size and position invariances frequently described across high-order object- and face-responsive visual regions, as such invariance can arise in regions simply with very large receptive fields (DiCarlo and Maunsell, 2003; Dumoulin and Wandell, 2008; Brewer and Barton, 2012; Haak et al., 2012; Barton and Brewer, 2017). Current research is demonstrating that the majority of higher-order visual areas are organized according to visual space, maintaining retinotopically organized, dispersed RF centers despite increasingly large RF sizes (Hagler and Sereno, 2006; Hagler et al., 2007; Kastner et al., 2007; Swisher et al., 2007; Konen and Kastner, 2008; Arcaro et al., 2009; Lehky and Sereno, 2011; Brewer and Barton, 2012; Lehky et al., 2015; Barton and Brewer, 2017).
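The population receptive field (pRF) framework cited above (Dumoulin and Wandell, 2008) offers a simple way to see how large receptive fields with slightly dispersed centers still carry position information. The sketch below is a minimal, illustrative version of that idea in Python/NumPy; the grid, Gaussian sizes, and bar stimulus are assumptions chosen for the demonstration, not fitted parameters or any published pipeline.

```python
import numpy as np

# Minimal population-receptive-field (pRF) sketch: each voxel's response to
# a stimulus aperture is modeled as the overlap of that aperture with a 2-D
# Gaussian receptive field in visual space.
grid = np.linspace(-10, 10, 101)                 # visual field in degrees
x, y = np.meshgrid(grid, grid)

def prf(x0, y0, sigma):
    return np.exp(-((x - x0) ** 2 + (y - y0) ** 2) / (2 * sigma ** 2))

def bar_aperture(position_deg, width=2.0):
    """Vertical bar of the given width centered at position_deg."""
    return (np.abs(x - position_deg) < width / 2).astype(float)

# Two "higher-order" voxels: large, equal RF sizes but slightly displaced
# centers. Their responses to a bar sweeping left-to-right still peak at
# different bar positions, so position information is preserved.
voxel_a, voxel_b = prf(-1.0, 0.0, sigma=5.0), prf(+1.0, 0.0, sigma=5.0)
bar_positions = np.linspace(-10, 10, 21)
resp_a = [np.sum(bar_aperture(p) * voxel_a) for p in bar_positions]
resp_b = [np.sum(bar_aperture(p) * voxel_b) for p in bar_positions]
print("peak bar position, voxel A:", bar_positions[np.argmax(resp_a)])
print("peak bar position, voxel B:", bar_positions[np.argmax(resp_b)])
```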
Whether the spatial organization remains truly retinotopic or changes to a broader spatiotopic organization—one based on external space rather than retinal space—is still under investigation and cannot be determined with typical visual-field-mapping methods (Sereno et al., 2001; Sereno and Huang, 2006; Hagler et al., 2007; Kastner et al., 2007). In either case, such widespread preservation of visuospatial organization allows for a common reference frame through which information can be passed up or down the visual hierarchy. Theories of attention in which higher-order visual-attentional areas are able to affect many lower-level visual areas simultaneously in spatially specific patterns can be explained through the use of such visual-location-based “channels” (e.g., Sereno et al., 2001; Silver et al., 2005; Saygin and Sereno, 2008; Lauritzen et al., 2009; Silver and Kastner, 2009; Szczepanski et al., 2010). It is also possible that visuospatial organization is maintained despite visual-location information not being critical to the computations of that specific area simply because it would be too disruptive or costly during development to change the organization once it has been established at the level of the retina and earlier visual cortex.
Human visual cortex includes the entire occipital lobe and extends significantly into the parietal and temporal lobes (Figure 10), composing about 20% of cortex (Wandell et al., 2007). The medial wall of occipital cortex in each hemisphere contains four hemifield representations of visual space known as V1, V2, V3, and hV4 (for detailed reviews, see Wandell et al., 2007; Brewer and Barton, 2012). V1 consistently occupies the calcarine sulcus, bounded on either side by the split-hemifield representations of V2 and V3 on the lingual gyrus and cuneus. Human V4 (designated hV4 because of the unclear homology to macaque V4) is positioned as a complete hemifield on the ventral occipital surface adjacent to ventral V3 along the posterior fusiform gyrus (Langner et al., 2002; Brewer et al., 2005). These four VFMs compose the medial aspect of the occipital pole cluster (OP cluster), which supports low-level visual computations (Brewer et al., 2005; Wandell et al., 2005; Brewer and Barton, 2012).
Figure 10. Visual field maps have been defined across much of human visual cortex. (A) (i) A left hemisphere from an individual subject is shown as a 3-D inflated rendering in which light gray indicates gyri and dark gray indicates sulci. The positions of several VFM cloverleaf clusters measured in this individual subject are shown along the lateral surface as colored ROIs: orange, OP cluster (occipital pole cluster, lateral subdivision including LO-1, LO-2, LOC; Wandell et al., 2005; Larsson and Heeger, 2006; Brewer and Barton, 2012); red, TO cluster (temporal occipital cluster; also known as hMT+ cluster, human middle temporal complex; Wandell et al., 2005, 2007; Amano et al., 2009; Kolster et al., 2009, 2010; Barton et al., 2012); yellow, pSTS cluster (posterior superior temporal sulcus cluster; Barton and Brewer, 2017); cyan, V3A/B cluster (visual areas 3A and 3B cluster; Press et al., 2001; Wandell et al., 2005; Barton and Brewer, 2017); purple, regions along the dorsal cortex (intraparietal sulcus; Schluppeck et al., 2005; Swisher et al., 2007; Silver and Kastner, 2009; Szczepanski et al., 2010) and ventral cortex (fusiform and parahippocampal gyri; Arcaro et al., 2009; Kolster et al., 2010) that are currently under investigation (for reviews, see Wandell et al., 2007; Barton et al., 2012). CS, central sulcus; LS, lateral sulcus; STS, superior temporal sulcus; *OP, occipital pole. Anatomical-directions legend: S, superior; I, inferior; A, anterior; P, posterior. (ii) ROIs along the medial surface of the same 3-D-rendered left hemisphere are displayed here, with clusters that span medial and lateral cortex matched in color: orange, OP cluster (medial subdivision including V1, V2, V3, hV4; Wandell et al., 2005; Brewer and Barton, 2012); green, VO cluster (ventral occipital; Brewer et al., 2005; Wandell et al., 2005); cyan, V3A/B cluster; dark blue, V6/6A cluster (Pitzalis et al., 2015); purple, dorsal and ventral regions currently under investigation. POS, parietal-occipital sulcus; CalS, calcarine sulcus. Other details are as in (i). (B) Diagram displays eccentricity representations within VFM clusters viewed along a flattened left hemisphere. Color overlays represent the position in visual space that produces the strongest response at that cortical location. Published clusters are labeled in colors corresponding to ROI colors in (A). Regions with cloverleaf clusters currently under investigation or only partially defined are shown in speckled gray with purple labels: IP, intraparietal; VLO, ventral lateral occipital; PHC, parahippocampal cortex. Red anatomical labels: LOC, lateral occipital cortex; FuG, fusiform gyrus; IPS, intraparietal sulcus. Central “*” marks the occipital pole. (C) Diagram of polar angle representations viewed on the same schematic of the flattened left hemisphere. Individual VFMs are labeled in black. Blue-magenta textured circles along IPS indicate cortical regions where polar angle representations have been measured, but reliably consistent eccentricity gradients have not yet been published. Other details are as in (B). Bottom inset shows eccentricity color legend (left); approximate anatomical directions for the schematics in (B,C) (middle); and polar-angle color legend (right). Anatomical-directions legend: S, superior; I, inferior; L, lateral; M, medial.
Because it receives direct inputs from the retino-geniculate pathway, V1 is considered to be primary visual cortex and is an important site of basic computations of such features as orientation, color, and motion (Shapley et al., 2007). Each computation is performed across the entire visual field, yet V1 appears at the level of fMRI measurements to consist of a single, contiguous representation of visual space (Engel et al., 1997; Brewer and Barton, 2012). In essence, V1 is composed of several maps overlaid on one another, each of which performs a single computation (i.e., separated maps for ocular dominance, orientation, and motion; Livingstone and Hubel, 1984; Mikami et al., 1986; Newsome et al., 1986; Movshon and Newsome, 1996; Horton et al., 1997; Koulakov and Chklovskii, 2001). In this arrangement, a very intricate mosaic of neurons subserving these computations allows for each computation to be performed over each portion of the visual field. These mosaics, including pinwheel orientation columns, blobs/interblobs, and ocular dominance columns, have a long history of investigation that is still ongoing (e.g., Livingstone and Hubel, 1984; Bartfeld and Grinvald, 1992; Ohki et al., 2006; Adams et al., 2007; Gilaie-Dotan et al., 2013). These computations are divided into more specialized processing of the visual image after V1, with V2 and hV4 supporting low-level color and form processing, respectively, and V3 playing a role in low-level motion computations (Merigan and Maunsell, 1993; Smith et al., 1998; Wade et al., 2002; Brewer et al., 2005; Larsson et al., 2006; Wandell et al., 2007).
V1, V2, V3, and hV4 each contain a foveal representation positioned at the occipital pole, with progressively more peripheral representations extending into more anteromedial cortex, forming complete eccentricity gradients (Figure 10B; e.g., Sereno et al., 1995; DeYoe et al., 1996; Engel et al., 1997; Wandell et al., 2007). The region where the individual foveal representations meet at the occipital pole is commonly referred to as the foveal confluence (Schira et al., 2009). Despite the apparent merging of these foveal representations into one confluent fovea at the scale of fMRI measurements of eccentricity gradients, distinct boundaries between V1, V2, V3, and hV4 have been shown to be present even within this most central foveal representation (Brewer et al., 2005; Schira et al., 2009, 2010).
The boundaries between each map are delineated by reversals in the polar angle gradients along the medial surface (Figure 10C; e.g., Sereno et al., 1995; DeYoe et al., 1996; Engel et al., 1997; Wandell et al., 2007). V1 has a contiguous polar angle gradient representing a full hemifield, while V2 and V3 have split-hemifield representations (i.e., quarterfields), which are named by their positions ventral or dorsal to V1: V2d, V2v, V3d, V3v. Because of their relatively consistent anatomical locations and unique concentric polar angle gradients, these three VFMs are typically the first landmarks identified in visual-field-mapping analyses (Engel et al., 1994; Sereno et al., 1995). However, as noted above, the surface areas of these three VFMs vary significantly among individuals, independent of overall brain size (Dougherty et al., 2003). While V1 is always located along the fundus and up the walls of the calcarine sulcus in normal individuals, an increase in V1 size will consequently shift the specific positions of V2 and V3 along the neighboring gyri and sulci. VFMs beyond V3, such as the contiguous hV4 hemifield, likewise shift variably along the cortical surface in accordance with individual VFM sizes (Brewer et al., 2005; Winawer et al., 2010).
This pattern of VFMs continues across most if not all visual cortex, with loose divisions of processing into dorsal and ventral streams for the perception of action and recognition, respectively (Figure 10; Mishkin and Ungerleider, 1982; Goodale and Milner, 1992; Van Essen, 2003; Lehky and Sereno, 2007; Gilaie-Dotan et al., 2013). Groups of VFMs are then organized into cloverleaf clusters that are now either completely or partially defined (Wandell et al., 2005; Kolster et al., 2010; Brewer and Barton, 2012). Within each cluster, eccentricity representations run from foveal representations at the center of the cluster to peripheral representations at its outskirts. Thus, boundaries between clusters are defined as reversals in eccentricity representations (Figure 10B). Boundaries between VFMs within a cluster occur at reversals of polar angle representations, typically along a representation of the vertical meridian of visual space, except for the split quarterfield dorsal/ventral maps of V2 and V3, which are divided along the horizontal meridian (Figure 10C).
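In practice, these boundary definitions amount to locating reversals in a topographic gradient measured along the cortical surface. The sketch below illustrates the idea on synthetic data in Python/NumPy; the profile, its period, and the path length are made up for the demonstration and are not measured values.

```python
import numpy as np

# Synthetic illustration: map boundaries located as sign changes (reversals)
# in the spatial gradient of a topographic phase measurement sampled along a
# path across cortex.
cortical_distance = np.linspace(0, 30, 301)                  # mm along path
polar_angle = (np.pi / 2) * (1 - np.cos(np.pi * cortical_distance / 10))

gradient = np.gradient(polar_angle, cortical_distance)
reversals = np.where(np.diff(np.sign(gradient)) != 0)[0]
print("boundary estimates (mm):", np.round(cortical_distance[reversals], 1))
```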
Along these dorsal and ventral streams, the medial-occipital VFMs of V1, V2, V3, and hV4 combine with the lateral VFMs LO-1 and LO-2 and a small number of yet-undetermined VFMs to form the occipital pole (OP) cloverleaf cluster, centered on its namesake (Wandell et al., 2005; Larsson and Heeger, 2006; Larsson et al., 2006; Montaser-Kouhsari et al., 2007; Kolster et al., 2010; Brewer and Barton, 2012). While the medial maps are well-established areas involved in the early stages of visual processing, the lateral VFMs in this cluster are likely involved with various stages of processing for visual object recognition and are still under extensive study. Immediately superior to the OP cluster is the two-map V3A/B cluster along the transverse occipital sulcus, which plays a role in mid-level motion processing (Tootell et al., 1997; Press et al., 2001; Barton and Brewer, 2017). Along the medial wall in this region, anterior to V3A/B and V3d, is the putative two-map cluster of V6 and V6A (Pitzalis et al., 2006, 2015). The V6 and V6A VFMs are thought to be involved in evaluating object distance during self-motion and in planning the corresponding pointing or reaching movements, respectively (Fattori et al., 2009; Pitzalis et al., 2010, 2013). Further superior and anterior to these regions, along the intraparietal sulcus (IPS), are several putative clusters that include VFMs currently called IPS-0 (or V7) to IPS-5 and SPL-1 (Sereno et al., 2001; Schluppeck et al., 2005; Silver et al., 2005; Kastner et al., 2007; Konen and Kastner, 2008; Lauritzen et al., 2009; Silver and Kastner, 2009; Brewer and Barton, 2018). These parietal VFMs overlie regions involved in attention and working memory, as well as various aspects of sensorimotor integration. Likely due to their roles in these cognitive processes, these IPS regions beyond V3A/B are increasingly affected by changes in attention, with VFMs often impossible to measure unless proper attentional controls are included in the visual stimuli (Silver et al., 2005; Saygin and Sereno, 2008). In addition, the majority of the IPS maps do not yet have published eccentricity representations, so the final organization of each cluster remains to be determined (Brewer and Barton, 2012).
Anterior to the OP cluster along the lateral surface is the four-map temporal occipital (TO) cluster, alternatively called the hMT+ complex or cluster, a key cortical region for visual motion processing (Huk et al., 2002; Amano et al., 2009; Kolster et al., 2010). Further anterior is the recently discovered four-VFM posterior superior temporal sulcus (pSTS) cluster, which is likely involved in multisensory integration (Barton and Brewer, 2017). Inferior to the OP cluster is the ventral occipital (VO) cluster, which currently contains two measured VFMs (VO-1 and VO-2) in a likely set of four (Wade et al., 2002; Brewer et al., 2005) and processes higher-level visual form and color information. Finally, anterior to the VO cluster along the ventral surface is the parahippocampal cortex (PHC) cluster, which also has two currently measured VFMs (PHC-1 and PHC-2) that likely also form a group of four maps (Arcaro et al., 2009). The PHC cluster is thought to play a role in visual scene perception, consistent with the role of this ventral stream region established by other measurements as well (Grill-Spector et al., 2017; Epstein and Baker, 2019).
Due to the long history of extensive research into VFMs in human and animal models as well as complications with neuroimaging measurements for the non-visual senses, we have a vastly better understanding of the organization of CFMs in the visual system than the other senses (Brewer and Barton, 2018). As such, the patterns we observe in visual cortex can serve to varying degrees as the foundation for our expectations in the other sensory systems. As we will review next, new research is starting to reveal similar topographical representations, CFMs, cloverleaf clusters, and/or dorsal/ventral streams in human auditory, somatosensory, and gustatory cortex.
4.2 Auditory field maps
Auditory stimuli are fundamentally spectrotemporal, meaning that complex sound waves have two fundamental components important for human perception: spectral information – such as which frequencies are present in the sound waves, and temporal information – such as when and for how long those frequencies are present (Shamma, 2001). Auditory field maps (AFMs), much like VFMs, are composed of two orthogonal dimensions representing each of these spectral and temporal components of sound (Barton et al., 2012; Herdener et al., 2013; Brewer and Barton, 2016b; Figure 11). It is important to note that these topographical spectral and temporal representations of AFMs are not associated with auditory spatial information; we do not yet know how auditory space—that is, where sounds are occurring around us—is encoded in human cortex after processing in the brainstem (Brewer and Barton, 2018). Thus we currently discuss spatial mapping for the visual and somatosensory systems, but frequency mapping of two types for audition.
Figure 11. Auditory field maps have been defined in the core and belt regions of human auditory cortex. (A) The 3-D inflated rendering of an individual left hemisphere lateral surface is shown with light gray gyri and dark gray sulci. This subject’s hA1 auditory field map (AFM) is labeled with the black dotted lines at the tip of Heschl’s gyrus (HG). The three colored ROIs on HG denote the locations of the cloverleaf clusters comprising the core and belt AFMs: yellow, hCM/hCL cluster; red, HG cluster including hA1, hR, hRM, hMM, hML, hAL; magenta, hRTM/hRT/hRTL cluster (Barton et al., 2012; Brewer and Barton, 2016b). CS, central sulcus; STG, superior temporal gyrus; STS, superior temporal sulcus. CG, circular gyrus; PP, planum polare; PT, planum temporale. Anatomical-directions legend: S, superior; I, inferior; A, anterior; P, posterior. (B) Schematics depict the color code for the two orthogonal dimensions that are required to define an auditory field map: tonotopy (top), periodotopy (bottom). Diagrams in (C,D) use these colors for tonotopic and periodotopic representations, respectively. (C) A model of tonotopic representations of core and belt auditory field maps is overlaid on a schematic of a flattened region of cortex around HG. Dark gray indicates the plane of the lateral sulcus, while light gray indicates the circular gyrus (CG), Heschl’s gyrus (HG), and the superior temporal gyrus (STG; a/pSTS, anterior/posterior superior temporal sulcus). White dotted lines denote the approximate boundaries between individual AFMs. AFM cloverleaf clusters are labeled to match those in (A). (D) A model of periodotopic representations of core and belt auditory field maps is overlaid on the same schematic of a flattened region of cortex around HG. Other details are as in (C). Upper right inset displays the approximate anatomical axes: M, medial; L, lateral; A, anterior; P, posterior.
For many years, only one spectrotemporal dimension, tonotopy (or “cochleotopy”), could be measured in human cortex (for discussions, see Wessinger et al., 2001; Barton et al., 2012; Ress and Chandrasekaran, 2013; Brewer and Barton, 2016a; Chang et al., 2017). Tonotopy reflects the organization of the cochlea, which transduces complex sound waves into streams of neural signals representing the intensity of each frequency, analogous to a Fourier analysis (Moerel et al., 2012). Higher frequencies are transduced near the entrance of the cochlea, while progressively lower frequencies are transduced further along the basilar membrane, creating similar topographical gradients of frequency representation in human cortex that repeat with each AFM (Formisano et al., 2003; Humphries et al., 2010). More recently, it has been demonstrated that periodotopy, the representation of the temporal information in complex sound waves known as periodicity, is the orthogonal counterpart to tonotopy that allows for the correct definition of AFMs according to principles consistent with the well-characterized VFMs of the visual system (Schreiner and Langner, 1988; Langner, 1992; Langner et al., 1997; Schulze et al., 2002; Baumann et al., 2011; Barton et al., 2012; Herdener et al., 2013). More specifically, periodotopic gradients in human cortex consist of neurons that code periodicity information by time-locking to the amplitude modulation of the sound wave (e.g., the length of time from one peak to the next of the temporal envelope; Shamma, 2001). Other neurons activated by periodotopic stimuli likely include neurons selective for the onset and offset of sound waves with varying refractory times as well as neurons that respond to differing sound-wave durations. Regardless of what aspect of periodicity the neurons are specifically encoding, periodotopic gradients are organized topographically along human cortex from lower modulation rates to higher ones. Similar measurements have also been observed in macaque monkey (Baumann et al., 2011).
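The distinction between the two stimulus dimensions can be made concrete with a sketch of the two stimulus families typically used: narrowband tones that vary in carrier frequency (tonotopy) and broadband noise whose temporal envelope is amplitude-modulated at varying rates (periodotopy). The Python/NumPy code below is only illustrative; the specific frequencies, modulation rates, and durations are assumptions, not the published stimulus parameters.

```python
import numpy as np

def pure_tone(freq_hz, dur_s=2.0, fs=44100):
    """Tonotopy-style stimulus: a narrowband carrier at one frequency."""
    t = np.arange(int(dur_s * fs)) / fs
    return np.sin(2 * np.pi * freq_hz * t)

def am_noise(mod_rate_hz, dur_s=2.0, fs=44100, depth=1.0):
    """Periodotopy-style stimulus: broadband noise whose temporal envelope is
    amplitude-modulated at one rate, varying timing rather than spectrum."""
    t = np.arange(int(dur_s * fs)) / fs
    envelope = 1.0 + depth * np.sin(2 * np.pi * mod_rate_hz * t)
    return envelope * np.random.randn(t.size)

# Illustrative progressions of stimulus values for the two dimensions:
tonotopy_stims    = [pure_tone(f) for f in (200, 400, 800, 1600, 3200, 6400)]
periodotopy_stims = [am_noise(r)  for r in (2, 4, 8, 16, 32, 64)]
```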
Investigation into the types of computations performed by the currently known AFMs in human cortex has been limited both by the initial, incorrect use of tonotopic gradients alone to define AFM boundaries and by the small amount of research to date into specific human AFM functions (Brewer and Barton, 2016b). At this time, the nomenclature used to name human AFMs and set up expectations for their functions is based on likely homologs to areas defined in non-human primate models (Barton et al., 2012). The presumed anatomical homologies between macaque and human, together with the similar organization of tonotopic gradients, provide converging evidence for the definition of matching AFMs in human cortex. Measurements in macaque define an auditory “core” consisting of three primary auditory areas: A1 (primary auditory cortex), R (rostral area), and RT (rostrotemporal area), differentiated from surrounding cortex based largely on the density of inputs from the thalamus to each area in the core as well as their more basic response characteristics (Merzenich and Brugge, 1973; Pandya and Sanides, 1973; Galaburda and Sanides, 1980; Galaburda and Pandya, 1983; Sweet et al., 2005; Dick et al., 2012; Brewer and Barton, 2016a). Lateral (CL, ML, AL, RTL) and medial (CM, RM, MM, RTM) “belt” regions surrounding the core are thought to be the next stages of auditory processing (Rauschecker et al., 1995; Rauschecker and Tian, 2004; Tian and Rauschecker, 2004; Kusmierek and Rauschecker, 2009). Finally, tertiary orders of the hierarchy consist of at least two further lateral “parabelt” regions, which have broad connections among various other auditory and multimodal regions of cortex (Kaas and Hackett, 2000; Kajikawa et al., 2015). While these macaque auditory regions lie along macaque superior temporal gyrus (STG), the human homologs are centered on Heschl’s gyrus (HG), which is rotated medially relative to human STG (Figure 11A; Leonard et al., 1998; Morosan et al., 2001; Barton et al., 2012; Dick et al., 2012). Thus the human names lack the directional implications of the original macaque ones. For example, while R is rostral to A1 in macaque and is thus named, human R is lateral to human A1 due to the rotation of the likely human homologs along HG relative to macaque STG (Barton et al., 2012; Brewer and Barton, 2016b). To account for these differences and the possibility that AFM functions differ between the species, the human auditory areas have been named with “h” before the macaque designation. Thus hA1 is the likely human homolog to macaque A1 and was identified in human as such through a combination of its location, smaller RF sizes, and internal cortical magnification that represents a wide span of frequencies.
The additional information contained in the periodotopic gradients, when combined with the tonotopic gradients to form complete AFMs, indicates that the core vs. belt macaque organizational model, while useful, is insufficient to fully describe the data observed in human cortex. Instead, AFMs appear to be organized into at least three cloverleaf clusters, similar to the organization found in VFMs of the visual system (Brewer and Barton, 2018; Figures 11C,D). Of these, one complete cluster has been measured, the HG cluster, which consists of hA1, hML, hAL, hR, hRM, and hMM, while hCM and hCL likely form part of another cluster and hRT, hRTL, and hRTM form part of a third likely cluster. Reversals in the periodotopic gradients divide the clusters into individual AFMs, while reversals in the tonotopic gradients divide each cluster from its neighbors. The discovery that AFMs are organized into cloverleaf clusters like VFMs indicates that cloverleaf clusters are a fundamental organizing principle of sensory cortex, likely to exist across sensory modalities (Brewer and Barton, 2018).
4.3 Somatotopic representations
Somatosensation is an overarching term for several subtypes of sensation, which include mechanoreception (e.g., vibration, discriminatory/fine touch, deep pressure); nociception (e.g., pain); thermoception (e.g., temperature); equilibrioception (e.g., balance); and proprioception (e.g., body position/movement; Lumpkin and Bautista, 2005; Eickhoff et al., 2006; Kaas, 2012; Pleger and Villringer, 2013). Somatosensory processing begins with peripheral receptors in the skin, organs, joints, and tissues that often have evolved highly specialized structures for optimizing their ability to detect changes in the environment, with receptor locations along the skin producing a somatotopic map of the body surface. These specialized responses then follow associated parallel processing pathways of somatosensory information through the spinal cord to the brain stem, through several nuclei of the thalamus, and ultimately to various cortical and other subcortical regions (Penfield and Boldrey, 1937; Krubitzer et al., 1995a; Kaas, 1997; Kaas, 2012; Saadon-Grosman et al., 2020a; Willoughby et al., 2020). In human and related animal models, these regions include the primary somatosensory cortex (S1 or “SI”) along the posterior bank of the central sulcus and the postcentral gyrus (Figure 12) and the secondary somatosensory area (S2 or “SII”) abutting the inferior part of S1 on the superior bank of the lateral fissure. Additional somatosensory representations have been measured in the superior and inferior parietal lobules, cingulate cortex, inferior frontal gyrus, and the frontal operculum (Ruben et al., 2001; Hagen et al., 2002; Young et al., 2004; Arienzo et al., 2006; Huang et al., 2012).
Figure 12. One dimension of human somatosensory field maps has been explored for tactile and pain representations for selected body parts in S1 and S2. (A) Hand diagram denotes the locations and order of fingertip stimulation by fine-touch (e.g., air puff or vibration) or nociceptive (e.g., radiant-heat laser) stimuli typically used by traveling-wave paradigms (e.g., Sanchez-Panchuelo et al., 2010; Mancini et al., 2012; Kolasinski et al., 2016a). Color legend is shown for stimulation of fingertips 1 to 5. (B) Schematic displays where measurement points have been tested for larger range somatotopy on face, fingertips, and foot in Sanchez Panchuelo et al. (2018). Note that the color scheme now represents different body locations than in (A). (C) Body diagram shows points of somatotopic measurement from head to foot from Willoughby et al. (2020). The color-grouping for head, finger/hand, and leg/foot representations loosely matches the color scheme from (B). (D) A color-coded schematic of cortical responses to the fingertip stimuli from (A) is overlaid on a 3-D inflated left hemisphere to display the approximate location of fingertip somatotopy in S1. These fingertip representations are estimated from individual-subject and group-averaged responses to both the light touch and pain stimuli from Sanchez-Panchuelo et al. (2010) and Mancini et al. (2012). (E) A color-coded schematic of cortical responses to the head-hand-foot somatotopic measurements from (B,C) is overlaid on the same 3-D inflated left hemisphere to display the approximate location of these somatotopic representations in both S1 and likely S2. These coarse somatotopic representations were estimated from individual-subject data shown in Sanchez Panchuelo et al. (2018) and Willoughby et al. (2020). White dotted lines approximate the locations of human primary somatosensory cortex (S1) and secondary somatosensory cortex (S2). CS, central sulcus; LS, lateral sulcus; STG, superior temporal gyrus; STS, superior temporal sulcus. Light gray, gyri; dark gray, sulci. Anatomical-directions legend: S, superior; I, inferior; A, anterior; P, posterior.
Investigations of somatotopic representations in human cortex to date have focused primarily on a few types of mechanosensation and nociception (Sanchez-Panchuelo et al., 2010; Mancini et al., 2012; Sanchez Panchuelo et al., 2018; Willoughby et al., 2020). Similar to the retinal information in the visual system, somatosensory information about tactile and pain stimuli from the surface of the skin arises from a relatively easy-to-conceptualize, 2D space along the body (Rothschild and Mizrahi, 2015). In contrast to the visual system, however, systematically stimulating significant regions of this skin space to map out the associated cortical sensory topography in humans is much more difficult experimentally (Disbrow et al., 2000; Ruben et al., 2001; Sanchez-Panchuelo et al., 2010; Mancini et al., 2012; Besle et al., 2013; Willoughby et al., 2020). For example, devices that produce high-resolution, light-touch stimulation of large skin regions are difficult to create and even more difficult to adapt to an MR-scanning environment. In addition, the cortical anatomy along the central sulcus presented significant problems for accurate neuroimaging measurements for many years. S1 lies along the postcentral gyrus just millimeters from a similar somatotopic map in primary motor cortex (M1) on the closely abutting precentral gyrus (Disbrow et al., 2000; Ruben et al., 2001; Eickhoff et al., 2007). With the juxtaposition of these two similar topographies across the central sulcus, typical traveling-wave CFM measurements at lower magnetic-field strengths have generally been insufficient to precisely resolve the somatotopic organization along each gyrus (Besle et al., 2013). The spatial resolution of most of these fMRI measurements was not high enough until recently to overcome the partial-volume effects of a single voxel combining measurements of neuronal responses from S1 and M1 into one data point (Gonzalez Ballester et al., 2002; Sanchez-Panchuelo et al., 2010; Willoughby et al., 2020). In addition, somatosensory maps in both adult humans and other mammals appear to be part of a rather dynamic system that can undergo significant alterations across much of the lifespan, in contrast to the visual system, in which the cortical plasticity of VFMs is greatly reduced after the close of the critical period of visual development (e.g., Kaas et al., 1983; Kaas, 1991; Smirnakis et al., 2005; Jain et al., 2008; Wandell and Smirnakis, 2009; Brewer and Barton, 2014; Qi et al., 2014; Barton and Brewer, 2015; Kolasinski et al., 2016b). Finally, the relatively high variability of the topographies in this area across human subjects caused significant issues for the many fMRI studies that employed group-averaging of the measurements, compounding the partial-volume effects with additional blurring of the data (Duncan and Boynton, 2007; Sanchez-Panchuelo et al., 2010; Brewer and Barton, 2012; Besle et al., 2014; Sánchez-Panchuelo et al., 2014; Kolasinski et al., 2016a).
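As background for these traveling-wave measurements, the sketch below illustrates the standard phase-encoded readout on synthetic data (Python/NumPy): each voxel's preferred position in the stimulus cycle is taken from the phase of its time series at the stimulation frequency, with coherence quantifying how strongly the voxel follows that frequency. The scan length, cycle count, and noise level are arbitrary choices for the demonstration, not parameters from the cited studies.

```python
import numpy as np

# Synthetic traveling-wave (phase-encoded) readout for a stimulus that cycles
# periodically (e.g., fingertips 1..5 in order, repeated several times).
n_vols, n_cycles = 120, 6            # e.g., 240 s scan at TR = 2 s, 6 cycles
t = np.arange(n_vols)

def phase_and_coherence(ts, n_cycles):
    spectrum = np.fft.rfft(ts - ts.mean())
    amp = np.abs(spectrum)
    coherence = amp[n_cycles] / np.sqrt(np.sum(amp[1:] ** 2))
    phase_lag = -np.angle(spectrum[n_cycles])   # response lag within a cycle
    return phase_lag, coherence

# A voxel that responds best one quarter of the way through each cycle:
true_lag = np.pi / 2
ts = np.cos(2 * np.pi * n_cycles * t / n_vols - true_lag)
ts = ts + 0.5 * np.random.default_rng(1).standard_normal(n_vols)
lag, coh = phase_and_coherence(ts, n_cycles)
print(f"estimated lag {lag:.2f} rad (true {true_lag:.2f}), coherence {coh:.2f}")
```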
Because of these issues, a significant amount of our knowledge from the last century regarding the localization of functions in the human somatosensory cortex has relied on lesion studies and intra-operative neuronal recording and stimulation measurements in human patients, together with examinations of S1, S2, and related areas in various animal models (e.g., Penfield and Boldrey, 1937; Woolsey et al., 1979; Krubitzer et al., 1995a,b; Roux et al., 2018; Saadon-Grosman et al., 2020b). Penfield and colleagues’ intraoperative experiments on humans in the 1930s have served since then as the foundation for our current expectations of S1 and S2 organization in human, despite early concerns about reproducibility that those same researchers raised regarding the concept of their cortical homunculus (Penfield and Boldrey, 1937; Snyder and Whitaker, 2013; Saadon-Grosman et al., 2020b; Willoughby et al., 2020). Researchers have also questioned how cortical responses evoked by such direct stimulation of S1 with electrodes in cortex, which bypasses the peripheral nerves, may differ from those evoked by normal physiological stimulation of S1 through the peripheral receptors in the skin.
Measurements subsequent to the Penfield studies have generally supported the idea of the somatotopic homunculus running medial-laterally—one topographical dimension—in human cortex, but only recently has research begun to make progress in measuring the details of the two-dimensional topography and cortical magnification of specific body-part representations (e.g., face and hands) within S1 and S2 at higher resolution (Penfield and Boldrey, 1937; Roux et al., 2018; Schellekens et al., 2018; Willoughby et al., 2020). In the 1990s and early 2000s, several studies using both fMRI and neuromagnetic methodologies began to map out the two-dimensional representations of skin regions along the palm and/or fingertips. Despite the relatively lower spatial resolution measurements from the available technology at the time, these researchers were able to demonstrate rostral-caudal (or proximal-distal) topographical gradients for light touch, vibrations, and innocuous electrical stimulation in addition to the medial-lateral gradients of the homunculus in human S1 (Hari et al., 1993; Gelnar et al., 1998; Hashimoto et al., 1999; Francis et al., 2000; Kurth et al., 2000; Deuchert et al., 2002; Blankenburg et al., 2003). Importantly, Blankenburg et al. (2003) defined rostral-caudal gradients along the finger and palm representation that included a mirror reversal of the somatotopic gradients at the fingertip representations from Brodmann’s area 3b to area 1, two cytoarchitectural subdivisions of S1 that display preferential responses to the stimulation of cutaneous receptors (Powell and Mountcastle, 1959; Iwamura et al., 1993). As in the visual system, such gradient reversals should reflect multiple representations of the skin topography, perhaps with each map dedicated to different tactile modalities or levels of complexity of somatosensory processing (Iwamura et al., 1993; Friedman et al., 2004). Exactly how these representations form complete somatosensory field maps (SFMs) integrating the multiple tactile and nociceptive modalities remains to be seen, with larger scale mapping of the skin surface likely necessary to resolve these questions.
The recent high-magnetic-field, high-resolution (3 T and 7 T) fMRI experiments measuring S1 and S2 have made excellent progress in beginning to map out detailed regions of somatotopic organization in human non-invasively and using peripheral—and thus potentially more natural—somatosensory stimulation (e.g., vibrotactile, pneumatic, nociceptive; Sanchez-Panchuelo et al., 2010; Mancini et al., 2012; Sánchez-Panchuelo et al., 2014; Sanchez Panchuelo et al., 2018; Saadon-Grosman et al., 2020b; Willoughby et al., 2020). Converging evidence from several independent labs has demonstrated a pattern of responses for leg/foot, finger/hand, and head representations for each area that mostly matches our expectations from the prior work (Figure 12B). The most detailed examinations have been of the fingertips, with evidence for overlapping topographies for vibrotactile, pneumatic, and pain stimulation in S1 (Sanchez-Panchuelo et al., 2010; Mancini et al., 2012). Interestingly, there are suggestions that the topographies for pain and tactile inputs differ somewhat in their cortical magnification despite their overlapping location (Mancini et al., 2012). In addition, the size of the topographies of the fingertips has been correlated with tactile acuity (Duncan and Boynton, 2007). There is some suggestion that differences may exist between these maps and those initially proposed by Penfield, but the general pattern of the homunculus is largely consistent, and it is difficult to fully compare the two measurement types with the necessarily limited body-coverage of stimulation in the current studies (Saadon-Grosman et al., 2020b; Willoughby et al., 2020). Much as we see reversals from one VFM or AFM to the next, the boundary where S1 abuts S2 appears in these measurements to occur at a representation of the face/head. Finally, evidence is emerging that the somatosensory system is also loosely organized around the same dorsal/ventral perceptual streams as we see in vision and audition/language (Goodale and Milner, 1992; Hickok and Poeppel, 2007; Saadon-Grosman et al., 2020a).
It is important to remember that, in contrast to visual and auditory CFMs, somatosensory representations in human cortex have so far been measured with fMRI only at a few discrete points across the body (Figure 12A). The discrete measurement points produce a one-dimensional gradient running head to foot and/or across the tips of the fingers, rather than a map of the 2D space of the entire skin, so these regions cannot yet be termed somatosensory field maps (SFMs). A full SFM will require larger-scale mapping of two body axes, which could be defined as any paired combination of superior–inferior (rostral-caudal), anterior–posterior, medial-lateral, or distal-proximal gradients. The use of discrete points also means that differences in cortical magnification across the somatotopic map cannot yet be precisely determined (Engel et al., 1997; Brewer and Barton, 2016b). With discrete, non-abutting measurement points in a traveling-wave paradigm, the spread of the cortical response to neighboring regions that are slightly—but not preferentially—activated by the stimulation point produces an enlarged map of that region (Engel et al., 1997; Brewer et al., 2005; Wandell et al., 2007; Brewer and Barton, 2012). As our measurements advance, it will be exciting to delineate exactly how cortical magnification differs among body regions and across individuals as well as whether the putative SFMs are also arranged into a macrostructural pattern of cloverleaf clusters.
4.4 The chemical senses: topographies in gustation and olfaction
Only recently has research begun to unravel the cortical representations of the chemical senses, gustation (taste) and olfaction (smell). In the case of vision, audition, and somatosensation, there is a general understanding of which values of the sensory stimulus should be represented in a contiguous topographical cortical representation. In contrast, the mapping of the chemical senses presents a more challenging task, as the stimuli are composed of molecules that exhibit a wide range of diversity across such factors as size, charge distribution, bond saturation, functional groups, and three-dimensional structure (e.g., Chen et al., 2011; Lundstrom et al., 2011; Murthy, 2011; Auffarth, 2013; Miskovic and Anderson, 2018). Because a single, small molecule can be characterized by numerous parameters, it is nearly impossible to systematically map even a fraction of these parameters onto a 2D surface without some knowledge of what privileged parameters might have been selectively favored during evolution to serve as a fundamental organizing principle. In addition, such chemotopic organization may be more likely to produce a discrete map that combines similar receptor or molecular inputs onto target neurons rather than the continuous topographical gradients we expect for the other CFMs (Figure 13; Luo and Flanagan, 2007; Murthy, 2011; Francia and Lodovichi, 2021).
Figure 13. Olfactory topography may compose a discrete map vs. a continuous one. (A) The schematic shows an example of the continuous retinotopic map of the visual system. The diagram of the left eye is fixated on the asterisk in the red circle of the rainbow in the right visual hemifield. The light from the rainbow travels across the eye to be absorbed by the photoreceptors of the opposite, temporal side of the retina (T, temporal; N, nasal). The asterisk on the retina represents the central fovea. Neighboring points in the visual field thus activate neighboring points on the retina of the stabilized eye. The retinal ganglion cells at each retinal location maintain this retinotopic organization through their axons that project to the thalamus and then, after synapsing, to V1. The rainbow eccentricity pattern is shown from visual space, to the retina, through the axons of the optic nerve and tract, and to the colored data overlay that demonstrates the continuous eccentricity map along the medial wall of the inflated left hemisphere. Other details are as in Figure 10. (B) This diagram displays a schematic of a discrete olfactory map. The black square (left) represents a region of olfactory epithelium, with each color drop denoting an olfactory receptor neuron expressing a specific receptor type. Rather than projecting to a continuous map of stimulus dimensions, the olfactory sensory neurons expressing the same odorant receptor converge to form glomeruli in specific locations of the olfactory bulb (right). Thus the discrete topographic organization in this sensory system is based on the type rather than the spatial distribution of the sensory inputs. Rainbow colors represent a subset of specific odorant receptor types, while gray circles represent additional odorant receptor types not depicted with connections in this diagram.
While the quality of such chemical stimuli does not inherently provide a topographical criterion to predict which region of the cortex would preferentially represent a specific tastant or olfactant, it is still possible that a spatially segregated and ordered cortical representation of these molecules’ qualities may arise from genetically predetermined neural circuits in specific regions, such as those contributing to innate ecological behaviors (Luo and Flanagan, 2007; Sosulski et al., 2011; Chikazoe et al., 2019; Olofsson and Freiherr, 2019). For example, taste receptors are finely tuned to recognize specific taste types associated with distinct hedonic values and thus play a vital role in guiding food selection through reward and punishment (Rolls, 2011; Berridge and Kringelbach, 2015). Sweet receptors facilitate the identification of energy-rich nutrients like glucose, while bitter receptors are thought to serve as protection against potentially harmful substances, forming the basis of oral aversion and disgust (Accolla et al., 2007; Peng et al., 2015). Because the ability to identify food as safe to eat by taste or smell is crucial for the survival of living organisms, aspects of these senses tend to be highly conserved across species. Furthermore, their cortical processing may fine-tune each taste or smell domain to identify particular nutrients, toxic substances, chemicals associated with physiological functions, and/or hedonic attributes (Lundstrom et al., 2011; Miskovic and Anderson, 2018).
4.4.1 Gustatopic representations
The mammalian tongue possesses specialized receptors that are attuned to fundamental taste categories. In humans, five primary tastes are perceived: sweet, sour, umami, bitter, and salty. There are additional potential basic tastes, such as CO2, fat, water, pungency (e.g., spiciness or hotness), coolness, calcium, or metallicness (Chaudhari and Roper, 2010; Crouzet et al., 2015). Interestingly, CO2 contributes to the taste of carbonation through a dedicated taste-receptor mechanism (Chandrashekar et al., 2009). Such taste perception originates in the tongue from taste buds located on the circumvallate, foliate, and fungiform gustatory papillae (Roper, 2013). Taste information in mammals then follows multiple ascending pathways in the brainstem and ultimately activates primary taste cortex, thought to lie in the insular cortex, with potential secondary taste areas in the operculum (Sugita and Shiba, 2005; Kaas et al., 2006; Rolls, 2006; Chen et al., 2011; Stevenson et al., 2013). There are two primary conflicting theories of how taste information is encoded and transmitted for cortical gustatory processing (Mueller et al., 2005; Breslin and Huang, 2006; Huang et al., 2006; Chandrashekar et al., 2010). In the labeled-line model, information about a single taste type is encoded by a dedicated set of receptor cells specifically tuned for that taste. This single-taste information is then conveyed to gustatory cortex through taste-specific afferent fibers. In contrast, the across-fiber-pattern model proposes that taste information is communicated across multiple afferent fibers coding taste-type information via population codes of spatiotemporal patterns.
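The toy sketch below (Python/NumPy, with entirely made-up response profiles) contrasts the two coding schemes just described: a labeled-line code, in which each afferent fiber carries one taste, versus an across-fiber pattern code, in which taste identity must be read out from the population pattern across broadly tuned fibers.

```python
import numpy as np

# Toy contrast of the two coding schemes, with made-up response profiles for
# five afferent "fibers" responding to five tastes.
rng = np.random.default_rng(0)
tastes = ["sweet", "sour", "salty", "bitter", "umami"]

# Labeled-line: each fiber is devoted to exactly one taste (one-hot columns).
labeled_line = np.eye(5)

# Across-fiber pattern: every fiber responds to several tastes; identity is
# carried by the whole population pattern rather than by any single fiber.
across_fiber = rng.uniform(0.2, 1.0, size=(5, 5))

def decode(pattern, code_matrix):
    """Return the taste whose stored population pattern is nearest."""
    dists = [np.linalg.norm(pattern - code_matrix[:, k]) for k in range(5)]
    return tastes[int(np.argmin(dists))]

# A noisy presentation of "bitter" is recovered under either scheme, but only
# the across-fiber code requires reading the full population to do so.
for name, code in (("labeled-line", labeled_line), ("across-fiber", across_fiber)):
    observed = code[:, tastes.index("bitter")] + 0.05 * rng.standard_normal(5)
    print(name, "decoded as", decode(observed, code))
```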
While the primary gustatory cortex in other animals has been shown to distinctly represent these taste categories, identifying them in humans was challenging for many years (Sugita and Shiba, 2005; Accolla et al., 2007; Chen et al., 2011, 2021). Over the last decade, gustatory stimuli in human have been shown to activate various cortical areas, including the insula, frontal operculum, parietal operculum, and orbitofrontal cortex. In addition, other measurements have demonstrated that the human insula represents at least two interrelated gustatory parameters: taste qualities and their palatability (i.e., trial-by-trial hedonic responses; Chikazoe et al., 2014; Crouzet et al., 2015). The insula also receives sensory inputs from visceral organs, including information about gastric distension, temperature, and pain, which may overlap with inputs originating from the chemosensory receptors in the tongue and oral cavity, to generate a comprehensive interoceptive system (Craig, 2009). The integration of these separate gustatory inputs into a comprehensive cortical representation, along with essential inputs from the other sensory modalities, could facilitate our intricate experiences of palatability and higher-order flavor perception (Yeshurun and Sobel, 2010a).
Very recent updates from a handful of studies have measured topographical representations of basic taste stimuli at the putative location of a primary gustatory cortex in human along the insular cortex and adjacent operculum (Prinster et al., 2017; Chikazoe et al., 2019). These studies utilized high-field (7 T) fMRI to delineate some or all of the five key tastes plus the perception of CO2, using both traveling-wave and multivoxel pattern analyses to examine the various taste responses (Figure 14A). Although the exact pattern of activation of the insula by specific taste categories appears to be somewhat variable across subjects, these studies do demonstrate a consistent topographical mapping of gustatory information in this region (Figure 14B). This is a very exciting finding that supports CFMs as a fundamental organizing principle in the chemical senses as well. It is important to note, however, that these gustatory measurements span a broad area of the cortical sheet, likely larger than the surface area of V1, with broad and scattered regions of specific basic tastes. Whether this entire region contains just one gustatory field map (GFM) or, more likely, multiple GFMs each subserving unique gustatory computations remains to be seen. In addition, these representations of specific taste types compose only one dimension of the purported GFMs. A second dimension could arise from a range of parameters, from low-level properties such as molecular parameters (other than those contributing to the first basic-taste dimension) or taste intensity to higher-level properties such as palatability/hedonic value.
Figure 14. One dimension of gustatory field maps has been identified for taste representations in human insular cortex. (A) The schematic displays an example organization of one taste-and-rinse gustatory stimulus cycle, based on the gustatory mapping paradigm presented in Prinster et al. (2017). The stimulus begins with an auditory cue (black speaker) that a taste-test solution (orange) is about to be delivered. The tastant is delivered via an injection for 2 s (solid orange bar) and then held in the mouth for tasting for an additional 10 s (orange striped bar). Full tasting time (open mouth icon) includes injection time and time the solution is held in the mouth (12 s in this example). A second auditory cue (brown speaker) next signals the start of the swallowing period (green bar), which is followed by a longer rest period (closed mouth icon). After each test solution, water (gray) is used in the same paradigm for rinsing between taste tests. The cycle then repeats with the next tastant. fMRI data acquisition is continuous, with a TR = 2 s in this example. (B) Colored ROIs denote estimated locations for selective cortical responses based on group-averaged data from Prinster et al. (2017) for the six primary receptor-mediated tastants: sour (red), sweet (yellow), salty (green), bitter (cyan), CO2 (navy), and umami (magenta). ROIs are overlaid on a 3-D inflated rendering of a left hemisphere, with gyri marked in light gray and sulci in dark gray. Even with some overlap of tastant responses, a topographical organization of these six principal tastes can be seen. CS, central sulcus; LS, lateral sulcus; STG, superior temporal gyrus; STS, superior temporal sulcus. Anatomical-directions legend: S, superior; I, inferior; A, anterior; P, posterior.
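To make the paradigm timing concrete, the sketch below (Python/NumPy) builds a simple boxcar regressor for one taste-and-rinse cycle sampled at the TR. The 2 s delivery, 10 s hold, and TR = 2 s follow the figure; the cue, swallow, and rest durations are placeholder assumptions, not the published values.

```python
import numpy as np

TR = 2.0
events = [                      # (label, duration in seconds)
    ("cue",     2.0),           # assumed cue length
    ("tastant", 12.0),          # 2 s injection + 10 s held in mouth (per figure)
    ("cue",     2.0),
    ("swallow", 4.0),           # assumed
    ("rest",    10.0),          # assumed
]
# The same sequence then repeats with water as the rinse "tastant".
cycle = events + [(lbl if lbl != "tastant" else "rinse", dur) for lbl, dur in events]

def boxcar(events, label, tr=TR):
    """1 during volumes acquired while `label` is on, else 0."""
    onsets, t = [], 0.0
    for lbl, dur in events:
        if lbl == label:
            onsets.append((t, t + dur))
        t += dur
    n_vols = int(np.ceil(t / tr))
    times = np.arange(n_vols) * tr
    return np.array([any(a <= ti < b for a, b in onsets) for ti in times], int)

taste_regressor = boxcar(cycle, "tastant")
print(taste_regressor)
```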
4.4.2 Olfactory topographies
The perception of odors similarly begins in mammals with the recognition of odorant molecules by a diverse set of approximately 1,000 different odorant receptor types (Vassar et al., 1994; Fleischer et al., 2009; Yeshurun and Sobel, 2010b). These seven-transmembrane receptors are expressed on the olfactory sensory neurons (OSN; also known as olfactory receptor neurons) in the olfactory sensory epithelium (OE) along the posterior/superior aspect of the nasal sinuses (Buck and Axel, 1991). Each OSN expresses a single olfactory receptor gene, and each gene has its own contiguous expression area within the OE that overlaps with those of its neighbors (Vassar et al., 1993; Miyamichi et al., 2005). Each olfactory receptor not only can interact with a diverse set of odorants, but also demonstrates high specificity for particular olfactants. Thus, subtle alterations in the structure of an odorant molecule can often cause major changes in the perceived odor. In fact, humans are astonishingly adept at olfactory discrimination, with one study suggesting that they can distinguish more than one trillion olfactory stimuli (Ressler et al., 1994; Bushdid et al., 2014). This sensory prowess likely reflects the vast options for combined outputs from ∼400 different subtypes of olfactory receptors (Gilad and Lancet, 2003; Auffarth, 2013). Since the complement of olfactory receptor genes varies by about 30% across individuals, each person’s olfactory epithelium is composed of a potentially quite different set of olfactory receptors that may even produce unique olfactory perception in each individual (Menashe et al., 2003; Keller et al., 2007; Secundo et al., 2015).
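As a back-of-the-envelope illustration of that combinatorial capacity (and not a claim about the actual neural code), the snippet below counts how many distinct subsets of co-active receptors are possible given roughly 400 human receptor subtypes; the chosen subset sizes are arbitrary.

```python
from math import comb

# Back-of-the-envelope only: if an odor percept were defined solely by WHICH
# receptor subtypes respond, ~400 human subtypes already allow an enormous
# number of distinct activation patterns.
n_receptors = 400
for k in (3, 10, 30):          # hypothetical numbers of co-activated receptors
    n_patterns = comb(n_receptors, k)
    print(f"{k:>2} of {n_receptors} receptors active: "
          f"{float(n_patterns):.2e} possible combinations")
```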
OSNs are bipolar cells that express their olfactory receptors on their apical dendrites in the nasal cavity and project their axons through the cribriform plate of the skull to synapse with the mitral and tufted cells of the olfactory bulbs. OSNs expressing the same olfactory receptor synapse onto a single specific glomerulus within an olfactory bulb (Figure 13B; Mori et al., 1999; Imai et al., 2010; Murthy, 2011; Francia and Lodovichi, 2021). This convergence of OSN projections with matching receptor types in the glomeruli is facilitated by the expression of the same olfactory receptors at both the OSNs’ dendrites in the OE and their synapses in the olfactory bulbs (Mori et al., 1999; Cho et al., 2009; Lodovichi, 2020). A discrete sensory map is thus formed here around the identity of the odorant receptors (Mombaerts et al., 1996; Luo and Flanagan, 2007). Ma et al. (2012) suggest that the organization of the glomeruli in mice is then based on loosely grouping together glomeruli with similar tuning to specific molecular properties, such as esters or ketones. Such a “tunotopic” map may aid odor discrimination by enhancing the contrast among similar odors. Ultimately, each odorant produces a unique pattern of activity across the glomeruli in the olfactory bulbs that is stable over at least several months (Kato et al., 2012). While the majority of studies of these topics rely on various animal models, these stages of olfactory processing appear to be highly conserved across mammals, including humans (Zelano and Sobel, 2005; Echevarria-Cooper et al., 2022).
Despite our understanding of many aspects of this early olfactory processing, it has proven difficult to determine what organization may be present within the next stages of olfactory processing in the mammalian piriform cortex. For many years, piriform cortex was generally thought to lack topographical organization (Murthy, 2011; Sosulski et al., 2011). Without clear evidence for such topography, several studies have proposed that these neural computations rely on experience-based plasticity across the lifespan to develop and update the necessary olfactory processing circuits (e.g., Babadi and Sompolinsky, 2014; Schaffer et al., 2018; Hiratani and Latham, 2020; Schoonover et al., 2021). However, a very recent study in mice has utilized cutting-edge neuroanatomical techniques to map the brain-wide projections among thousands of individual neurons in the olfactory bulb and piriform cortex, a much larger sample than prior work was able to achieve (Chen et al., 2022). Their results suggest that olfactory cortex connectivity is in fact spatially structured. An olfactory bulb neuron (i.e., mitral cell) projects both to a particular location along the anterior–posterior axis of piriform cortex and to matched, functionally distinct cortical targets outside of the piriform. In addition, single neurons from the piriform project to the same extra-piriform targets as their matched olfactory bulb neurons. This triadic circuit organization that routes olfactory information to functionally distinct regions of cortex is quite compelling, as it positions olfaction as having a similar framework of coordinated, parallel processing pathways for sensory information as we see in the other sensory systems (Chon et al., 2020; Imamura et al., 2020; Francia and Lodovichi, 2021; Chen et al., 2022). It remains to be seen whether such olfactory representations also compose CFMs, cloverleaf clusters, or dorsal/ventral processing streams and whether these measurements in mice are applicable to humans as we expect. Based upon the current findings in gustatory cortex, one avenue to investigate potential olfactory field maps (OFMs) would be to search for topographical representations of olfactant molecular properties, concentration/intensity, or palatability.
5 Discussion
Common schemes of topographical organization are thus emerging across human sensory systems. Visual and auditory cortices are compartmentalized into CFMs that are themselves arranged on a larger scale into cloverleaf clusters. This fundamental organization likely provides a structure for the complex processing and analysis of inputs from the peripheral sensory receptors. Human somatosensory cortex shares with these two senses similar parallel processing pathways as well as a loose division into dorsal and ventral streams. Ongoing investigations of the somatotopic maps in human S1 and S2 are beginning to reveal the details of the SFMs in humans. Despite the differences in the discrete properties of the molecular stimuli in the chemical senses, recent studies in humans and animal models suggest that gustation and olfaction may utilize similar topographies as well. Knowledge of how these topographical representations are organized across cortex provides us with insight into how our conscious perceptions are created from our basic sensory inputs. The detailed examination of these CFMs and clusters in individual humans and across species can be applied to the careful analysis of the computational stages of sensory processing. In addition, studying how these representations change during development, trauma, and disease serves as an important tool for developing improvements in clinical therapies and rehabilitation for sensory deficits.
Author contributions
AB and BB conceived of the concept and developed the discussion. AB wrote the manuscript. BB revised it. All authors contributed to the article and approved the submitted version.
Funding
This work was supported in part by research grant #1329255 from the National Science Foundation Cognitive Sciences Program and by startup funds from the Department of Cognitive Sciences at the University of California, Irvine.
Conflict of interest
The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
Publisher’s note
All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.
References
Accolla, R., Bathellier, B., Petersen, C. C., and Carleton, A. (2007). Differential spatial representation of taste modalities in the rat gustatory cortex. J. Neurosci. 27, 1396–1404. doi: 10.1523/JNEUROSCI.5188-06.2007
Adams, D. L., Sincich, L. C., and Horton, J. C. (2007). Complete pattern of ocular dominance columns in human primary visual cortex. J. Neurosci. 27, 10391–10403. doi: 10.1523/JNEUROSCI.2923-07.2007
Amano, K., Wandell, B. A., and Dumoulin, S. O. (2009). Visual field maps, population receptive field sizes, and visual field coverage in the human MT+ complex. J. Neurophysiol. 102, 2704–2718. doi: 10.1152/jn.00102.2009
Arcaro, M. J., McMains, S. A., Singer, B. D., and Kastner, S. (2009). Retinotopic organization of human ventral visual cortex. J. Neurosci. 29, 10638–10652. doi: 10.1523/JNEUROSCI.2807-09.2009
Arienzo, D., Babiloni, C., Ferretti, A., Caulo, M., del Gratta, C., Tartaro, A., et al. (2006). Somatotopy of anterior cingulate cortex (ACC) and supplementary motor area (SMA) for electric stimulation of the median and tibial nerves: an fMRI study. Neuroimage 33, 700–705. doi: 10.1016/j.neuroimage.2006.06.030
Auffarth, B. (2013). Understanding smell--the olfactory stimulus problem. Neurosci. Biobehav. Rev. 37, 1667–1679. doi: 10.1016/j.neubiorev.2013.06.009
Babadi, B., and Sompolinsky, H. (2014). Sparseness and expansion in sensory representations. Neuron 83, 1213–1226. doi: 10.1016/j.neuron.2014.07.035
Bandettini, P. A., Jesmanowicz, A., Van Kylen, J., Birn, R. M., and Hyde, J. S. (1998). Functional MRI of brain activation induced by scanner acoustic noise. Magn. Reson. Med. 39, 410–416. doi: 10.1002/mrm.1910390311
Bartels, A., and Zeki, S. (2000). The architecture of the colour Centre in the human visual brain: new results and a review. Eur. J. Neurosci. 12, 172–193. doi: 10.1046/j.1460-9568.2000.00905.x
Bartfeld, E., and Grinvald, A. (1992). Relationships between orientation-preference pinwheels, cytochrome oxidase blobs, and ocular-dominance columns in primate striate cortex. Proc. Natl. Acad. Sci. U. S. A. 89, 11905–11909. doi: 10.1073/pnas.89.24.11905
Barton, B., and Brewer, A. A. (2015). fMRI of the rod scotoma elucidates cortical rod pathways and implications for lesion measurements. Proc. Natl. Acad. Sci. U. S. A. 112, 5201–5206. doi: 10.1073/pnas.1423673112
Barton, B., and Brewer, A. A. (2017). Visual field map clusters in high-order visual processing: organization of V3A/V3B and a new cloverleaf cluster in the posterior superior temporal sulcus. Front. Integr. Neurosci. 11:4. doi: 10.3389/fnint.2017.00004
Barton, B., Venezia, J. H., Saberi, K., Hickok, G., and Brewer, A. A. (2012). Orthogonal acoustic dimensions define auditory field maps in human cortex. Proc. Natl. Acad. Sci. U. S. A. 109, 20738–20743. doi: 10.1073/pnas.1213381109
Baseler, H. A., Brewer, A. A., Sharpe, L. T., Morland, A. B., Jagle, H., and Wandell, B. A. (2002). Reorganization of human cortical maps caused by inherited photoreceptor abnormalities. Nat. Neurosci. 5, 364–370. doi: 10.1038/nn817
Baseler, H. A., Gouws, A., Haak, K. V., Racey, C., Crossland, M. D., Tufail, A., et al. (2011). Large-scale remapping of visual cortex is absent in adult humans with macular degeneration. Nat. Neurosci. 14, 649–655. doi: 10.1038/nn.2793
Baumann, S., Griffiths, T. D., Sun, L., Petkov, C. I., Thiele, A., and Rees, A. (2011). Orthogonal representation of sound dimensions in the primate midbrain. Nat. Neurosci. 14, 423–425. doi: 10.1038/nn.2771
Baumann, S., Joly, O., Rees, A., Petkov, C. I., Sun, L., Thiele, A., et al. (2015). The topography of frequency and time representation in primate auditory cortices. Elife 4:3256. doi: 10.7554/eLife.03256
Benson, N. C., Butt, O. H., Datta, R., Radoeva, P. D., Brainard, D. H., and Aguirre, G. K. (2012). The retinotopic organization of striate cortex is well predicted by surface topology. Curr. Biol. 22, 2081–2085. doi: 10.1016/j.cub.2012.09.014
Benson, N. C., Jamison, K. W., Arcaro, M. J., Vu, A. T., Glasser, M. F., Coalson, T. S., et al. (2018). The human connectome project 7 tesla retinotopy dataset: description and population receptive field analysis. J. Vis. 18:23. doi: 10.1167/18.13.23
Berridge, K. C., and Kringelbach, M. L. (2015). Pleasure systems in the brain. Neuron 86, 646–664. doi: 10.1016/j.neuron.2015.02.018
Besle, J., Sanchez-Panchuelo, R. M., Bowtell, R., Francis, S., and Schluppeck, D. (2013). Single-subject fMRI mapping at 7 T of the representation of fingertips in S1: a comparison of event-related and phase-encoding designs. J. Neurophysiol. 109, 2293–2305. doi: 10.1152/jn.00499.2012
Besle, J., Sanchez-Panchuelo, R. M., Bowtell, R., Francis, S., and Schluppeck, D. (2014). Event-related fMRI at 7T reveals overlapping cortical representations for adjacent fingertips in S1 of individual subjects. Hum. Brain Mapp. 35, 2027–2043. doi: 10.1002/hbm.22310
Binda, P., Thomas, J. M., Boynton, G. M., and Fine, I. (2013). Minimizing biases in estimating the reorganization of human visual areas with BOLD retinotopic mapping. J. Vis. 13:13. doi: 10.1167/13.7.13
Blankenburg, F., Ruben, J., Meyer, R., Schwiemann, J., and Villringer, A. (2003). Evidence for a rostral-to-caudal somatotopic organization in human primary somatosensory cortex with mirror-reversal in areas 3b and 1. Cereb. Cortex 13, 987–993. doi: 10.1093/cercor/13.9.987
Bonhoeffer, T., and Grinvald, A. (1991). Iso-orientation domains in cat visual cortex are arranged in pinwheel-like patterns. Nature 353, 429–431. doi: 10.1038/353429a0
Boussaoud, D., Desimone, R., and Ungerleider, L. G. (1991). Visual topography of area TEO in the macaque. J. Comp. Neurol. 306, 554–575. doi: 10.1002/cne.903060403
Breslin, P. A. S., and Huang, L. (2006). Human taste: peripheral anatomy, taste transduction, and coding. Adv. Otorhinolaryngol. 63, 152–190. doi: 10.1159/000093760
Brewer, A. A., and Barton, B. (2012). “Visual field map organization in human visual cortex” in Visual cortex—current status and perspectives. eds. S. Molotchnikoff and J. Rouat (Rijeka, Croatia: InTech), 29–60.
Brewer, A. A., and Barton, B. (2014). “Developmental plasticity: FMRI investigations into human visual cortex” in Advanced brain neuroimaging topics in health and disease—methods and applications. eds. T. D. Papageorgiou, G. Christopoulos, and S. Smirnakis (Rijeka, Croatia: InTech), 305–334.
Brewer, A. A., and Barton, B. (2016a). “Human auditory cortex” in Neurobiology of language. eds. G. Hickok and S. L. Small (Cambridge: Academic Press, Elsevier), 49–58.
Brewer, A. A., and Barton, B. (2016b). Maps of the auditory cortex. Annu. Rev. Neurosci. 39, 385–407. doi: 10.1146/annurev-neuro-070815-014045
Brewer, A. A., and Barton, B. (2018). “Cloverleaf clusters: a common Macrostructural Organization across human visual and auditory cortex” in Sensory nervous system. ed. T. Heinbockel (London, UK: IntechOpen), 127–160.
Brewer, A. A., Liu, J., Wade, A. R., and Wandell, B. A. (2005). Visual field maps and stimulus selectivity in human ventral occipital cortex. Nat. Neurosci. 8, 1102–1109. doi: 10.1038/nn1507
Brewer, A. A., Press, W. A., Logothetis, N. K., and Wandell, B. A. (2002). Visual areas in macaque cortex measured using functional magnetic resonance imaging. J. Neurosci. 22, 10416–10426. doi: 10.1523/JNEUROSCI.22-23-10416.2002
Buck, L., and Axel, R. (1991). A novel multigene family may encode odorant receptors: a molecular basis for odor recognition. Cell 65, 175–187. doi: 10.1016/0092-8674(91)90418-x
Bushdid, C., Magnasco, M. O., Vosshall, L. B., and Keller, A. (2014). Humans can discriminate more than 1 trillion olfactory stimuli. Science 343, 1370–1372. doi: 10.1126/science.1249168
Chandrashekar, J., Kuhn, C., Oka, Y., Yarmolinsky, D. A., Hummler, E., Ryba, N. J., et al. (2010). The cells and peripheral representation of sodium taste in mice. Nature 464, 297–301. doi: 10.1038/nature08783
Chandrashekar, J., Yarmolinsky, D., von Buchholtz, L., Oka, Y., Sly, W., Ryba, N. J., et al. (2009). The taste of carbonation. Science 326, 443–445. doi: 10.1126/science.1174601
Chang, K. H., Thomas, J. M., Boynton, G. M., and Fine, I. (2017). Reconstructing tone sequences from functional magnetic resonance imaging blood-oxygen level dependent responses within human primary auditory cortex. Front. Psychol. 8:1983. doi: 10.3389/fpsyg.2017.01983
Chaudhari, N., and Roper, S. D. (2010). The cell biology of taste. J. Cell Biol. 190, 285–296. doi: 10.1083/jcb.201003144
Chen, Y., Chen, X., Baserdem, B., Zhan, H., Li, Y., Davis, M. B., et al. (2022). High-throughput sequencing of single neuron projections reveals spatial organization in the olfactory cortex. Cell 185, 4117–4134.e28. doi: 10.1016/j.cell.2022.09.038
Chen, X., Gabitto, M., Peng, Y., Ryba, N. J., and Zuker, C. S. (2011). A gustotopic map of taste qualities in the mammalian brain. Science 333, 1262–1266. doi: 10.1126/science.1204076
Chen, K., Kogan, J. F., and Fontanini, A. (2021). Spatially distributed representation of taste quality in the gustatory insular cortex of behaving mice. Curr. Biol. 31, 247–256.e4. doi: 10.1016/j.cub.2020.10.014
Chikazoe, J., Lee, D. H., Kriegeskorte, N., and Anderson, A. K. (2014). Population coding of affect across stimuli, modalities and individuals. Nat. Neurosci. 17, 1114–1122. doi: 10.1038/nn.3749
Chikazoe, J., Lee, D. H., Kriegeskorte, N., and Anderson, A. K. (2019). Distinct representations of basic taste qualities in human gustatory cortex. Nat. Commun. 10:1048. doi: 10.1038/s41467-019-08857-z
Chklovskii, D. B., and Koulakov, A. A. (2004). Maps in the brain: what can we learn from them? Annu. Rev. Neurosci. 27, 369–392. doi: 10.1146/annurev.neuro.27.070203.144226
Cho, J. H., Prince, J. E., and Cloutier, J. F. (2009). Axon guidance events in the wiring of the mammalian olfactory system. Mol. Neurobiol. 39, 1–9. doi: 10.1007/s12035-008-8047-7
Chon, U., LaFever, B. J., Nguyen, U., Kim, Y., and Imamura, F. (2020). Topographically distinct projection patterns of early-generated and late-generated projection neurons in the mouse olfactory bulb. eNeuro 7:ENEURO.0369-20.2020. doi: 10.1523/ENEURO.0369-20.2020
Clarke, S., and Morosan, P. (2012). “Architecture, connectivity, and transmitter receptors of human auditory cortex” in The human auditory cortex. eds. D. Poeppel, T. Overath, A. Popper, and R. Richard (New York: Springer), 11–38.
Collins, D. L., Neelin, P., Peters, T. M., and Evans, A. C. (1994). Automatic 3D intersubject registration of MR volumetric data in standardized Talairach space. J. Comput. Assist. Tomogr. 18, 192–205. doi: 10.1097/00004728-199403000-00005
Craig, A. D. (2009). How do you feel--now? The anterior insula and human awareness. Nat. Rev. Neurosci. 10, 59–70. doi: 10.1038/nrn2555
Crawford, M. (2003). Hox genes as synchronized temporal regulators: implications for morphological innovation. J. Exp. Zool. B Mol. Dev. Evol. 295, 1–11. doi: 10.1002/jez.b.2
Crouzet, S. M., Busch, N. A., and Ohla, K. (2015). Taste quality decoding parallels taste sensations. Curr. Biol. 25, 890–896. doi: 10.1016/j.cub.2015.01.057
Desimone, R., and Schein, S. J. (1987). Visual properties of neurons in area V4 of the macaque: sensitivity to stimulus form. J. Neurophysiol. 57, 835–868. doi: 10.1152/jn.1987.57.3.835
Deuchert, M., Ruben, J., Schwiemann, J., Meyer, R., Thees, S., Krause, T., et al. (2002). Event-related fMRI of the somatosensory system using electrical finger stimulation. Neuroreport 13, 365–369. doi: 10.1097/00001756-200203040-00023
DeYoe, E. A., Carman, G. J., Bandettini, P., Glickman, S., Wieser, J., Cox, R., et al. (1996). Mapping striate and extrastriate visual areas in human cerebral cortex. Proc. Natl. Acad. Sci. U. S. A. 93, 2382–2386. doi: 10.1073/pnas.93.6.2382
DiCarlo, J. J., and Maunsell, J. H. (2003). Anterior inferotemporal neurons of monkeys engaged in object recognition can be highly sensitive to object retinal position. J. Neurophysiol. 89, 3264–3278. doi: 10.1152/jn.00358.2002
DiCarlo, J. J., Zoccolan, D., and Rust, N. C. (2012). How does the brain solve visual object recognition? Neuron 73, 415–434. doi: 10.1016/j.neuron.2012.01.010
Dick, F., Tierney, A. T., Lutti, A., Josephs, O., Sereno, M. I., and Weiskopf, N. (2012). In vivo functional and myeloarchitectonic mapping of human primary auditory areas. J. Neurosci. 32, 16095–16105. doi: 10.1523/JNEUROSCI.1712-12.2012
Disbrow, E., Roberts, T., and Krubitzer, L. (2000). Somatotopic organization of cortical fields in the lateral sulcus of Homo sapiens: evidence for SII and PV. J. Comp. Neurol. 418, 1–21. doi: 10.1002/(sici)1096-9861(20000228)418:1<1::aid-cne1>3.0.co;2-p
Dougherty, R. F., Ben-Shachar, M., Bammer, R., Brewer, A. A., and Wandell, B. A. (2005). Functional organization of human occipital-callosal fiber tracts. Proc. Natl. Acad. Sci. U. S. A. 102, 7350–7355. doi: 10.1073/pnas.0500003102
Dougherty, R. F., Koch, V. M., Brewer, A. A., Fischer, B., Modersitzki, J., and Wandell, B. A. (2003). Visual field representations and locations of visual areas V1/2/3 in human visual cortex. J. Vis. 3, 586–598. doi: 10.1167/3.10.1
Du, Y. P., Dalwani, M., Wylie, K., Claus, E., and Tregellas, J. R. (2007). Reducing susceptibility artifacts in fMRI using volume-selective z-shim compensation. Magn. Reson. Med. 57, 396–404. doi: 10.1002/mrm.21150
Dumoulin, S. O., and Wandell, B. A. (2008). Population receptive field estimates in human visual cortex. Neuroimage 39, 647–660. doi: 10.1016/j.neuroimage.2007.09.034
Duncan, R. O., and Boynton, G. M. (2007). Tactile hyperacuity thresholds correlate with finger maps in primary somatosensory cortex (S1). Cereb. Cortex 17, 2878–2891. doi: 10.1093/cercor/bhm015
Echevarria-Cooper, S. L., Zhou, G., Zelano, C., Pestilli, F., Parrish, T. B., and Kahnt, T. (2022). Mapping the microstructure and Striae of the human olfactory tract with diffusion MRI. J. Neurosci. 42, 58–68. doi: 10.1523/JNEUROSCI.1552-21.2021
Eickhoff, S. B., Grefkes, C., Zilles, K., and Fink, G. R. (2007). The somatotopic organization of cytoarchitectonic areas on the human parietal operculum. Cereb. Cortex 17, 1800–1811. doi: 10.1093/cercor/bhl090
Eickhoff, S. B., Lotze, M., Wietek, B., Amunts, K., Enck, P., and Zilles, K. (2006). Segregation of visceral and somatosensory afferents: an fMRI and cytoarchitectonic mapping study. Neuroimage 31, 1004–1014. doi: 10.1016/j.neuroimage.2006.01.023
Engel, S. A., Glover, G. H., and Wandell, B. A. (1997). Retinotopic organization in human visual cortex and the spatial precision of functional MRI. Cereb. Cortex 7, 181–192. doi: 10.1093/cercor/7.2.181
Engel, S. A., Rumelhart, D. E., Wandell, B. A., Lee, A. T., Glover, G. H., Chichilnisky, E. J., et al. (1994). fMRI of human visual cortex. Nature 369:525. doi: 10.1038/369525a0
Epstein, R. A., and Baker, C. I. (2019). Scene perception in the human brain. Annu Rev Vis Sci 5, 373–397. doi: 10.1146/annurev-vision-091718-014809
Fattori, P., Pitzalis, S., and Galletti, C. (2009). The cortical visual area V6 in macaque and human brains. J. Physiol. Paris 103, 88–97. doi: 10.1016/j.jphysparis.2009.05.012
Fleischer, J., Breer, H., and Strotmann, J. (2009). Mammalian olfactory receptors. Front. Cell. Neurosci. 3:9. doi: 10.3389/neuro.03.009.2009
Formisano, E., Kim, D. S., Di Salle, F., van de Moortele, P. F., Ugurbil, K., and Goebel, R. (2003). Mirror-symmetric tonotopic maps in human primary auditory cortex. Neuron 40, 859–869. doi: 10.1016/s0896-6273(03)00669-x
Francia, S., and Lodovichi, C. (2021). The role of the odorant receptors in the formation of the sensory map. BMC Biol. 19:174. doi: 10.1186/s12915-021-01116-y
Francis, S. T., Kelly, E. F., Bowtell, R., Dunseath, W. J., Folger, S. E., and McGlone, F. (2000). fMRI of the responses to vibratory stimulation of digit tips. Neuroimage 11, 188–202. doi: 10.1006/nimg.2000.0541
Friedman, R. M., Chen, L. M., and Roe, A. W. (2004). Modality maps within primate somatosensory cortex. Proc. Natl. Acad. Sci. U. S. A. 101, 12724–12729. doi: 10.1073/pnas.0404884101
Gaab, N., Gabrieli, J. D., and Glover, G. H. (2007). Assessing the influence of scanner background noise on auditory processing. II. An fMRI study comparing auditory processing in the absence and presence of recorded scanner noise using a sparse design. Hum. Brain Mapp. 28, 721–732. doi: 10.1002/hbm.20299
Galaburda, A. M., and Pandya, D. N. (1983). The intrinsic architectonic and connectional organization of the superior temporal region of the rhesus monkey. J. Comp. Neurol. 221, 169–184. doi: 10.1002/cne.902210206
Galaburda, A., and Sanides, F. (1980). Cytoarchitectonic organization of the human auditory cortex. J. Comp. Neurol. 190, 597–610. doi: 10.1002/cne.901900312
Gattass, R., Sousa, A. P., and Gross, C. G. (1988). Visuotopic organization and extent of V3 and V4 of the macaque. J. Neurosci. 8, 1831–1845. doi: 10.1523/JNEUROSCI.08-06-01831.1988
Gelnar, P. A., Krauss, B. R., Szeverenyi, N. M., and Apkarian, A. V. (1998). Fingertip representation in the human somatosensory cortex: an fMRI study. Neuroimage 7, 261–283. doi: 10.1006/nimg.1998.0341
Gilad, Y., and Lancet, D. (2003). Population differences in the human functional olfactory repertoire. Mol. Biol. Evol. 20, 307–314. doi: 10.1093/molbev/msg013
Gilaie-Dotan, S., Saygin, A. P., Lorenzi, L. J., Egan, R., Rees, G., and Behrmann, M. (2013). The role of human ventral visual cortex in motion perception. Brain 136, 2784–2798. doi: 10.1093/brain/awt214
Gonzalez Ballester, M. A., Zisserman, A. P., and Brady, M. (2002). Estimation of the partial volume effect in MRI. Med. Image Anal. 6, 389–405. doi: 10.1016/s1361-8415(02)00061-0
Goodale, M. A., and Milner, A. D. (1992). Separate visual pathways for perception and action. Trends Neurosci. 15, 20–25. doi: 10.1016/0166-2236(92)90344-8
Grill-Spector, K., Weiner, K. S., Kay, K., and Gomez, J. (2017). The functional neuroanatomy of human face perception. Annu Rev Vis Sci 3, 167–196. doi: 10.1146/annurev-vision-102016-061214
Grinvald, A., Lieke, E., Frostig, R. D., Gilbert, C. D., and Wiesel, T. N. (1986). Functional architecture of cortex revealed by optical imaging of intrinsic signals. Nature 324, 361–364. doi: 10.1038/324361a0
Haak, K. V., Winawer, J., Harvey, B. M., Renken, R., Dumoulin, S. O., Wandell, B. A., et al. (2012). Connective field modeling. Neuroimage 66, 376–384. doi: 10.1016/j.neuroimage.2012.10.037
Hagen, M. C., Zald, D. H., Thornton, T. A., and Pardo, J. V. (2002). Somatosensory processing in the human inferior prefrontal cortex. J. Neurophysiol. 88, 1400–1406. doi: 10.1152/jn.2002.88.3.1400
Hagler, D. J., Riecke, L., and Sereno, M. I. (2007). Parietal and superior frontal visuospatial maps activated by pointing and saccades. Neuroimage 35, 1562–1577. doi: 10.1016/j.neuroimage.2007.01.033
Hagler, D. J., and Sereno, M. I. (2006). Spatial maps in frontal and prefrontal cortex. Neuroimage 29, 567–577. doi: 10.1016/j.neuroimage.2005.08.058
Hari, R., Karhu, J., Hämäläinen, M., Knuutila, J., Salonen, O., Sams, M., et al. (1993). Functional organization of the human first and second somatosensory cortices: a neuromagnetic study. Eur. J. Neurosci. 5, 724–734. doi: 10.1111/j.1460-9568.1993.tb00536.x
Hashimoto, I., Saito, Y., Iguchi, Y., Kimura, T., Fukushima, T., Terasaki, O., et al. (1999). Distal-proximal somatotopy in the human hand somatosensory cortex: a reappraisal. Exp. Brain Res. 129, 0467–0472. doi: 10.1007/s002210050915
Hedges, S., and Kumar, S. (2003). Genomic clocks and evolutionary timescales. Trends Genet. 19, 200–206. doi: 10.1016/S0168-9525(03)00053-2
Herdener, M., Esposito, F., Scheffler, K., Schneider, P., Logothetis, N. K., Uludag, K., et al. (2013). Spatial representations of temporal and spectral sound cues in human auditory cortex. Cortex 49, 2822–2833. doi: 10.1016/j.cortex.2013.04.003
Hickok, G., and Poeppel, D. (2007). The cortical organization of speech processing. Nat. Rev. Neurosci. 8, 393–402. doi: 10.1038/nrn2113
Hinds, O. P., Rajendran, N., Polimeni, J. R., Augustinack, J. C., Wiggins, G., Wald, L. L., et al. (2008). Accurate prediction of V1 location from cortical folds in a surface coordinate system. Neuroimage 39, 1585–1599. doi: 10.1016/j.neuroimage.2007.10.033
Hiratani, N., and Latham, P. E. (2020). Rapid Bayesian learning in the mammalian olfactory system. Nat. Commun. 11:3845. doi: 10.1038/s41467-020-17490-0
Hoffmann, M. B., Seufert, P. S., and Schmidtborn, L. C. (2007). Perceptual relevance of abnormal visual field representations: static visual field perimetry in human albinism. Br. J. Ophthalmol. 91, 509–513. doi: 10.1136/bjo.2006.094854
Holland, P. W., and Takahashi, T. (2005). The evolution of homeobox genes: implications for the study of brain development. Brain Res. Bull. 66, 484–490. doi: 10.1016/j.brainresbull.2005.06.003
Horton, J. C., Hocking, D. R., and Kiorpes, L. (1997). Pattern of ocular dominance columns and cytochrome oxidase activity in a macaque monkey with naturally occurring anisometropic amblyopia. Vis. Neurosci. 14, 681–689. doi: 10.1017/s0952523800012645
Huang, A. L., Chen, X., Hoon, M. A., Chandrashekar, J., Guo, W., Tränkner, D., et al. (2006). The cells and logic for mammalian sour taste detection. Nature 442, 934–938. doi: 10.1038/nature05084
Huang, R. S., Chen, C. F., Tran, A. T., Holstein, K. L., and Sereno, M. I. (2012). Mapping multisensory parietal face and body areas in humans. Proc. Natl. Acad. Sci. U. S. A. 109, 18114–18119. doi: 10.1073/pnas.1207946109
Huk, A. C., Dougherty, R. F., and Heeger, D. J. (2002). Retinotopy and functional subdivision of human areas MT and MST. J. Neurosci. 22, 7195–7205. doi: 10.1523/JNEUROSCI.22-16-07195.2002
Humphries, C., Liebenthal, E., and Binder, J. R. (2010). Tonotopic organization of human auditory cortex. Neuroimage 50, 1202–1211. doi: 10.1016/j.neuroimage.2010.01.046
Hutton, C., Bork, A., Josephs, O., Deichmann, R., Ashburner, J., and Turner, R. (2002). Image distortion correction in fMRI: a quantitative evaluation. Neuroimage 16, 217–240. doi: 10.1006/nimg.2001.1054
Imai, T., Sakano, H., and Vosshall, L. B. (2010). Topographic mapping--the olfactory system. Cold Spring Harb. Perspect. Biol. 2:a001776. doi: 10.1101/cshperspect.a001776
Imamura, F., Ito, A., and LaFever, B. J. (2020). Subpopulations of projection neurons in the olfactory bulb. Front Neural Circuits 14:561822. doi: 10.3389/fncir.2020.561822
Iwamura, Y., Tanaka, M., Sakamoto, M., and Hikosaka, O. (1993). Rostrocaudal gradients in the neuronal receptive field complexity in the finger region of the alert monkey's postcentral gyrus. Exp. Brain Res. 92, 360–368. doi: 10.1007/BF00229023
Jain, N., Qi, H. X., Collins, C. E., and Kaas, J. H. (2008). Large-scale reorganization in the somatosensory cortex and thalamus after sensory loss in macaque monkeys. J. Neurosci. 28, 11042–11060. doi: 10.1523/JNEUROSCI.2334-08.2008
Joly, O., Baumann, S., Balezeau, F., Thiele, A., and Griffiths, T. D. (2014). Merging functional and structural properties of the monkey auditory cortex. Front. Neurosci. 8:198. doi: 10.3389/fnins.2014.00198
Kaas, J. H. (1991). Plasticity of sensory and motor maps in adult mammals. Annu. Rev. Neurosci. 14, 137–167. doi: 10.1146/annurev.ne.14.030191.001033
Kaas, J. H. (1997). Topographic maps are fundamental to sensory processing. Brain Res. Bull. 44, 107–112. doi: 10.1016/S0361-9230(97)00094-4
Kaas, J. H. (2012). “Somatosensory system” in The human nervous system. eds. J. K. Mai and G. Paxinos. 3rd ed (Cambridge, Massachusetts: Academic Press), 1074–1109.
Kaas, J. H., and Hackett, T. A. (2000). Subdivisions of auditory cortex and processing streams in primates. Proc. Natl. Acad. Sci. U. S. A. 97, 11793–11799. doi: 10.1073/pnas.97.22.11793
Kaas, J. H., Merzenich, M. M., and Killackey, H. P. (1983). The reorganization of somatosensory cortex following peripheral nerve damage in adult and developing mammals. Annu. Rev. Neurosci. 6, 325–356. doi: 10.1146/annurev.ne.06.030183.001545
Kaas, J. H., Qi, H. X., and Iyengar, S. (2006). Cortical network for representing the teeth and tongue in primates. Anat. Rec. A Discov. Mol. Cell. Evol. Biol. 288A, 182–190. doi: 10.1002/ar.a.20267
Kajikawa, Y., Frey, S., Ross, D., Falchier, A., Hackett, T. A., and Schroeder, C. E. (2015). Auditory properties in the parabelt regions of the superior temporal gyrus in the awake macaque monkey: an initial survey. J. Neurosci. 35, 4140–4150. doi: 10.1523/JNEUROSCI.3556-14.2015
Kastner, S., DeSimone, K., Konen, C. S., Szczepanski, S. M., Weiner, K. S., and Schneider, K. A. (2007). Topographic maps in human frontal cortex revealed in memory-guided saccade and spatial working-memory tasks. J. Neurophysiol. 97, 3494–3507. doi: 10.1152/jn.00010.2007
Kato, H. K., Chu, M. W., Isaacson, J. S., and Komiyama, T. (2012). Dynamic sensory representations in the olfactory bulb: modulation by wakefulness and experience. Neuron 76, 962–975. doi: 10.1016/j.neuron.2012.09.037
Keller, A., Zhuang, H., Chi, Q., Vosshall, L. B., and Matsunami, H. (2007). Genetic variation in a human odorant receptor alters odour perception. Nature 449, 468–472. doi: 10.1038/nature06162
Kmita, M., and Duboule, D. (2003). Organizing axes in time and space; 25 years of colinear tinkering. Science 301, 331–333. doi: 10.1126/science.1085753
Kolasinski, J., Makin, T. R., Jbabdi, S., Clare, S., Stagg, C. J., and Johansen-Berg, H. (2016a). Investigating the stability of Fine-grain digit Somatotopy in individual human participants. J. Neurosci. 36, 1113–1127. doi: 10.1523/JNEUROSCI.1742-15.2016
Kolasinski, J., Makin, T. R., Logan, J. P., Jbabdi, S., Clare, S., Stagg, C. J., et al. (2016b). Perceptually relevant remapping of human somatotopy in 24 hours. Elife 5:17280. doi: 10.7554/eLife.17280
Kolster, H., Mandeville, J. B., Arsenault, J. T., Ekstrom, L. B., Wald, L. L., and Vanduffel, W. (2009). Visual field map clusters in macaque extrastriate visual cortex. J. Neurosci. 29, 7031–7039. doi: 10.1523/JNEUROSCI.0518-09.2009
Kolster, H., Peeters, R., and Orban, G. A. (2010). The retinotopic organization of the human middle temporal area MT/V5 and its cortical neighbors. J. Neurosci. 30, 9801–9820. doi: 10.1523/JNEUROSCI.2069-10.2010
Konen, C. S., and Kastner, S. (2008). Representation of eye movements and stimulus motion in topographically organized areas of human posterior parietal cortex. J. Neurosci. 28, 8361–8375. doi: 10.1523/JNEUROSCI.1930-08.2008
Koulakov, A. A., and Chklovskii, D. B. (2001). Orientation preference patterns in mammalian visual cortex: a wire length minimization approach. Neuron 29, 519–527. doi: 10.1016/S0896-6273(01)00223-9
Krubitzer, L. (2007). The magnificent compromise: cortical field evolution in mammals. Neuron 56, 201–208. doi: 10.1016/j.neuron.2007.10.002
Krubitzer, L., Clarey, J., Tweedale, R., Elston, G., and Calford, M. (1995a). A redefinition of somatosensory areas in the lateral sulcus of macaque monkeys. J. Neurosci. 15, 3821–3839. doi: 10.1523/JNEUROSCI.15-05-03821.1995
Krubitzer, L., Manger, P., Pettigrew, J., and Calford, M. (1995b). Organization of somatosensory cortex in monotremes: in search of the prototypical plan. J. Comp. Neurol. 351, 261–306. doi: 10.1002/cne.903510206
Krubitzer, L. A., and Seelke, A. M. (2012). Cortical evolution in mammals: the bane and beauty of phenotypic variability. Proc. Natl. Acad. Sci. 109, 10647–10654. doi: 10.1073/pnas.1201891109
Kurth, R., Villringer, K., Curio, G., Wolf, K. J., Krause, T., Repenthin, J., et al. (2000). fMRI shows multiple somatotopic digit representations in human primary somatosensory cortex. Neuroreport 11, 1487–1491. doi: 10.1097/00001756-200005150-00026
Kusmierek, P., and Rauschecker, J. P. (2009). Functional specialization of medial auditory belt cortex in the alert rhesus monkey. J. Neurophysiol. 102, 1606–1622. doi: 10.1152/jn.00167.2009
Lage-Castellanos, A., De Martino, F., Ghose, G. M., Gulban, O. F., and Moerel, M. (2023). Selective attention sharpens population receptive fields in human auditory cortex. Cereb. Cortex 33, 5395–5408. doi: 10.1093/cercor/bhac427
Landi, S. M., Viswanathan, P., Serene, S., and Freiwald, W. A. (2021). A fast link between face perception and memory in the temporal pole. Science 373, 581–585. doi: 10.1126/science.abi6671
Langner, G. (1992). Periodicity coding in the auditory system. Hear. Res. 60, 115–142. doi: 10.1016/0378-5955(92)90015-F
Langner, G., Albert, M., and Briede, T. (2002). Temporal and spatial coding of periodicity information in the inferior colliculus of awake chinchilla (Chinchilla laniger). Hear. Res. 168, 110–130. doi: 10.1016/S0378-5955(02)00367-2
Langner, G., Sams, M., Heil, P., and Schulze, H. (1997). Frequency and periodicity are represented in orthogonal maps in the human auditory cortex: evidence from magnetoencephalography. J. Comp. Physiol. A 181, 665–676. doi: 10.1007/s003590050148
Lappin, T. R., Grier, D. G., Thompson, A., and Halliday, H. L. (2006). HOX genes: seductive science, mysterious mechanisms. Ulster Med. J. 75, 23–31.
Larsson, J., and Heeger, D. J. (2006). Two retinotopic visual areas in human lateral occipital cortex. J. Neurosci. 26, 13128–13142. doi: 10.1523/JNEUROSCI.1657-06.2006
Larsson, J., Heeger, D. J., and Landy, M. S. (2010). Orientation selectivity of motion-boundary responses in human visual cortex. J. Neurophysiol. 104, 2940–2950. doi: 10.1152/jn.00400.2010
Larsson, J., Landy, M. S., and Heeger, D. J. (2006). Orientation-selective adaptation to first- and second-order patterns in human visual cortex. J. Neurophysiol. 95, 862–881. doi: 10.1152/jn.00668.2005
Lauritzen, T. Z., D'Esposito, M., Heeger, D. J., and Silver, M. A. (2009). Top-down flow of visual spatial attention signals from parietal to occipital cortex. J. Vis. 9:18.1. doi: 10.1167/9.13.18
Lehky, S. R., and Sereno, A. B. (2007). Comparison of shape encoding in primate dorsal and ventral visual pathways. J. Neurophysiol. 97, 307–319. doi: 10.1152/jn.00168.2006
Lehky, S. R., and Sereno, A. B. (2011). Population coding of visual space: modeling. Front. Comput. Neurosci. 4:155. doi: 10.3389/fncom.2010.00155
Lehky, S. R., Sereno, M. E., and Sereno, A. B. (2015). Characteristics of eye-position gain field populations determine geometry of visual space. Front. Integr. Neurosci. 9:72. doi: 10.3389/fnint.2015.00072
Leonard, C. M., Puranik, C., Kuldau, J. M., and Lombardino, L. J. (1998). Normal variation in the frequency and location of human auditory cortex landmarks. Heschl's gyrus: where is it? Cereb. Cortex 8, 397–406. doi: 10.1093/cercor/8.5.397
Livingstone, M. S., and Hubel, D. H. (1984). Specificity of intrinsic connections in primate primary visual cortex. J. Neurosci. 4, 2830–2835. doi: 10.1523/JNEUROSCI.04-11-02830.1984
Lodovichi, C. (2020). Role of axonal odorant receptors in olfactory topography. Neurosci Insights 15:2633105520923411. doi: 10.1177/2633105520923411
Logothetis, N. K. (2002). The neural basis of the blood-oxygen-level-dependent functional magnetic resonance imaging signal. Philos. Trans. R. Soc. Lond. B Biol. Sci. 357, 1003–1037. doi: 10.1098/rstb.2002.1114
Logothetis, N. K., and Wandell, B. A. (2004). Interpreting the BOLD signal. Annu. Rev. Physiol. 66, 735–769. doi: 10.1146/annurev.physiol.66.082602.092845
Lumpkin, E. A., and Bautista, D. M. (2005). Feeling the pressure in mammalian somatosensation. Curr. Opin. Neurobiol. 15, 382–388. doi: 10.1016/j.conb.2005.06.005
Lundstrom, J. N., Boesveldt, S., and Albrecht, J. (2011). Central processing of the chemical senses: an overview. ACS Chem. Nerosci. 2, 5–16. doi: 10.1021/cn1000843
Luo, L., and Flanagan, J. G. (2007). Development of continuous and discrete neural maps. Neuron 56, 284–300. doi: 10.1016/j.neuron.2007.10.014
Ma, L., Qiu, Q., Gradwohl, S., Scott, A., Yu, E. Q., Alexander, R., et al. (2012). Distributed representation of chemical features and tunotopic organization of glomeruli in the mouse olfactory bulb. Proc. Natl. Acad. Sci. U. S. A. 109, 5481–5486. doi: 10.1073/pnas.1117491109
Maldjian, J. A., Gottschalk, A., Patel, R. S., Detre, J. A., and Alsop, D. C. (1999). The sensory somatotopic map of the human hand demonstrated at 4 tesla. Neuroimage 10, 55–62. doi: 10.1006/nimg.1999.0448
Maldonado, P. E., Godecke, I., Gray, C. M., and Bonhoeffer, T. (1997). Orientation selectivity in pinwheel centers in cat striate cortex. Science 276, 1551–1555. doi: 10.1126/science.276.5318.1551
Mancini, F., Haggard, P., Iannetti, G. D., Longo, M. R., and Sereno, M. I. (2012). Fine-grained nociceptive maps in primary somatosensory cortex. J. Neurosci. 32, 17155–17162. doi: 10.1523/JNEUROSCI.3059-12.2012
Menashe, I., Man, O., Lancet, D., and Gilad, Y. (2003). Different noses for different people. Nat. Genet. 34, 143–144. doi: 10.1038/ng1160
Merigan, W. H., and Maunsell, J. H. (1993). How parallel are the primate visual pathways? Annu. Rev. Neurosci. 16, 369–402. doi: 10.1146/annurev.ne.16.030193.002101
Merzenich, M. M., and Brugge, J. F. (1973). Representation of the cochlear partition of the superior temporal plane of the macaque monkey. Brain Res. 50, 275–296. doi: 10.1016/0006-8993(73)90731-2
Mikami, A., Newsome, W. T., and Wurtz, R. H. (1986). Motion selectivity in macaque visual cortex. II. Spatiotemporal range of directional interactions in MT and V1. J. Neurophysiol. 55, 1328–1339. doi: 10.1152/jn.1986.55.6.1328
Mishkin, M., and Ungerleider, L. G. (1982). Contribution of striate inputs to the visuospatial functions of parieto-preoccipital cortex in monkeys. Behav. Brain Res. 6, 57–77. doi: 10.1016/0166-4328(82)90081-x
Miskovic, V., and Anderson, A. K. (2018). Modality general and modality specific coding of hedonic valence. Curr. Opin. Behav. Sci. 19, 91–97. doi: 10.1016/j.cobeha.2017.12.012
Mitchison, G. (1991). Neuronal branching patterns and the economy of cortical wiring. Proc. Biol. Sci. 245, 151–158. doi: 10.1098/rspb.1991.0102
Miyamichi, K., Serizawa, S., Kimura, H. M., and Sakano, H. (2005). Continuous and overlapping expression domains of odorant receptor genes in the olfactory epithelium determine the dorsal/ventral positioning of glomeruli in the olfactory bulb. J. Neurosci. 25, 3586–3592. doi: 10.1523/JNEUROSCI.0324-05.2005
Moerel, M., De Martino, F., and Formisano, E. (2012). Processing of natural sounds in human auditory cortex: tonotopy, spectral tuning, and relation to voice sensitivity. J. Neurosci. 32, 14205–14216. doi: 10.1523/JNEUROSCI.1388-12.2012
Mombaerts, P., Wang, F., Dulac, C., Chao, S. K., Nemes, A., Mendelsohn, M., et al. (1996). Visualizing an olfactory sensory map. Cell 87, 675–686. doi: 10.1016/s0092-8674(00)81387-2
Montaser-Kouhsari, L., Landy, M. S., Heeger, D. J., and Larsson, J. (2007). Orientation-selective adaptation to illusory contours in human visual cortex. J. Neurosci. 27, 2186–2195. doi: 10.1523/JNEUROSCI.4173-06.2007
Moradi, F., and Heeger, D. J. (2009). Inter-ocular contrast normalization in human visual cortex. J. Vis. 9:13.1. doi: 10.1167/9.3.13
Morgan, C., and Schwarzkopf, D. S. (2019). Comparison of human population receptive field estimates between scanners and the effect of temporal filtering. F1000Res 8:1681. doi: 10.12688/f1000research.20496.2
Mori, K., Nagao, H., and Yoshihara, Y. (1999). The olfactory bulb: coding and processing of odor molecule information. Science 286, 711–715. doi: 10.1126/science.286.5440.711
Morosan, P., Rademacher, J., Schleicher, A., Amunts, K., Schormann, T., and Zilles, K. (2001). Human primary auditory cortex: cytoarchitectonic subdivisions and mapping into a spatial reference system. Neuroimage 13, 684–701. doi: 10.1006/nimg.2000.0715
Movshon, J. A., and Newsome, W. T. (1996). Visual response properties of striate cortical neurons projecting to area MT in macaque monkeys. J. Neurosci. 16, 7733–7741. doi: 10.1523/JNEUROSCI.16-23-07733.1996
Mueller, K. L., Hoon, M. A., Erlenbach, I., Chandrashekar, J., Zuker, C. S., and Ryba, N. J. (2005). The receptors and coding logic for bitter taste. Nature 434, 225–229. doi: 10.1038/nature03352
Murthy, V. N. (2011). Olfactory maps in the brain. Annu. Rev. Neurosci. 34, 233–258. doi: 10.1146/annurev-neuro-061010-113738
Nakamura, H., Gattass, R., Desimone, R., and Ungerleider, L. G. (1993). The modular organization of projections from areas V1 and V2 to areas V4 and TEO in macaques. J. Neurosci. 13, 3681–3691. doi: 10.1523/JNEUROSCI.13-09-03681.1993
Nestares, O., and Heeger, D. J. (2000). Robust multiresolution alignment of MRI brain volumes. Magn. Reson. Med. 43, 705–715. doi: 10.1002/(SICI)1522-2594(200005)43:5<705::AID-MRM13>3.0.CO;2-R
Newsome, W. T., Mikami, A., and Wurtz, R. H. (1986). Motion selectivity in macaque visual cortex. III. Psychophysics and physiology of apparent motion. J. Neurophysiol. 55, 1340–1351. doi: 10.1152/jn.1986.55.6.1340
Ohki, K., Chung, S., Kara, P., Hubener, M., Bonhoeffer, T., and Reid, R. C. (2006). Highly ordered arrangement of single neurons in orientation pinwheels. Nature 442, 925–928. doi: 10.1038/nature05019
Olofsson, J. K., and Freiherr, J. (2019). Neuroimaging of smell and taste. Handb. Clin. Neurol. 164, 263–282. doi: 10.1016/B978-0-444-63855-7.00017-4
Pandya, D. N., and Sanides, F. (1973). Architectonic parcellation of the temporal operculum in rhesus monkey and its projection pattern. Z. Anat. Entwicklungsgesch. 139, 127–161. doi: 10.1007/BF00523634
Peelle, J. E. (2014). Methodological challenges and solutions in auditory functional magnetic resonance imaging. Front. Neurosci. 8:253. doi: 10.3389/fnins.2014.00253
Penfield, W., and Boldrey, E. (1937). Somatic motor and sensory representation in the cerebral cortex of man as studied by electrical stimulation. Brain 60, 389–443. doi: 10.1093/brain/60.4.389
Peng, Y., Gillis-Smith, S., Jin, H., Trankner, D., Ryba, N. J., and Zuker, C. S. (2015). Sweet and bitter taste in the brain of awake behaving animals. Nature 527, 512–515. doi: 10.1038/nature15763
Petkov, C. I., Kayser, C., Augath, M., and Logothetis, N. K. (2009). Optimizing the imaging of the monkey auditory cortex: sparse vs. continuous fMRI. Magn. Reson. Imaging 27, 1065–1073. doi: 10.1016/j.mri.2009.01.018
Pitzalis, S., Bozzacchi, C., Bultrini, A., Fattori, P., Galletti, C., and Di Russo, F. (2013). Parallel motion signals to the medial and lateral motion areas V6 and MT+. Neuroimage 67, 89–100. doi: 10.1016/j.neuroimage.2012.11.022
Pitzalis, S., Fattori, P., and Galletti, C. (2015). The human cortical areas V6 and V6A. Vis. Neurosci. 32:E007. doi: 10.1017/S0952523815000048
Pitzalis, S., Galletti, C., Huang, R. S., Patria, F., Committeri, G., Galati, G., et al. (2006). Wide-field retinotopy defines human cortical visual area v6. J. Neurosci. 26, 7962–7973. doi: 10.1523/JNEUROSCI.0178-06.2006
Pitzalis, S., Sereno, M. I., Committeri, G., Fattori, P., Galati, G., Patria, F., et al. (2010). Human v6: the medial motion area. Cereb. Cortex 20, 411–424. doi: 10.1093/cercor/bhp112
Pleger, B., and Villringer, A. (2013). The human somatosensory system: from perception to decision making. Prog. Neurobiol. 103, 76–97. doi: 10.1016/j.pneurobio.2012.10.002
Powell, T. P., and Mountcastle, V. B. (1959). Some aspects of the functional organization of the cortex of the postcentral gyrus of the monkey: a correlation of findings obtained in a single unit analysis with cytoarchitecture. Bull. Johns Hopkins Hosp. 105, 133–162.
Press, W. A., Brewer, A. A., Dougherty, R. F., Wade, A. R., and Wandell, B. A. (2001). Visual areas and spatial summation in human visual cortex. Vision Res. 41, 1321–1332. doi: 10.1016/S0042-6989(01)00074-8
Prinster, A., Cantone, E., Verlezza, V., Magliulo, M., Sarnelli, G., Iengo, M., et al. (2017). Cortical representation of different taste modalities on the gustatory cortex: a pilot study. PloS One 12:e0190164. doi: 10.1371/journal.pone.0190164
Qasim, S. E., Fried, I., and Jacobs, J. (2021). Phase precession in the human hippocampus and entorhinal cortex. Cell 184, 3242–3255.e10. doi: 10.1016/j.cell.2021.04.017
Qi, H. X., Kaas, J. H., and Reed, J. L. (2014). The reactivation of somatosensory cortex and behavioral recovery after sensory loss in mature primates. Front. Syst. Neurosci. 8:84. doi: 10.3389/fnsys.2014.00084
Rademacher, J., Caviness, V. S. Jr., Steinmetz, H., and Galaburda, A. M. (1993). Topographical variation of the human primary cortices: implications for neuroimaging, brain mapping, and neurobiology. Cereb. Cortex 3, 313–329. doi: 10.1093/cercor/3.4.313
Rademacher, J., Morosan, P., Schormann, T., Schleicher, A., Werner, C., Freund, H. J., et al. (2001). Probabilistic mapping and volume measurement of human primary auditory cortex. Neuroimage 13, 669–683. doi: 10.1006/nimg.2000.0714
Rauschecker, J. P., and Tian, B. (2004). Processing of band-passed noise in the lateral auditory belt cortex of the rhesus monkey. J. Neurophysiol. 91, 2578–2589. doi: 10.1152/jn.00834.2003
Rauschecker, J. P., Tian, B., and Hauser, M. (1995). Processing of complex sounds in the macaque nonprimary auditory cortex. Science 268, 111–114. doi: 10.1126/science.7701330
Ress, D., and Chandrasekaran, B. (2013). Tonotopic organization in the depth of human inferior colliculus. Front. Hum. Neurosci. 7:586. doi: 10.3389/fnhum.2013.00586
Ressler, K. J., Sullivan, S. L., and Buck, L. B. (1994). Information coding in the olfactory system: evidence for a stereotyped and highly organized epitope map in the olfactory bulb. Cell 79, 1245–1255. doi: 10.1016/0092-8674(94)90015-9
Rolls, E. T. (2006). Brain mechanisms underlying flavour and appetite. Philos. Trans. R. Soc. Lond. B Biol. Sci. 361, 1123–1136. doi: 10.1098/rstb.2006.1852
Rolls, E. T. (2011). Taste, olfactory and food texture reward processing in the brain and obesity. Int. J. Obes. (Lond) 35, 550–561. doi: 10.1038/ijo.2010.155
Roper, S. D. (2013). Taste buds as peripheral chemosensory processors. Semin. Cell Dev. Biol. 24, 71–79. doi: 10.1016/j.semcdb.2012.12.002
Rothschild, G., and Mizrahi, A. (2015). Global order and local disorder in brain maps. Annu. Rev. Neurosci. 38, 247–268. doi: 10.1146/annurev-neuro-071013-014038
Roux, F. E., Djidjeli, I., and Durand, J. B. (2018). Functional architecture of the somatosensory homunculus detected by electrostimulation. J. Physiol. 596, 941–956. doi: 10.1113/JP275243
Ruben, J., Schwiemann, J., Deuchert, M., Meyer, R., Krause, T., Curio, G., et al. (2001). Somatotopic organization of human secondary somatosensory cortex. Cereb. Cortex 11, 463–473. doi: 10.1093/cercor/11.5.463
Saadon-Grosman, N., Arzy, S., and Loewenstein, Y. (2020a). Hierarchical cortical gradients in somatosensory processing. Neuroimage 222:117257. doi: 10.1016/j.neuroimage.2020.117257
Saadon-Grosman, N., Loewenstein, Y., and Arzy, S. (2020b). The 'creatures' of the human cortical somatosensory system. Brain Commun 2:fcaa003. doi: 10.1093/braincomms/fcaa003
Saenz, M., and Fine, I. (2010). Topographic organization of V1 projections through the corpus callosum in humans. Neuroimage 52, 1224–1229. doi: 10.1016/j.neuroimage.2010.05.060
Saenz, M., and Langers, D. R. (2014). Tonotopic mapping of human auditory cortex. Hear. Res. 307, 42–52. doi: 10.1016/j.heares.2013.07.016
Sanchez Panchuelo, R. M., Besle, J., Schluppeck, D., Humberstone, M., and Francis, S. (2018). Somatotopy in the human somatosensory system. Front. Hum. Neurosci. 12:235. doi: 10.3389/fnhum.2018.00235
Sánchez-Panchuelo, R. M., Besle, J., Mougin, O., Gowland, P., Bowtell, R., Schluppeck, D., et al. (2014). Regional structural differences across functionally parcellated Brodmann areas of human primary somatosensory cortex. Neuroimage 93, 221–230. doi: 10.1016/j.neuroimage.2013.03.044
Sanchez-Panchuelo, R. M., Francis, S., Bowtell, R., and Schluppeck, D. (2010). Mapping human somatosensory cortex in individual subjects with 7T functional MRI. J. Neurophysiol. 103, 2544–2556. doi: 10.1152/jn.01017.2009
Saygin, A. P., and Sereno, M. I. (2008). Retinotopy and attention in human occipital, temporal, parietal, and frontal cortex. Cereb. Cortex 18, 2158–2168. doi: 10.1093/cercor/bhm242
Scarff, C. J., Dort, J. C., Eggermont, J. J., and Goodyear, B. G. (2004). The effect of MR scanner noise on auditory cortex activity using fMRI. Hum. Brain Mapp. 22, 341–349. doi: 10.1002/hbm.20043
Schaffer, E. S., Stettler, D. D., Kato, D., Choi, G. B., Axel, R., and Abbott, L. F. (2018). Odor perception on the two sides of the brain: consistency despite randomness. Neuron 98, 736–742.e3. doi: 10.1016/j.neuron.2018.04.004
Schellekens, W., Petridou, N., and Ramsey, N. F. (2018). Detailed somatotopy in primary motor and somatosensory cortex revealed by Gaussian population receptive fields. Neuroimage 179, 337–347. doi: 10.1016/j.neuroimage.2018.06.062
Schira, M. M., Tyler, C. W., Breakspear, M., and Spehar, B. (2009). The foveal confluence in human visual cortex. J. Neurosci. 29, 9050–9058. doi: 10.1523/JNEUROSCI.1760-09.2009
Schira, M. M., Tyler, C. W., Spehar, B., and Breakspear, M. (2010). Modeling magnification and anisotropy in the primate foveal confluence. PLoS Comput. Biol. 6:e1000651. doi: 10.1371/journal.pcbi.1000651
Schluppeck, D., Glimcher, P., and Heeger, D. J. (2005). Topographic organization for delayed saccades in human posterior parietal cortex. J. Neurophysiol. 94, 1372–1384. doi: 10.1152/jn.01290.2004
Schonwiesner, M., von Cramon, D. Y., and Rubsamen, R. (2002). Is it tonotopy after all? Neuroimage 17, 1144–1161. doi: 10.1006/nimg.2002.1250
Schoonover, C. E., Ohashi, S. N., Axel, R., and Fink, A. J. P. (2021). Representational drift in primary olfactory cortex. Nature 594, 541–546. doi: 10.1038/s41586-021-03628-7
Schreiner, C. E., and Langner, G. (1988). Periodicity coding in the inferior colliculus of the cat. II. Topographical organization. J. Neurophysiol. 60, 1823–1840. doi: 10.1152/jn.1988.60.6.1823
Schulze, H., Hess, A., Ohl, F. W., and Scheich, H. (2002). Superposition of horseshoe-like periodicity and linear tonotopic maps in auditory cortex of the Mongolian gerbil. Eur. J. Neurosci. 15, 1077–1084. doi: 10.1046/j.1460-9568.2002.01935.x
Schummers, J., Marino, J., and Sur, M. (2002). Synaptic integration by V1 neurons depends on location within the orientation map. Neuron 36, 969–978. doi: 10.1016/s0896-6273(02)01012-7
Schwarzkopf, D. S., and Rees, G. (2013). Subjective size perception depends on central visual cortical magnification in human v1. PloS One 8:e60550. doi: 10.1371/journal.pone.0060550
Secundo, L., Snitz, K., Weissler, K., Pinchover, L., Shoenfeld, Y., Loewenthal, R., et al. (2015). Individual olfactory perception reveals meaningful nonolfactory genetic information. Proc. Natl. Acad. Sci. U. S. A. 112, 8750–8755. doi: 10.1073/pnas.1424826112
Sereno, M. I., Dale, A. M., Reppas, J. B., Kwong, K. K., Belliveau, J. W., Brady, T. J., et al. (1995). Borders of multiple visual areas in humans revealed by functional magnetic resonance imaging. Science 268, 889–893. doi: 10.1126/science.7754376
Sereno, M. I., and Huang, R. S. (2006). A human parietal face area contains aligned head-centered visual and tactile maps. Nat. Neurosci. 9, 1337–1343. doi: 10.1038/nn1777
Sereno, M. I., Pitzalis, S., and Martinez, A. (2001). Mapping of contralateral space in retinotopic coordinates by a parietal cortical area in humans. Science 294, 1350–1354. doi: 10.1126/science.1063695
Shamma, S. (2001). On the role of space and time in auditory processing. Trends Cogn. Sci. 5, 340–348. doi: 10.1016/S1364-6613(00)01704-6
Shapley, R., Hawken, M., and Xing, D. (2007). The dynamics of visual responses in the primary visual cortex. Prog. Brain Res. 165, 21–32. doi: 10.1016/S0079-6123(06)65003-6
Silver, M. A., and Kastner, S. (2009). Topographic maps in human frontal and parietal cortex. Trends Cogn. Sci. 13, 488–495. doi: 10.1016/j.tics.2009.08.005
Silver, M. A., Ress, D., and Heeger, D. J. (2005). Topographic maps of visual spatial attention in human parietal cortex. J. Neurophysiol. 94, 1358–1371. doi: 10.1152/jn.01316.2004
Smirnakis, S. M., Brewer, A. A., Schmid, M. C., Tolias, A. S., Schüz, A., Augath, M., et al. (2005). Lack of long-term cortical reorganization after macaque retinal lesions. Nature 435, 300–307. doi: 10.1038/nature03495
Smith, A. T., Greenlee, M. W., Singh, K. D., Kraemer, F. M., and Hennig, J. (1998). The processing of first- and second-order motion in human visual cortex assessed by functional magnetic resonance imaging (fMRI). J. Neurosci. 18, 3816–3830. doi: 10.1523/JNEUROSCI.18-10-03816.1998
Snyder, P. J., and Whitaker, H. A. (2013). Neurologic heuristics and artistic whimsy: the cerebral cartography of Wilder Penfield. J. Hist. Neurosci. 22, 277–291. doi: 10.1080/0964704X.2012.757965
Sosulski, D. L., Bloom, M. L., Cutforth, T., Axel, R., and Datta, S. R. (2011). Distinct representations of olfactory information in different cortical centres. Nature 472, 213–216. doi: 10.1038/nature09868
Stevenson, R. J., Miller, L. A., and McGrillen, K. (2013). The lateralization of gustatory function and the flow of information from tongue to cortex. Neuropsychologia 51, 1408–1416. doi: 10.1016/j.neuropsychologia.2013.04.010
Sugita, M., and Shiba, Y. (2005). Genetic tracing shows segregation of taste neuronal circuitries for bitter and sweet. Science 309, 781–785. doi: 10.1126/science.1110787
Sweet, R. A., Dorph-Petersen, K. A., and Lewis, D. A. (2005). Mapping auditory core, lateral belt, and parabelt cortices in the human superior temporal gyrus. J. Comp. Neurol. 491, 270–289. doi: 10.1002/cne.20702
Swisher, J. D., Halko, M. A., Merabet, L. B., McMains, S. A., and Somers, D. C. (2007). Visual topography of human intraparietal sulcus. J. Neurosci. 27, 5326–5337. doi: 10.1523/JNEUROSCI.0991-07.2007
Szczepanski, S. M., Konen, C. S., and Kastner, S. (2010). Mechanisms of spatial attention control in frontal and parietal cortex. J. Neurosci. 30, 148–160. doi: 10.1523/JNEUROSCI.3862-09.2010
Talairach, J., and Tournoux, P. (1988). Co-planar stereotaxic atlas of the human brain. New York: Thieme Medical Publishers.
Talavage, T. M., Gonzalez-Castillo, J., and Scott, S. K. (2014). Auditory neuroimaging with fMRI and PET. Hear. Res. 307, 4–15. doi: 10.1016/j.heares.2013.09.009
Talavage, T. M., Sereno, M. I., Melcher, J. R., Ledden, P. J., Rosen, B. R., and Dale, A. M. (2004). Tonotopic organization in human auditory cortex revealed by progressions of frequency sensitivity. J. Neurophysiol. 91, 1282–1296. doi: 10.1152/jn.01125.2002
Tanaka, K., Saito, H., Fukada, Y., and Moriya, M. (1991). Coding visual images of objects in the inferotemporal cortex of the macaque monkey. J. Neurophysiol. 66, 170–189. doi: 10.1152/jn.1991.66.1.170
Thomas, J. M., Huber, E., Stecker, G. C., Boynton, G. M., Saenz, M., and Fine, I. (2015). Population receptive field estimates of human auditory cortex. Neuroimage 105, 428–439. doi: 10.1016/j.neuroimage.2014.10.060
Tian, B., and Rauschecker, J. P. (2004). Processing of frequency-modulated sounds in the lateral auditory belt cortex of the rhesus monkey. J. Neurophysiol. 92, 2993–3013. doi: 10.1152/jn.00472.2003
Tootell, R. B., Mendola, J. D., Hadjikhani, N. K., Ledden, P. J., Liu, A. K., Reppas, J. B., et al. (1997). Functional analysis of V3A and related areas in human visual cortex. J. Neurosci. 17, 7060–7078. doi: 10.1523/JNEUROSCI.17-18-07060.1997
Tyler, C. W., and Wade, A. R. (2005). Extended concepts of occipital retinotopy. Curr Med Imaging Rev 1, 319–329. doi: 10.2174/157340505774574772
Van Essen, D. C. (2003). “Organization of Visual Areas in macaque and human cerebral cortex” in The visual neurosciences. eds. L. M. Chalupa and J. S. Werner (Boston: Bradford Books), 507–521.
Vassar, R., Chao, S. K., Sitcheran, R., Nunez, J. M., Vosshall, L. B., and Axel, R. (1994). Topographic organization of sensory projections to the olfactory bulb. Cell 79, 981–991. doi: 10.1016/0092-8674(94)90029-9
Vassar, R., Ngai, J., and Axel, R. (1993). Spatial segregation of odorant receptor expression in the mammalian olfactory epithelium. Cell 74, 309–318. doi: 10.1016/0092-8674(93)90422-m
Wade, A. R., Brewer, A. A., Rieger, J. W., and Wandell, B. A. (2002). Functional measurements of human ventral occipital cortex: retinotopy and colour. Philos. Trans. R. Soc. Lond. B Biol. Sci. 357, 963–973. doi: 10.1098/rstb.2002.1108
Wandell, B. A., Brewer, A. A., and Dougherty, R. F. (2005). Visual field map clusters in human cortex. Philos. Trans. R. Soc. Lond. B Biol. Sci. 360, 693–707. doi: 10.1098/rstb.2005.1628
Wandell, B. A., Dumoulin, S. O., and Brewer, A. A. (2007). Visual field maps in human cortex. Neuron 56, 366–383. doi: 10.1016/j.neuron.2007.10.012
Wandell, B. A., and Smirnakis, S. M. (2009). Plasticity and stability of visual field maps in adult primary visual cortex. Nat. Rev. Neurosci. 10, 873–884. doi: 10.1038/nrn2741
Wessinger, C. M., VanMeter, J., Tian, B., Van Lare, J., Pekar, J., and Rauschecker, J. P. (2001). Hierarchical organization of the human auditory cortex revealed by functional magnetic resonance imaging. J. Cogn. Neurosci. 13, 1–7. doi: 10.1162/089892901564108
Willoughby, W. R., Thoenes, K., and Bolding, M. (2020). Somatotopic arrangement of the human primary somatosensory cortex derived from functional magnetic resonance imaging. Front. Neurosci. 14:598482. doi: 10.3389/fnins.2020.598482
Winawer, J., Horiguchi, H., Sayres, R. A., Amano, K., and Wandell, B. A. (2010). Mapping hV4 and ventral occipital cortex: the venous eclipse. J. Vis. 10:1. doi: 10.1167/10.5.1
Woolsey, C. N., Erickson, T. C., and Gilson, W. E. (1979). Localization in somatic sensory and motor areas of human cerebral cortex as determined by direct recording of evoked potentials and electrical stimulation. J. Neurosurg. 51, 476–506. doi: 10.3171/jns.1979.51.4.0476
Yeshurun, Y., and Sobel, N. (2010a). Multisensory integration: an inner tongue puts an outer nose in context. Nat. Neurosci. 13, 148–149. doi: 10.1038/nn0210-148
Yeshurun, Y., and Sobel, N. (2010b). An odor is not worth a thousand words: from multidimensional odors to unidimensional odor objects. Annu. Rev. Psychol. 61, 219–241. doi: 10.1146/annurev.psych.60.110707.163639
Young, J. P., Herath, P., Eickhoff, S., Choi, J., Grefkes, C., Zilles, K., et al. (2004). Somatotopy and attentional modulation of the human parietal and opercular regions. J. Neurosci. 24, 5391–5399. doi: 10.1523/JNEUROSCI.4030-03.2004
Yu, T., Cai, L. Y., Torrisi, S., Vu, A. T., Morgan, V. L., Goodale, S. E., et al. (2023). Distortion correction of functional MRI without reverse phase encoding scans or field maps. Magn. Reson. Imaging 103, 18–27. doi: 10.1016/j.mri.2023.06.016
Yushu Chen, X. C., Baserdem, B., Zhan, H., Li, Y., Davis, M. B., Kebschull, J. M., et al. (2021). Wiring logic of the early rodent olfactory system revealed by high-throughput sequencing of single neuron projections. BioRxiv [Preprint].
Zeki, S. (2003). Improbable areas in the visual brain. Trends Neurosci. 26, 23–26. doi: 10.1016/S0166-2236(02)00008-5
Zeki, S., and Bartels, A. (1999). Toward a theory of visual consciousness. Conscious. Cogn. 8, 225–259. doi: 10.1006/ccog.1999.0390
Keywords: visual field map, auditory field map, cloverleaf cluster, retinotopy, tonotopy, periodotopy, somatotopy, gustatory
Citation: Brewer AA and Barton B (2023) Cortical field maps across human sensory cortex. Front. Comput. Neurosci. 17:1232005. doi: 10.3389/fncom.2023.1232005
Edited by:
Lidia Alonso-Nanclares, Spanish National Research Council (CSIC), Spain
Reviewed by:
Reza Farivar-Mohseni, McGill University, Canada
Felix Blankenburg, Free University of Berlin, Germany
Copyright © 2023 Brewer and Barton. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
*Correspondence: Alyssa A. Brewer, aabrewer@uci.edu