PERSPECTIVE article

Front. Syst. Neurosci., 13 August 2020

Renewed Perspectives on the Deep Roots and Broad Distribution of Animal Consciousness

Louis N. Irwin*

  • University of Texas at El Paso, El Paso, TX, United States

The vast majority of neurobiologists have long abandoned the Cartesian view of non-human animals as unconscious automatons—acknowledging instead the high likelihood that mammals and birds have mental experiences akin to subjective consciousness. Several lines of evidence are now extending that attribution to all vertebrates and even some invertebrates, though graded in degree, as Darwin originally argued, in correlation with the complexity of the animal’s brain. A principal argument for this view is that the function of consciousness is to promote the survival of an animal—especially one actively moving about—in the face of dynamic changes and real-time contingencies. Cognitive ecologists point to the unique features of each animal’s environment and the specific behavioral capabilities that different environments invoke, thereby suggesting that consciousness must take on a great variety of forms, many of which differ substantially from human subjective experience.

Introduction

Ever since Darwin, students of animal behavior and the nervous system have generally regarded consciousness as a product of the brain, subject to the influence of natural selection (Richards, 1987). Just as the complexity of brains varies across the full range of animal taxa, Darwin (1871) and his contemporary, Romanes (1883), argued that cognition has emerged in grades of complexity over the evolutionary history of animals.

Notwithstanding this early biological approach to mental phenomena, the scientific study of consciousness entered into a hiatus during the first half of the 20th century, accentuated by the rise of genetics (Richards, 1987), the turn toward behaviorism in psychology (Griffin, 1984), and some difficult philosophical problems (Nagel, 1974; Chalmers, 1995; Dennett, 2017). In recent years, however, neuroscientists have started applying their increasingly sophisticated techniques to the study of mental activity, and neurobiologists have resumed consideration of what ecology, ethology, and evolutionary history have to suggest about the original contention of Darwin and his contemporaries that consciousness is a function of the brain molded by natural selection (Churchland, 2007; Engel, 2010).

Perhaps because consciousness is so often thought of as a thing instead of a process (Rose, 1976), it is most often expressed as a binary possibility—either it exists or it does not (Chaisson, 1987; Humphrey, 1992; LeDoux, 2019). And among those animals capable of it, consideration is seldom given to varying degrees or alternative modes of consciousness. The goal of this article is to briefly review the perspectives from cognitive ecology and comparative neurobiology that suggest a return to the view of Darwin and his contemporaries is warranted, and to point out that if consciousness has indeed been forged through natural selection to be suited uniquely to the environment each animal inhabits, logic suggests that it must be multimodal, occurring in highly variable forms largely remote from human experience.

Definition

Of the modern attempts to define consciousness, I prefer that of John (2003) who described it as “the subjective awareness of momentary experience interpreted in the context of personal memory and the present state.” The terms “awareness” and “experience” are themselves difficult to define without circular references, and “subjective” necessarily invokes phenomenological (personal) experience resistant to objectivization, but all of them can plausibly be attributed to at least some animals with nervous systems of sufficient complexity.

Four features of consciousness deemed irreducible by Feinberg (2012) serve as useful elaborations of John’s definition. He posited that consciousness is: (1) referential, or experienced as occurring outside the head; (2) unified, or perceived as coherent scenes, sensations, events, or emotions; (3) qualitatively variable, consisting of sensory gradations and variations within modalities (as in colors, sounds, intensities, etc.); and (4) causational, in its ability to trigger subsequent mental activity and affect behavior.

To summarize by integrating John’s definition with Feinberg’s features, consciousness is the personal awareness of unified and qualitatively textured current or recalled experience, perceived as existing in the animal’s external or bodily environment, with the capacity to induce further mental activity and/or behavioral action.

Nature and Function

Any argument about the origin and varieties of consciousness must begin with a consideration of its nature and function. From clinical observations and personal experience, human consciousness appears in gradations from marginal awareness to full and focused attention. It should be noted, however, that attention and consciousness are not the same thing. Stimuli can be attended to unconsciously, and subjects can be conscious of experiences that they are not attending to (Koch and Tsuchiya, 2007).

In the literature of comparative animal psychology, references to roughly three degrees of consciousness are common. The first degree arises from the detection of physical stimuli at the body’s periphery or interior, capable of eliciting an adaptive reflex. This is referred to as “primary” (Edelman, 2003), or “sensory” (Feinberg and Mallatt, 2016), consciousness. It requires some level of arousal or vigilance to make the animal receptive to stimulation and capable of motor activity but does not necessarily imply rich, textured, or complex content. Some authors consider awareness of feelings or affect as distinguishable from but comparable in degree to sensory consciousness (Feinberg and Mallatt, 2016). The second degree of consciousness is self-awareness (Churchland, 2002; Lou et al., 2020), which Churchland (2013) argued is just another form of perception. The highest degree of consciousness is meta-cognition or the conscious knowledge that individuals have about their cognitive capacities (Smith et al., 2003; Al Banna et al., 2016). The second and third levels of consciousness do imply increasingly complex content. As used in this article, consciousness refers to its primary form (both sensory and affective), unless stated otherwise.

A common approach to discerning the function of consciousness is to focus on the key features of subjective experience: its stabilizing properties and qualitative richness, dynamic integration, situatedness, and intentionality (Pennartz et al., 2019). James (1884), for instance, held that consciousness serves to direct attention and dampen chaotic cerebral activity. He viewed it as a means of focusing on one of several simultaneously possible objects or trains of thought (James, 1890). As he noted, “My experience is what I agree to attend to” (James, 1884). A modern version of the same view was expressed by Edelman (2003): “the neural systems underlying consciousness arose to enable high-order discriminations in a multidimensional space of signals,” with qualitative differences in content (qualia) providing a basis for those discriminations.

Many authors have seen the integrative nature of consciousness as central to its function (Edelman, 2003; Tononi, 2012; Miyahara and Witkowski, 2019). In some cases, the integration is directed toward managing the multiplicity of incoming sensory information. In others, it is seen as essential for accessing memory stores with which to evaluate the context and significance of incoming information. Edelman saw consciousness as the integration of perceptual and motor events together with memory to construct a multimodal scene in the present (Edelman, 2003). For Pennartz et al. (2019), the biological function of consciousness is to present the subject with a multimodal, situational survey of the surrounding world and body, subserving complex decision-making and goal-directed behavior.

Consciousness is particularly important to a mobile animal, which must evaluate real-time changes in its situation in order to behave appropriately in its environment (Merker, 2005). Indeed, the essence of consciousness for some authors revolves around its role in evaluating appropriate actions to take in given circumstances, anticipating the consequences of those actions, and updating the animal’s position and orientation as it moves through space (Churchland, 2002; Engel, 2010; Clark, 2016; Buzsáki, 2019). Griffin (1984) asserted that an animal is conscious if it is aware of what it is doing or intending to do. As an appreciation for the central role of place in the cognitive landscape has grown (Irwin and Irwin, 2020), so has the realization that movement through its milieu is largely how an animal mentally creates the dimensions of its environment (Merleau-Ponty, 1945; Sheets-Johnstone, 1999).

Every hypothesis about the function of consciousness has at its core the view that it enables the animal to make optimal use of the information available to it, which has obvious survival value. This is not to assert that consciousness is necessary for every beneficial activity: adaptive behaviors, from the tropisms of unicellular microbes to complex instinctual behaviors in vertebrates, can occur without conscious reflection. But the ability to maximize the utility of information, integrate it with memory, and focus on the elements most critical for survival, either in the moment or over the longer term, is a biological capability susceptible to favorable natural selection.

Cognitive Ecology

Jakob von Uexküll (1926) was an early proponent of the argument that organisms experience life in terms of species-specific, spatio-temporal, “self-in-world” frames of reference. An organism creates and continually reshapes its own perceptual world; consequently, the minds of different organisms differ, a conclusion that follows from the individuality and uniqueness of each organism’s history. Gibson (1977) coined the term affordance to denote what the environment uniquely makes possible for any given species. Many observers have noted the need for an animal’s perception and cognition to match the affordances of its environment. Gallagher (1997) argued that “Consciousness and the brain develop and function within a form of existence that is already defined by the world it inhabits.” Anderson (2014) observed that neural organization should be understood in terms of “the brain’s differential propensities to influence the organism’s response to the various features or affordances in its environment.”

There is strong evidence that brains have evolved to respond to environmental pressures (Marino, 2005). Various measures of brain size are positively correlated with: (1) feeding innovation, learning, and tool use; (2) size of behavioral repertoire; (3) social complexity; (4) dietary complexity; and (5) unpredictability of the environment. Ecological principles like the unpredictability of resources in space and time may drive different types of cognition (Lefebvre and Sol, 2008).

An alternative to the influence of environmental variation is that the complexity of the environment in general, rather than the unique features of different environments, determines the nature of consciousness (Shumway, 2008; Sol, 2009; Mettke-Hofmann, 2014). However, brain evolution does not necessarily proceed from simple to complex, small to large, or diffuse to differentiated (Bullock, 1984; Kaas, 2002). Marino (2005) believes that brain evolution relates more to environmental change than to environmental complexity. On the other hand, neural and behavioral complexity are clearly correlated (Parker and McKinney, 1999; Neubauer, 2012), so the role that complex environments play in shaping the nature of consciousness cannot be dismissed. Nonetheless, a symposium on the evolution of neurobiological specializations in mammals concluded that there are no simple brains, and that brains reflect ecology (Marino and Hof, 2005). Ethologists and comparative animal psychologists have echoed this perspective. Hodos (1986), for instance, argued that natural selection optimizes mechanisms of perception maximally appropriate to the ecological demands of each species, rather than to the complexity of information processing in a general sense.

Not everyone has embraced an emphasis on ecological determinants of cognition. Macphail and Bolhuis (2001) claimed to find no convincing support for an ecological account of cognition. In fact, their argument focuses more on mechanisms of learning specifically than on consciousness (Bolhuis, 2015). The fact is that all animals exist in an ecological setting, and to the extent that natural selection responds to ecological mandates (Marino, 2005; Sol et al., 2005), it is reasonable to assume that the nature of an animal’s conscious awareness must be attuned to its environmental setting.

Neurobiology

While evolutionary “gradations” in consciousness have been advocated since the time of Darwin (1871) and Romanes (1883), consciousness has usually been discussed as a bimodal phenomenon, with some evolutionary threshold needing to be crossed for consciousness to appear in an all-or-none fashion (Jaynes, 1976; Humphrey, 1992; LeDoux, 2019). But neuroanatomy, neurochemistry, and adaptive behavior are not discontinuous over evolutionary time scales, so the sudden emergence of consciousness where it did not previously exist is not plausible. Rather, a gradual increase in the resolution and complexity of awareness, coupled with increasing integration of perception with memory as brains became larger and more complex during phylogeny, is logically more persuasive (Rose, 1976; Bullock, 1984; Griffin, 1984; Tononi, 2004; Tononi and Koch, 2015; Mercado, 2008; Koch, 2012; Kiverstein and Rietveld, 2018).

This does not mean that discontinuities in the degree of consciousness do not exist across some phyletic lines. Stark differences in the complexity of consciousness are generally assumed between invertebrates and vertebrates, between birds and mammals and the other vertebrate Classes, and especially between humans and non-human primates. What neurobiological evidence can be advanced in support of any of these assumptions?

Feinberg and Mallatt (2016) have provided the most detailed and sophisticated analysis of the occurrence of consciousness across the animal kingdom. They posit seven indicators of sensory consciousness (a large number of neurons, three or more synaptic relays before the pre-motor center, isomorphic organization, reciprocal interactions, multisensory convergence, a neuroanatomical locus for memory, and a selective attention mechanism), as well as five indicators of affective consciousness (operantly learned response to reward or punishment, the behavioral distinction between good and bad outcomes, frustration behavior, self-delivery of analgesics, and approach to reinforcing drugs). Using these criteria, Feinberg and Mallatt (2016) propose the likelihood that some arthropods (especially among the insects and crustaceans), the cephalopods (octopi, squids, nautiloids), and all vertebrates are conscious. While others—especially those focused on the assessment of meta-cognition in mammals and birds—are more skeptical (Humphrey, 1992; Butler, 2008; LeDoux, 2019), the capacity of all vertebrates to experience primary and affective consciousness has widespread support (Griffin, 1984; Koch, 2012; Feinberg and Mallatt, 2016).

Because the neurobiology of all the vertebrates is known in much greater detail than that of the invertebrates, distinctions among the vertebrates can be discerned. For example, brain mass relative to body mass clusters similarly among bony fishes, amphibians, and reptiles, with birds and mammals together at a higher level, and cartilaginous fishes in between those two groups. Neuron numbers are in the tens of millions for the ectothermic Classes, hundreds of millions in most mammals, and billions in most primates (Herculano-Houzel, 2017). To the extent that neural complexity can be related to degrees of consciousness, the range for both is great across the vertebrates overall.

The waking state, and therefore the capacity for awareness of the environment, has long been recognized as dependent on activation of a diffuse network of neurons emanating from the ancestral brain stem of vertebrates (Moruzzi and Magoun, 1949). This circuitry appears to be necessary for sensory awareness, but does not in itself provide the content of experience; that is assumed to be obtained from perceptions formulated in various regions of the cerebral cortex, which also holds distributed memory (Koch, 2012) and is reciprocally connected to various nuclei in the thalamus.

The neural substrates for complex cognitive functions associated with higher-level consciousness in mammals and birds are based on patterns of circuitry that are substantially less elaborated in reptiles, where some components are lacking altogether, while the major thalamopallial circuits involving sensory relay nuclei are conspicuously absent in amphibians (Butler and Cotterill, 2006). Based on these criteria, the potential for higher-level consciousness in reptiles and amphibians appears to be lower than in birds and mammals.

A plausible blend of the above observations is that all vertebrates have the capacity for primary (sensory and affective) consciousness, but that the qualitative resolution and detail of conscious processes have risen as neocortical cell numbers have increased across different taxa.

While neural complexity may be marginal in insects (~1 million neurons in bees), many display behavior that goes well beyond simple reflexes and conditioned responses (Menzel, 2012; Giurfa, 2013; Chittka and Wilson, 2019). The complexity of the cephalopod brain, with >100 million neurons, exceeds that of frogs (Godfrey-Smith, 2016). Many arthropods and some cephalopods have complex neural regions involved in learning (hence in memory storage)—“mushroom bodies” in insects, and the vertical lobe in octopi and squid. The behavior of arthropods and cephalopods is highly variable, in keeping with the variety of environments the animals occupy. Ecological psychologists argue that behavior—hence the nature of consciousness that is presumed to be needed to make the behavior possible—varies with the environmental constraints on each animal’s survival. Therefore, sensory consciousness may be created by diverse neural architectures, often through convergent evolution (Kaas, 2002; Emery and Clayton, 2004; Emery, 2006; Lisney and Collin, 2006; Lefebvre and Sol, 2008; Feinberg and Mallatt, 2016). It also follows, however, that the nature of that consciousness may vary greatly from that experienced by humans (Nagel, 1974).

Ancestral vertebrates and arthropods were present in the early Cambrian, 510 million years ago (mya). The earliest cephalopods date from about 490 mya. If primary consciousness was present in some members of all three groups, which are very distantly related, the origin of consciousness is ancient and, in all probability, evolved independently (Feinberg and Mallatt, 2016).

Discussion

Modern science avoids subjectivity as much as possible, so the phenomenological, or subjective, nature of personal experience has always been the primary challenge to scientific research on consciousness. To that has been added centuries of introspection that treated consciousness as an entity, but one lacking material substance. Yet other subjective aspects of mental activity, like perception, emotion, learning, and dreaming, have advanced progressively toward neuroscientific illumination. If consciousness were viewed more like those phenomena, as a process, perhaps the challenge to its scientific study would not seem so formidable.

The adaptive view of consciousness, which focuses on the intimate interaction between body, brain, and environment, provides an ecological and evolutionary platform for the study of consciousness that is familiar to all biologists. The brain, and therefore the potential for consciousness, develops and functions in every species within a form of existence that is defined by the world it inhabits (Gallagher, 1997).

This article has argued that consciousness is more widespread than is generally believed, is generated by a diversity of neural substrates, has evolved independently several times, and appears over a range of degrees in a variety of forms (Feinberg and Mallatt, 2016). Technological advances in monitoring and visualizing brain activity in humans will surely illuminate human consciousness in ever-growing detail (Varela, 1996; Crick and Koch, 1998; Seth et al., 2006; Massimini et al., 2009; van Vugt et al., 2018). By superimposing the known electrophysiological correlates of place perception and motor activity (O’Keefe, 1990; Finkelstein et al., 2016; Moser et al., 2017) onto established neural indicators of awareness (Moruzzi and Magoun, 1949; Frith, 2002; Hameroff, 2010), a deeper understanding of the interaction between motor control and consciousness will be achieved (Stocker, 2016).

More attention should now be turned as well to the study of other animals in their natural environments, combining ethological techniques developed over the decades with increasingly sophisticated neuroscience methodologies (Morris, 2005; Gallagher and Zahavi, 2008; Boly et al., 2013; Reiter et al., 2017; Pennartz et al., 2019), to reveal indicators of consciousness in other species. Several other strategic and experimental approaches for assessing comparative cognition (including consciousness) are proposed in Irwin and Irwin (2020).

Author Contributions

The author confirms being the sole contributor of this work and has approved it for publication.

Conflict of Interest

The author declares that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

References

Al Banna, M., Redha, N. A., Abdulla, F., Nair, B., and Donnellan, C. (2016). Metacognitive function poststroke: a review of definition and assessment. J. Neurol. Neurosurg. Psychiatry 87, 161–166. doi: 10.1136/jnnp-2015-310305
Anderson, M. L. (2014). After Phrenology: Neural Reuse and the Interactive Brain. Cambridge, MA: MIT Press.
Bolhuis, J. J. (2015). Evolution cannot explain how minds work. Behav. Processes 117, 82–91. doi: 10.1016/j.beproc.2015.06.008
Boly, M., Seth, A. K., Wilke, M., Ingmundson, P., Baars, B., Laureys, S., et al. (2013). Consciousness in humans and non-human animals: recent advances and future directions. Front. Psychol. 4:625. doi: 10.3389/fpsyg.2013.00625
Bullock, T. H. (1984). The application of scientific evidence to the issues of use of animals in research: the evolutionary dimension in the problem of animal awareness. IBRO News 12, 9–11.
Butler, A. B. (2008). Evolution of brains, cognition, and consciousness. Brain Res. Bull. 75, 442–449. doi: 10.1016/j.brainresbull.2007.10.017
Butler, A. B., and Cotterill, R. M. (2006). Mammalian and avian neuroanatomy and the question of consciousness in birds. Biol. Bull. 211, 106–127. doi: 10.2307/4134586
Buzsáki, G. (2019). The Brain from Inside Out. New York, NY: Oxford University Press.
Chaisson, E. (1987). The Life Era: Cosmic Selection and Conscious Evolution (Authors Guild Backprint.com ed.). Boston: Atlantic Monthly Press.
Chalmers, D. J. (1995). Facing up to the problem of consciousness. J. Consciousness Stud. 2, 200–219.
Chittka, L., and Wilson, C. (2019). Expanding consciousness. Amer. Sci. 107, 364–369. doi: 10.1511/2019.107.6.364
Churchland, P. S. (2002). Self-representation in nervous systems. Science 296, 308–310. doi: 10.1126/science.1070564
Churchland, P. S. (2007). Neurophilosophy: the early years and new directions. Funct. Neurol. 22, 185–195.
Churchland, P. M. (2013). Matter and Consciousness. 3rd Edn. Cambridge, MA: MIT Press.
Clark, A. (2016). Surfing Uncertainty: Prediction, Action, and the Embodied Mind. New York: Oxford University Press.
Crick, F., and Koch, C. (1998). Consciousness and neuroscience. Cereb. Cortex 8, 97–107. doi: 10.1093/cercor/8.2.97
Darwin, C. (1871). The Descent of Man. 2nd Edn. New York: A. L. Burt.
Dennett, D. C. (2017). From Bacteria to Bach and Back. New York, NY: W.W. Norton and Co.
Edelman, G. M. (2003). Naturalizing consciousness: a theoretical framework. Proc. Natl. Acad. Sci. U S A 100, 5520–5524. doi: 10.1073/pnas.0931349100
Emery, N. J. (2006). Cognitive ornithology: the evolution of avian intelligence. Philos. Trans. R. Soc. Lond. B Biol. Sci. 361, 23–43. doi: 10.1098/rstb.2005.1736
Emery, N. J., and Clayton, N. S. (2004). The mentality of crows: convergent evolution of intelligence in corvids and apes. Science 306, 1903–1907. doi: 10.1126/science.1098410
Engel, A. K. (2010). “Directive minds: how dynamics shapes cognition,” in Enaction: Toward a New Paradigm for Cognitive Science, eds J. Stewart, O. Gapenne and E. A. Di Paolo (Cambridge, MA: MIT Press), 219–243.
Feinberg, T. E. (2012). Neuroontology, neurobiological naturalism, and consciousness: a challenge to scientific reduction and a solution. Phys. Life Rev. 9, 13–34. doi: 10.1016/j.plrev.2011.10.019
Feinberg, T. E., and Mallatt, J. M. (2016). The Ancient Origins of Consciousness: How the Brain Created Experience. Cambridge, MA: MIT Press.
Finkelstein, A., Las, L., and Ulanovsky, N. (2016). 3-D maps and compasses in the brain. Annu. Rev. Neurosci. 39, 171–196. doi: 10.1146/annurev-neuro-070815-013831
Frith, C. (2002). Attention to action and awareness of other minds. Conscious. Cogn. 11, 481–487. doi: 10.1016/s1053-8100(02)00022-3
Gallagher, S. (1997). Mutual enlightenment: recent phenomenology in cognitive science. J. Consciousness Stud. 4, 195–214.
Gallagher, S., and Zahavi, D. (2008). The Phenomenological Mind: An Introduction to Philosophy of Mind and Cognitive Science. 1st Edn. New York, NY: Routledge.
Gibson, J. J. (1977). “The theory of affordances,” in Perceiving, Acting and Knowing: Toward an Ecological Psychology, eds R. Shaw and J. Bransford (Hillsdale, NJ: Lawrence Erlbaum Associates), 67–82.
Giurfa, M. (2013). Cognition with few neurons: higher-order learning in insects. Trends Neurosci. 36, 285–294. doi: 10.1016/j.tins.2012.12.011
Godfrey-Smith, P. (2016). Other Minds: The Octopus, the Sea, and the Deep Origins of Consciousness. New York, NY: Farrar, Straus and Giroux.
Griffin, D. R. (1984). Animal Thinking. Cambridge, MA: Harvard University Press.
Hameroff, S. (2010). The “conscious pilot”-dendritic synchrony moves through the brain to mediate consciousness. J. Biol. Phys. 36, 71–93. doi: 10.1007/s10867-009-9148-x
Herculano-Houzel, S. (2017). The Human Advantage: How Our Brains Became Remarkable. Cambridge, MA: MIT Press.
Hodos, W. (1986). “The evolution of the brain and the nature of animal intelligence,” in Animal Intelligence: Insights into the Animal Mind, eds R. Hoag and L. Goldman (Washington, DC: Smithsonian Press), 77–87.
Humphrey, N. (1992). A History of the Mind: Evolution and the Birth of Consciousness. New York, NY: Copernicus Springer-Verlag.
Irwin, L. N., and Irwin, B. A. (2020). Place and environment in the ongoing evolution of cognitive neuroscience. J. Cogn. Neurosci. doi: 10.1162/jocn_a_01607 [Epub ahead of print].
James, W. (1884). On some omissions of introspective psychology. Mind 9, 1–26. doi: 10.1093/mind/os-ix.33.1
James, W. (1890). Principles of Psychology. New York, NY: Henry Holt.
Jaynes, J. (1976). The Origins of Consciousness in the Breakdown of the Bicameral Mind. Boston: Houghton Mifflin.
John, E. R. (2003). A theory of consciousness. Curr. Dir. Psychol. Sci. 12, 244–250. doi: 10.1046/j.0963-7214.2003.01271.x
Kaas, J. (2002). Convergences in the modular and areal organization of the forebrain of mammals: implications for the reconstruction of forebrain evolution. Brain Behav. Evol. 59, 262–272. doi: 10.1159/000063563
Kiverstein, J. D., and Rietveld, E. (2018). Reconceiving representation-hungry cognition: an ecological-enactive proposal. Adapt. Behav. 26, 147–163. doi: 10.1177/1059712318772778
Koch, C. (2012). Consciousness: Confessions of a Romantic Reductionist. Cambridge, MA: MIT Press.
Koch, C., and Tsuchiya, N. (2007). Attention and consciousness: two distinct brain processes. Trends Cogn. Sci. 11, 16–22. doi: 10.1016/j.tics.2006.10.012
LeDoux, J. (2019). The Deep History of Ourselves: The Four-Billion-Year Story of How We Got Conscious Brains. New York, NY: Viking.
Lefebvre, L., and Sol, D. (2008). Brains, lifestyles and cognition: are there general trends? Brain Behav. Evol. 72, 135–144. doi: 10.1159/000151473
Lisney, T. J., and Collin, S. P. (2006). Brain morphology in large pelagic fishes: a comparison between sharks and teleosts. J. Fish Biol. 68, 532–554. doi: 10.1111/j.0022-1112.2006.00940.x
Lou, H. C., Rømer Thomsen, K., and Changeux, J.-P. (2020). The molecular organization of self-awareness: paralimbic dopamine-GABA interaction. Front. Syst. Neurosci. 14:3. doi: 10.3389/fnsys.2020.00003
Macphail, E. M., and Bolhuis, J. J. (2001). The evolution of intelligence: adaptive specializations versus general process. Biol. Rev. Camb. Philos. Soc. 76, 341–364. doi: 10.1017/s146479310100570x
Marino, L. (2005). Big brains do matter in new environments. Proc. Natl. Acad. Sci. U S A 102, 5306–5307. doi: 10.1073/pnas.0501695102
Marino, L., and Hof, P. R. (2005). Nature’s experiments in brain diversity. Anat. Rec. A Discov. Mol. Cell. Evol. Biol. 287, 997–1000. doi: 10.1002/ar.a.20261
Massimini, M., Boly, M., Casali, A., Rosanova, M., and Tononi, G. (2009). A perturbational approach for evaluating the brain’s capacity for consciousness. Prog. Brain Res. 177, 201–214. doi: 10.1016/s0079-6123(09)17714-2
Menzel, R. (2012). The honeybee as a model for understanding the basis of cognition. Nat. Rev. Neurosci. 13, 758–768. doi: 10.1038/nrn3357
Mercado, E. III. (2008). Neural and cognitive plasticity: from maps to minds. Psychol. Bull. 134, 109–137. doi: 10.1037/0033-2909.134.1.109
Merker, B. (2005). The liabilities of mobility: a selection pressure for the transition to consciousness in animal evolution. Conscious. Cogn. 14, 89–114. doi: 10.1016/s1053-8100(03)00002-3
Merleau-Ponty, M. (1945). Phénoménologie de la Perception (Phenomenology of Perception) (S. C., Trans.). London: Routledge and Kegan Paul.
Mettke-Hofmann, C. (2014). Cognitive ecology: ecological factors, life-styles, and cognition. Wiley Interdiscip. Rev. Cogn. Sci. 5, 345–360. doi: 10.1002/wcs.1289
Miyahara, K., and Witkowski, O. (2019). The integrated structure of consciousness: phenomenal content, subjective attitude, and noetic complex. Phenom. Cogn. Sci. 18, 731–758. doi: 10.1007/s11097-018-9608-5
Morris, D. (2005). Animals and humans, thinking and nature. Phenom. Cogn. Sci. 4, 49–72. doi: 10.1007/s11097-005-4257-x
Moruzzi, G., and Magoun, H. W. (1949). Brain stem reticular formation and activation of the EEG. Electroencephalogr. Clin. Neurophysiol. 1, 455–473. doi: 10.1016/0013-4694(49)90219-9
Moser, E. I., Moser, M. B., and McNaughton, B. L. (2017). Spatial representation in the hippocampal formation: a history. Nat. Neurosci. 20, 1448–1464. doi: 10.1038/nn.4653
Nagel, T. (1974). What is it like to be a bat? Philos. Rev. 83, 435–450. doi: 10.2307/2183914
Neubauer, R. (2012). Evolution and the Emergent Self: The Rise of Complexity and Behavioral Versatility in Nature. New York, NY: Columbia University Press.
O’Keefe, J. (1990). A computational theory of the hippocampal cognitive map. Prog. Brain Res. 83, 301–312. doi: 10.1016/s0079-6123(08)61258-3
Parker, S., and McKinney, M. (1999). Origins of Intelligence: The Evolution of Cognitive Development in Monkeys, Apes, and Humans. Baltimore: Johns Hopkins University Press.
Pennartz, C. M. A., Farisco, M., and Evers, K. (2019). Indicators and criteria of consciousness in animals and intelligent machines: an inside-out approach. Front. Syst. Neurosci. 13:25. doi: 10.3389/fnsys.2019.00025
Reiter, S., Liaw, H. P., Yamawaki, T. M., Naumann, R. K., and Laurent, G. (2017). On the value of reptilian brains to map the evolution of the hippocampal formation. Brain Behav. Evol. 90, 41–52. doi: 10.1159/000478693
Richards, R. J. (1987). Darwin and the Emergence of Evolutionary Theories of Mind and Behavior. Chicago: University of Chicago Press.
Romanes, G. J. (1883). Mental Evolution in Animals. London: Kegan Paul, Trench.
Rose, S. P. R. (1976). The Conscious Brain (Updated ed.). New York, NY: Vintage.
Seth, A. K., Izhikevich, E., Reeke, G. N., and Edelman, G. M. (2006). Theories and measures of consciousness: an extended framework. Proc. Natl. Acad. Sci. U S A 103, 10799–10804. doi: 10.1073/pnas.0604347103
Sheets-Johnstone, M. (1999). The Primacy of Movement. Vol. 14. Amsterdam: John Benjamins Publishing.
Shumway, C. A. (2008). Habitat complexity, brain, and behavior. Brain Behav. Evol. 72, 123–134. doi: 10.1159/000151472
Smith, J. D., Shields, W. E., and Washburn, D. A. (2003). The comparative psychology of uncertainty monitoring and metacognition. Behav. Brain Sci. 26, 317–339; discussion 340–373. doi: 10.1017/s0140525x03000086
Sol, D. (2009). Revisiting the cognitive buffer hypothesis for the evolution of large brains. Biol. Lett. 5, 130–133. doi: 10.1098/rsbl.2008.0621
Sol, D., Duncan, R. P., Blackburn, T. M., Cassey, P., and Lefebvre, L. (2005). Big brains, enhanced cognition, and response of birds to novel environments. Proc. Natl. Acad. Sci. U S A 102, 5460–5465. doi: 10.1073/pnas.0408145102
Stocker, K. (2016). Place cells and human consciousness: a force-dynamic account. J. Consciousness Stud. 23, 146–165.
Tononi, G. (2004). An information integration theory of consciousness. BMC Neurosci. 5:42. doi: 10.1186/1471-2202-5-42
Tononi, G. (2012). Integrated information theory of consciousness: an updated account. Arch. Ital. Biol. 150, 56–90. doi: 10.4449/aib.v149i5.1388
Tononi, G., and Koch, C. (2015). Consciousness: here, there and everywhere? Philos. Trans. R. Soc. Lond. B Biol. Sci. 370:20140167. doi: 10.1098/rstb.2014.0167
van Vugt, B., Dagnino, B., Vartak, D., Safaai, H., Panzeri, S., Dehaene, S., et al. (2018). The threshold for conscious report: signal loss and response bias in visual and frontal cortex. Science 360, 537–542. doi: 10.1126/science.aar7186
Varela, F. (1996). Neurophenomenology: a methodological remedy for the hard problem. J. Consciousness Stud. 3, 330–349.
Von Uexküll, J. (1926). Theoretical Biology. New York, NY: Harcourt, Brace and Co.

Keywords: cognition, awareness, evolution, arthropods, cephalopods, vertebrates

Citation: Irwin LN (2020) Renewed Perspectives on the Deep Roots and Broad Distribution of Animal Consciousness. Front. Syst. Neurosci. 14:57. doi: 10.3389/fnsys.2020.00057

Received: 03 March 2020; Accepted: 20 July 2020;
Published: 13 August 2020.

Edited by:

Christopher I. Petkov, Newcastle University, United Kingdom

Reviewed by:

David L. Sheinberg, Brown University, United States
Michael Schmid, Newcastle University, United Kingdom

Copyright © 2020 Irwin. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Louis N. Irwin, lirwin@utep.edu

Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.