EDITORIAL article

Front. Comput. Neurosci., 04 January 2024
This article is part of the Research Topic "Advances in Shannon-based Communications and Computations Approaches to Understanding Information Processing In The Brain"

Editorial: Advances in Shannon-based communications and computations approaches to understanding information processing in the brain

James Tee1* and Giorgio M. Vitetta2*

  • 1Department of Electrical and Computer Engineering, University of Canterbury, Christchurch, New Zealand
  • 2Department of Engineering “Enzo Ferrari”, University of Modena and Reggio Emilia, Modena, Italy

Shannon's works on Boolean algebra and information theory are the very foundations upon which all modern computations and communications are based (Shannon, 1940, 1948). Here, we called for article submissions that apply these seminal concepts to information processing in the brain. It is worth accentuating the status quo to provide some pertinent context: “We don't know how the brain stores anything, let alone words” (Poeppel and Idsardi, 2022). In addition, there are currently two major competing theories: the Hebbian synaptic hypothesis (Mayford et al., 2012; Kandel et al., 2014) and the cell-intrinsic hypothesis (Gallistel, 2017, 2021; Gold and Glanzman, 2021; Akhlaghpour, 2022). For an introductory overview, please see Tee and Taylor (2022).

Five articles were accepted and published as part of this Research Topic. Pio-Lopez et al. presented results from four computer simulations and one experimental test to establish an information processing connection between cell biology (i.e., morphogenesis) and neuroscience (i.e., active inference). Gallistel et al. presented a Bayesian algorithm and three experiments on firing pauses of cerebellar Purkinje cells associated with eyeblink conditioning, in support of the cell-intrinsic hypothesis. Ricci et al. presented results from six computer simulations of the cerebellar Purkinje cell using a leaky integrate-and-fire neuron to demonstrate the computational feasibility of the cell-intrinsic hypothesis. Fitch elaborated on the hypothesis that information in the brain is stored not only in the synapse and inside each cell, but also in the dendritic structure and cell-to-cell connectivity. Madani presented the performance of a novel model for learning hierarchical concepts based on an information theoretic score, in comparison with the performance achieved by synaptic-based artificial neural network models and n-gram language models.

While significant progress has been made in the computational aspect of the brain, the communications aspect remains very much unresolved. For example, all communications systems require a modulation scheme to carry the information from its point of origin to a given destination (e.g., information storage and retrieval). What modulation scheme is employed in the brain? For a more detailed examination of the modulation problem, please see Tee and Taylor (2020). To date, the most plausible modulation scheme that falls consistently within Shannon's framework is Interspike Interval (ISI), known in communications literature as Differential Pulse Position Modulation (DPPM) (Berger and Levy, 2010; Tee and Taylor, 2020). What we still do not know is the alphabet size (i.e., the number of distinct symbols) characterizing such a DPPM scheme. Equivalently, what we still do not know is how many bits of information each modulation symbol represents (or conveys). Previous works have concluded that the brain may be computationally representing subjective probabilities with four bits of precision (Tee and Taylor, 2019) and subjective values with five bits of precision (Tee and Taylor, 2021a). How might such computational representation precisions translate into modulational representation precisions? The open-access Timing Database (Aydogan et al., 2023), consisting of 68 datasets from eight different tasks, may be a helpful resource for a deeper exploration into these representation questions.
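The alphabet-size question can be made concrete with a toy numerical sketch. The code below is an illustration of the arithmetic only, not a model from any of the cited works: it quantizes interspike intervals into hypothetical DPPM symbols using an assumed temporal bin width (the true neural bin width and alphabet size are precisely the open questions), and computes how many bits a symbol from a given alphabet conveys.

```python
import math

def bits_per_symbol(alphabet_size: int) -> float:
    """Bits conveyed by one DPPM symbol drawn uniformly from an alphabet."""
    return math.log2(alphabet_size)

def isi_to_symbols(spike_times_ms, bin_width_ms):
    """Quantize interspike intervals (ISIs) into discrete DPPM symbols.

    bin_width_ms is a hypothetical temporal resolution; identifying the
    actual value (equivalently, the alphabet size) is the open problem
    discussed in the text.
    """
    isis = [t2 - t1 for t1, t2 in zip(spike_times_ms, spike_times_ms[1:])]
    return [isi // bin_width_ms for isi in isis]

# A 16-symbol alphabet would carry log2(16) = 4 bits per ISI, matching
# the 4-bit precision reported for subjective probabilities.
print(bits_per_symbol(16))                    # 4.0
print(isi_to_symbols([0, 10, 35, 40], 5))     # [2, 5, 1]
```

Under this (assumed) uniform-symbol view, pinning down the alphabet size and pinning down the bits-per-symbol are the same question.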

One common assumption in computational neuroscience is that the brain operates at Shannon's channel capacity (Berger and Levy, 2010; Levy and Calvert, 2021). In communications systems, transmitting a message at a rate arbitrarily close to capacity is theoretically possible (as per Shannon's seminal work). In practice, however, operating at (or near) capacity is not necessarily feasible, partly because it requires advanced Error Correction Codes (ECCs) that transform the message into a much longer codeword. For example, in Chung et al. (2001), a Low-Density Parity-Check (LDPC) ECC is shown to closely approach Shannon's capacity limit by using a codeword length of 10^7 bits (i.e., 10 million bits). Such a length requires large data buffers (i.e., memory) to temporarily store the entire message during encoding and the received codeword at the decoder, which presents a storage challenge. Furthermore, the entire codeword must be received before the decoding process can begin, resulting in a time delay: a block of 10^7 bits transmitted at 1 Mbps takes 10 s to be delivered, and even at a transmission rate of 10 Mbps it would still take 1 s to arrive. Finally, the decoding process itself is computationally non-trivial and introduces an additional time delay. Together, these constraints can make operating near Shannon's capacity infeasible whenever the decoding buffer requirement and the time delay are unacceptable. In contrast, Random Access Memory (RAM) in modern computers employs much simpler and shorter ECCs, without operating near Shannon's capacity. For example, DDR5 SDRAM employs a codeword length of 136 bits (i.e., 128 data bits and eight ECC parity bits generated using a Hamming code; Rooney and Koyle, 2019), enabling the decoder to operate with a very small buffer and negligible time delays.
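The buffering and delay arithmetic above can be written out explicitly. This is a minimal sketch (the function names are ours); the codeword lengths and rates are the ones quoted in the text.

```python
def decoding_delay_s(codeword_bits: int, rate_bps: float) -> float:
    """Minimum wait before block decoding can start: for a block code,
    the entire codeword must arrive first."""
    return codeword_bits / rate_bps

def buffer_bytes(codeword_bits: int) -> int:
    """Memory needed just to hold one received codeword."""
    return (codeword_bits + 7) // 8

# Capacity-approaching LDPC code from Chung et al. (2001): 10^7-bit codewords.
print(decoding_delay_s(10**7, 1e6))   # 10.0 s at 1 Mbps
print(decoding_delay_s(10**7, 1e7))   # 1.0 s at 10 Mbps
print(buffer_bytes(10**7))            # 1250000 bytes of buffer

# DDR5 ECC block (Rooney and Koyle, 2019): 136 bits (128 data + 8 parity).
print(buffer_bytes(136))              # 17 bytes of buffer
```

The five-orders-of-magnitude gap in buffer size is the practical trade-off: the short code gives up proximity to capacity in exchange for negligible memory and delay.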
Given that advanced ECCs are needed to operate near Shannon's capacity, the constraints on the size of the decoding buffer and the maximum tolerable time delay make it rather unlikely that the brain achieves these conditions. Consequently, the common assumption that the brain operates at Shannon's channel capacity may need to be revisited. It is also worth noting that the simplest possible ECC, one that requires no buffer and has the lowest possible time delay, is a repetition code. Such a scheme is, therefore, neurobiologically plausible, especially for real-time processes in the brain, such as the visual and auditory systems (Tee and Taylor, 2021b).
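To illustrate why a repetition code needs essentially no buffer, here is a minimal sketch of triple repetition with majority-vote decoding (the message and the flipped bit are arbitrary choices for the example): each group of n bits can be decoded the moment it arrives, with no dependence on the rest of the stream.

```python
def rep_encode(bits, n=3):
    """Repetition code: transmit each message bit n times (n odd)."""
    return [b for b in bits for _ in range(n)]

def rep_decode(received, n=3):
    """Majority vote over each group of n received bits. Decoding needs
    only n bits of state and can start as soon as a group arrives."""
    out = []
    for i in range(0, len(received), n):
        group = received[i:i + n]
        out.append(1 if 2 * sum(group) > len(group) else 0)
    return out

msg = [1, 0, 1]
code = rep_encode(msg)            # [1, 1, 1, 0, 0, 0, 1, 1, 1]
noisy = code[:]
noisy[1] = 0                      # one bit flipped by channel noise
print(rep_decode(noisy) == msg)   # True: the single error is corrected
```

The price is rate: a length-3 repetition code corrects one error per group but sends each bit three times, far from capacity, which is exactly the trade-off the paragraph above argues the brain may be making.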

We want to thank all authors for their submissions to this Research Topic, and the reviewers for evaluating the submissions. We hope that this Research Topic will provoke some insights and ideas for future research on information processing in the brain. Lastly, we dedicate this Research Topic to the late Desmond P. Taylor (IEEE Communications Society, 2022; Royal Society of New Zealand Te Apārangi, 2022; Zuckerman, 2022).

Author contributions

JT: Writing—original draft, Writing—review & editing. GV: Writing—original draft, Writing—review & editing.

Funding

The author(s) declare that no financial support was received for the research, authorship, and/or publication of this article.

Conflict of interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Publisher's note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

References

Akhlaghpour, H. (2022). An RNA-based theory of natural universal computation. J. Theor. Biol. 537, 1–19. doi: 10.1016/j.jtbi.2021.110984

Aydogan, T., Karşilar, H., Duyan, Y. A., Akdogan, B., Baccarani, A., Brochard, R., et al. (2023). The timing database: an open-access, live repository for interval timing studies. Behav. Res. Methods 1–11. doi: 10.3758/s13428-022-02050-9

Berger, T., and Levy, W. B. (2010). A mathematical theory of energy efficient neural computation and communication. IEEE Trans. Inf. Theory 56, 852–874. doi: 10.1109/TIT.2009.2037089

Chung, S.-Y., Forney, G. D., Richardson, T. J., and Urbanke, R. (2001). On the design of low-density parity-check codes within 0.0045 dB of the Shannon limit. IEEE Commun. Lett. 5, 58–60. doi: 10.1109/4234.905935

Gallistel, C. R. (2017). The coding question. Trends Cogn. Sci. 21, 498–508. doi: 10.1016/j.tics.2017.04.012

Gallistel, C. R. (2021). The physical basis of memory. Cognition 213, 1–6. doi: 10.1016/j.cognition.2020.104533

Gold, A. R., and Glanzman, D. L. (2021). The central importance of nuclear mechanisms in the storage of memory. Biochem. Biophys. Res. Commun. 564, 103–113. doi: 10.1016/j.bbrc.2021.04.125

IEEE Communications Society (2022). In memoriam: Desmond P. (Des) Taylor, 1941-2022. Available online at: https://www.comsoc.org/about/news/memoriams/desmond-p-des-taylor (accessed December 8, 2023).

Kandel, E. R., Dudai, Y., and Mayford, M. R. (2014). The molecular and systems biology of memory. Cell 157, 163–186. doi: 10.1016/j.cell.2014.03.001

Levy, W. B., and Calvert, V. G. (2021). Communication consumes 35 times more energy than computation in the human cortex, but both costs are needed to predict synapse number. Proc. Natl. Acad. Sci. USA. 118, e2008173118. doi: 10.1073/pnas.2008173118

Mayford, M., Siegelbaum, S. A., and Kandel, E. R. (2012). Synapses and memory storage. Cold Spring Harb. Perspect. Biol. 4, 1–18. doi: 10.1101/cshperspect.a005751

Poeppel, D., and Idsardi, W. (2022). We don't know how the brain stores anything, let alone words. Trends Cogn. Sci. 26, 1054-1055. doi: 10.1016/j.tics.2022.08.010

Rooney, R., and Koyle, N. (2019). Micron® DDR SDRAM: New Features. Boise, ID: Micron Technology. Available online at: https://media-www.micron.com/-/media/client/global/documents/products/white-paper/ddr5_new_features_white_paper.pdf (accessed December 8, 2023).

Royal Society of New Zealand Te Apārangi (2022). Desmond Taylor (1941-2022), PhD McMaster FEIC FIPENZ FIEEE FRSNZ. Available online at: https://www.royalsociety.org.nz/who-we-are/our-people/our-fellows/obituaries/fellows-obituaries/desmond-taylor/ (accessed December 8, 2023).

Shannon, C. E. (1940). A Symbolic Analysis of Relay and Switching Circuits [M.S. thesis]. Cambridge, MA: Dept. of Elec. Eng., Mass. Inst. Tech. Available online at: https://dspace.mit.edu/handle/1721.1/11173 (accessed December 8, 2023).

Shannon, C. E. (1948). A mathematical theory of communication. Bell Syst. Tech. J. 27, 379–423, 623–656. doi: 10.1002/j.1538-7305.1948.tb01338.x

Tee, J., and Taylor, D. P. (2019). A quantized representation of probability in the brain. IEEE Trans. Mol. Biol. Multi-scale Commun. 5, 19–29. doi: 10.1109/TMBMC.2019.2950182

Tee, J., and Taylor, D. P. (2020). Is information in the brain represented in continuous or discrete form? IEEE Trans. Mol. Biol. Multi-scale Commun. 6, 199–209. doi: 10.1109/TMBMC.2020.3025249

Tee, J., and Taylor, D. P. (2021a). A quantized representation of intertemporal choice in the brain. IEEE Trans. Mol. Biol. Multi-scale Commun. 7, 1–9. doi: 10.1109/TMBMC.2020.3025244

Tee, J., and Taylor, D. P. (2021b). What if memory information is stored inside the neuron, instead of in the synapse? arXiv [preprint]. doi: 10.48550/arXiv.2101.09774

Tee, J., and Taylor, D. P. (2022). Where is memory information stored in the brain? New Sch. Psychol. Bull. 19, 2–13.

Zuckerman, D. (2022). In memoriam: Desmond P. (Des) Taylor. IEEE Commun. Mag. 60, 6. doi: 10.1109/MCOM.2022.9800138

Keywords: Shannon, information processing, brain, communication, computation, channel capacity, modulation, representation

Citation: Tee J and Vitetta GM (2024) Editorial: Advances in Shannon-based communications and computations approaches to understanding information processing in the brain. Front. Comput. Neurosci. 17:1352772. doi: 10.3389/fncom.2023.1352772

Received: 08 December 2023; Accepted: 12 December 2023;
Published: 04 January 2024.

Edited and reviewed by: Si Wu, Peking University, China

Copyright © 2024 Tee and Vitetta. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: James Tee, james.tee@canterbury.ac.nz; Giorgio M. Vitetta, vitetta@unimore.it
