In the context of communications and computations, Claude E. Shannon is well-known for at least three things. First, the source coding theorem (i.e., noiseless coding theorem), which establishes the fundamental limit of data compression (e.g., the minimum number of bits required to represent an audio recording). Second, the noisy-channel coding theorem, which establishes the maximum rate at which information can be transmitted almost error-free through a noisy channel (e.g., the maximum number of bits per second that a fiber-optic broadband connection can carry). Third, through his MIT master’s thesis, the implementation of Boolean algebra (i.e., AND, OR, NOT, XOR) using electric circuits of relays and switches, which subsequently became the basis of all modern transistor-based computers. Thus, Shannon is the father of both information theory and modern computing. Shannon’s key discoveries on communications and computations serve as the foundational basis for understanding all information processing systems, including the brain.
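As a minimal numerical illustration of the two coding theorems (a sketch added for clarity, not part of the original call): the binary entropy function H(p) gives the source coding limit for a biased bit source, and 1 − H(p) gives the capacity of a binary symmetric channel with crossover probability p.

```python
import math

def binary_entropy(p):
    """Shannon's binary entropy H(p) in bits; H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover
    probability p: C = 1 - H(p) bits per channel use
    (noisy-channel coding theorem)."""
    return 1.0 - binary_entropy(p)

# A fair coin flip cannot be compressed below one bit per symbol
# (source coding limit) ...
print(binary_entropy(0.5))            # 1.0
# ... while a channel that flips about 11% of its bits can still
# carry roughly half a bit of information per use, error-free in
# the limit of long codes.
print(round(bsc_capacity(0.11), 2))   # 0.5
```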
In all modern computers, communications both precede and follow computation. For example, video data (e.g., MPEG-2) must be transmitted to your computer (e.g., via broadband internet) before the data is processed (e.g., video decoding computations). Even within the computer itself, data must be transmitted between various locations (e.g., hard drive, RAM) before computations can take place (in the microprocessor). Once decoded, the video information must be further transmitted to the computer screen. Both communications and computations are crucial to information processing. The situation is similar in the brain. For example, visual information from the eye must be transmitted from rods and cones in the retina to the visual cortex in the occipital lobe before the visual information can be processed (i.e., computed). From there, the processed information is further transmitted to other locations of the brain (e.g., prefrontal cortex). Despite the importance of both computations and communications, neuroscience research has traditionally focused predominantly on computational aspects, with communications largely omitted. Research momentum and advances have, however, begun to shift very recently. For example, communications have now been estimated to consume 35 times more energy than computations in the human cortex. Furthermore, the recent resurfacing of the long-forgotten discovery that single-celled organisms (Paramecium aurelia) are capable of Pavlovian conditioning has called into question the widely held Hebbian synaptic hypothesis. This Research Topic aims to spotlight research works that incorporate communications aspects of the brain. Specifically, the goal is to gather recent advances that apply Shannon’s key discoveries on communication and computation to better understand neuronal information processing.
A concurrent goal is to complete this Research Topic by early 2023, to commemorate the 75th anniversary of Shannon’s pioneering 1948 paper, “A Mathematical Theory of Communication”.
The scope of this Research Topic covers the computations and communications aspects of the brain, based on human and animal/organism models. On computations, we are open to both schools of thought, namely, the Hebbian synaptic hypothesis (e.g., long-term potentiation) and the cell-intrinsic hypothesis (e.g., RNA-based memories and computations), as applied to areas such as perception, cognition, learning, memory, and decision making. On communications, we encourage submissions that cover fundamental aspects of neuronal communications, such as quantization, error control coding, modulation, channel noise, synchronization, inter-symbol interference (equalization), queueing theory (Poisson process, buffering), energy requirements (signal-to-noise ratio), and error rate estimation. In terms of methods, we welcome submissions employing mathematical modeling, computer simulation, data analysis, new hypotheses and theories, new methods (e.g., algorithms), and existing methods from communications systems engineering applied to the brain. In terms of manuscript types, we are interested in original research, methods, review, mini review, hypothesis and theory, perspective, brief research report, and opinion.
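As one hedged sketch of the communications-engineering toolkit listed above (the function name and parameters are illustrative, not from the call): the homogeneous Poisson process, a standard first model for spike-train arrivals in queueing analysis, can be simulated by drawing exponential inter-spike intervals.

```python
import random

def poisson_spike_train(rate_hz, duration_s, seed=0):
    """Sample spike times from a homogeneous Poisson process by
    accumulating exponential inter-spike intervals (mean 1/rate),
    a common baseline model for neuronal firing and queue arrivals."""
    rng = random.Random(seed)
    t, spikes = 0.0, []
    while True:
        t += rng.expovariate(rate_hz)
        if t > duration_s:
            return spikes
        spikes.append(t)

spikes = poisson_spike_train(rate_hz=20.0, duration_s=10.0)
# The empirical firing rate should be close to the nominal 20 Hz.
print(len(spikes) / 10.0)
```

The same inter-arrival samples could feed a buffering/queueing model or an error-rate estimate, which is the sense in which these engineering tools transfer to neuronal communications.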
In Memoriam: Dr. Desmond Taylor (co-topic editor), who sadly passed away during the early stages of this Research Topic. https://www.comsoc.org/about/news/memoriams/desmond-p-des-taylor