About this Research Topic
Information-based methods, typically built as extensions of Shannon entropy, underpin model-free approaches: because they rest on probability distributions rather than on specific model expectations, they can account for any non-linearity present in the data in a model-independent fashion.
For instance, to quantify the statistical dependence between two random variables, Mutual Information counts the bits of information the two variables share; it is zero if and only if the two variables are statistically independent.
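As a minimal illustration (the helper name `mutual_information` and the equal-width binning are our own choices, not a standard API), a plug-in histogram estimator of Mutual Information in bits can be sketched as:

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Plug-in (histogram) estimate, in bits, of the Mutual Information
    between two 1-D samples. Name and binning are illustrative choices."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()                 # joint distribution p(x, y)
    px = pxy.sum(axis=1, keepdims=True)       # marginal p(x)
    py = pxy.sum(axis=0, keepdims=True)       # marginal p(y)
    nz = pxy > 0                              # skip empty cells (avoid log 0)
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(0)
a = rng.normal(size=20000)
b = rng.normal(size=20000)
mi_dep = mutual_information(a, a + 0.1 * b)   # strongly dependent pair: large
mi_ind = mutual_information(a, b)             # independent pair: near zero
```

Note that the plug-in estimator is positively biased on finite samples, so the independent pair yields a small positive value rather than exactly zero; bias-corrected or nearest-neighbour estimators are preferable in practice.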
Mutual Information-like methods can also be applied to interacting dynamical variables described by time series, quantifying the reduction in uncertainty about one variable obtained by conditioning on another set of variables. This is the spirit of the increasingly popular Transfer Entropy (Schreiber 2000), an information-based method for estimating directed influence.
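A hedged sketch of Schreiber's definition, restricted for simplicity to history length 1 and equal-width binning (both our simplifying assumptions, as is the toy unidirectionally coupled system used to exercise it):

```python
import numpy as np

def transfer_entropy(source, target, bins=4):
    """Plug-in estimate, in bits, of Transfer Entropy from source to target
    with history length 1:
        TE = sum p(x+, x, y) * log2[ p(x+ | x, y) / p(x+ | x) ].
    Binning and history length are simplifying choices of this sketch."""
    s = np.digitize(source, np.histogram_bin_edges(source, bins)[1:-1])
    t = np.digitize(target, np.histogram_bin_edges(target, bins)[1:-1])
    x_next, x_past, y_past = t[1:], t[:-1], s[:-1]
    joint = np.zeros((bins, bins, bins))          # counts over (x+, x, y)
    for i, j, k in zip(x_next, x_past, y_past):
        joint[i, j, k] += 1
    p_xyz = joint / joint.sum()                   # p(x+, x, y)
    p_next_past = p_xyz.sum(axis=2, keepdims=True)     # p(x+, x)
    p_past_src = p_xyz.sum(axis=0, keepdims=True)      # p(x, y)
    p_past = p_xyz.sum(axis=(0, 2), keepdims=True)     # p(x)
    nz = p_xyz > 0
    num = (p_xyz * p_past)[nz]                    # conditional-probability
    den = (p_next_past * p_past_src)[nz]          # ratio, rewritten
    return float((p_xyz[nz] * np.log2(num / den)).sum())

# Toy unidirectional coupling: y drives x, x does not drive y.
rng = np.random.default_rng(1)
n = 5000
y = rng.normal(size=n)
x = np.zeros(n)
for t in range(n - 1):
    x[t + 1] = 0.6 * x[t] + 0.8 * y[t] + 0.1 * rng.normal()
te_y_to_x = transfer_entropy(y, x)   # large: y reduces uncertainty about x
te_x_to_y = transfer_entropy(x, y)   # near zero: no reverse influence
```

The asymmetry of the two estimates is what makes the measure directed, in contrast to the symmetric Mutual Information.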
In recent years, various information-based methods have proven to be flexible and powerful tools for analyzing neuroimaging data, spanning a wide range of methodologies: formulations based on bivariate vs. multivariate representations, frequency vs. time domains, and so on. Beyond methodological issues, the information bit as a common unit offers a convenient way to compare and integrate different neuroimaging measurements in three complementary contexts: Structural Connectivity, Dynamical (Functional and Effective) Connectivity, and Consciousness.
Mutual Information-based methods have provided new insights into common principles of brain organization, for instance revealing an active default network when the brain is at rest. It remains unclear, however, how this default network is generated, how its different modules interact with one another, or why it fades in the presence of stimulation. Some of these open questions at the functional level may find their mechanisms in structural correlates. A key question is the link between structure and function, and the use of structural priors for understanding functional connectivity measures.
As far as effective connectivity is concerned, a common framework has recently been proposed unifying Transfer Entropy and Granger Causality, a well-established methodology originally formulated in terms of autoregressive models. This framework can open the way to new theories and applications.
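One concrete point of contact between the two formalisms (shown by Barnett, Barrett and Seth, 2009) is that, for Gaussian variables, Transfer Entropy in nats equals half the Granger causality. The linear, variance-ratio side of that correspondence can be sketched as follows; the helper name, the history length of 1, and the toy coupled system are all illustrative assumptions:

```python
import numpy as np

def granger_causality(source, target):
    """Bivariate Granger causality (history length 1) from source to
    target: log ratio of residual variances of the restricted vs the
    full least-squares autoregression. Illustrative sketch only."""
    x_next, x_past, y_past = target[1:], target[:-1], source[:-1]
    ones = np.ones_like(x_past)
    # Restricted model: predict x_{t+1} from x_t alone.
    A = np.column_stack([x_past, ones])
    res_r = x_next - A @ np.linalg.lstsq(A, x_next, rcond=None)[0]
    # Full model: predict x_{t+1} from x_t and y_t.
    B = np.column_stack([x_past, y_past, ones])
    res_f = x_next - B @ np.linalg.lstsq(B, x_next, rcond=None)[0]
    return float(np.log(res_r.var() / res_f.var()))

# Toy unidirectional coupling: y drives x.
rng = np.random.default_rng(2)
n = 5000
y = rng.normal(size=n)
x = np.zeros(n)
for t in range(n - 1):
    x[t + 1] = 0.6 * x[t] + 0.8 * y[t] + 0.1 * rng.normal()
gc_y_to_x = granger_causality(y, x)   # large: y improves prediction of x
gc_x_to_y = granger_causality(x, y)   # near zero
```

For jointly Gaussian processes, half of the quantity above matches the Transfer Entropy in nats; for non-Gaussian data the two measures diverge, which is precisely where the information-theoretic formulation becomes valuable.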
Information flow and transfer in the brain can be naturally associated with consciousness: will knowledge of structure and dynamics lead us to a definition of consciousness? Do different information-processing pathways exist in different states of consciousness, or does simply the amount of information differ? Information-based measures could help clarify this issue.
A Research Topic bringing together contributions from researchers with different backgrounds, who are either developing new approaches or applying existing methodologies to new data, would provide an optimal round table and a starting platform for the development and validation of new information-based methodologies for understanding brain structure, function, and dynamics.
Important Note: All contributions to this Research Topic must be within the scope of the section and journal to which they are submitted, as defined in their mission statements. Frontiers reserves the right to guide an out-of-scope manuscript to a more suitable section or journal at any stage of peer review.