Neural network models were widely used in the late 1980s and 1990s to model human cognition, building on developments such as the backpropagation learning algorithm. Over the last decade, the availability of massive data sets, enhanced computational resources, and new algorithmic developments have led to explosive growth in the use of neural networks in machine learning and artificial intelligence. These contemporary neural network models now display state-of-the-art performance in artificial intelligence applications such as speech recognition, image recognition, and automatic translation. In parallel, artificial neural networks increasingly generate quantitative predictions that make detailed contact with human or animal behavioral or neural measurements. However, these advances have had very little influence on cognitive science: many recent neural network models of cognition are still limited to small synthetic domains and use architectures very similar to those of the late 20th century. The excellent performance of contemporary neural networks on real-world AI tasks suggests that cognitive science could benefit from applying such systems to model human cognition.
This Research Topic aims to advance the application of neural network models to topics in cognitive science and cognitive neuroscience. It seeks to highlight how the latest developments in neural network modeling can be applied to understanding cognitive and neural processes, and to stimulate further cognitive computational research using the tools, methods, and insights from contemporary neural network research in machine learning and artificial intelligence, while also providing a forum for high-quality modeling efforts in the more traditional style. It shares these objectives with the 15th Neural Computation and Psychology Workshop (held at Drexel University, Philadelphia, August 8-9, 2016), to which the Research Topic is a follow-up. Submissions that report new advances in neural network architectures or methods, apply neural networks to more life-like cognitive tasks and larger-scale data sets, or bridge across different levels of analysis are particularly encouraged. Most papers will present research using neural networks to simulate cognitive processes such as perception, memory, and language, and to account for behavioral and/or neuroimaging data. Methods, Protocols, and Technology Reports that describe how to implement neural networks are also welcome. In addition, we expect to include theoretical work on computational aspects of neural networks and their properties (insofar as these are directly of interest to cognitive science), as well as a small number of review papers.
Submitted abstracts will be reviewed by the Topic Editors for both their overall quality and their relevance to the themes of the Research Topic, which emphasize contemporary neural network architectures, application to large-scale data, novel theoretical developments, and close engagement with behavioral or neural measures. Frontiers hosts several article types, any of which could be appropriate; please state the proposed article type in the abstract. Authors of accepted abstracts will be invited to submit a manuscript for peer review. Acceptance of an abstract does not imply that the manuscript will be accepted for publication. Note that only manuscripts based on accepted abstracts can be submitted.