Brain-Inspired Intelligence explores systematic theories, methodologies, and techniques to imitate and enhance human intelligence, a pursuit that has long attracted enthusiasm across human society. Existing studies have achieved astonishing results along diverse technical directions, including deep learning, neuromorphic computing, and brain-scale simulation. These evolving techniques push the perception and cognition performance of artificial intelligence to an unprecedented level, and they exhibit a clear trend: intelligent systems are becoming increasingly large and complex. Researchers build vast, powerful models fed with huge volumes of input data, training and running exascale model architectures on ever-larger distributed platforms. For instance, GPT-3, one of the largest language models, consists of 175 billion parameters and was trained on 570GB of input data. The Brain Simulator aims to reconstruct the human brain, which contains 86 billion neurons with an average of 7,000 synapses per neuron. How to effectively build large yet efficient artificial systems has therefore become an urgent and challenging problem. The increasing scale and complexity of these systems drive the demand for automated optimization and self-evolution during system construction.
This research topic focuses on automated optimization techniques across a large spectrum of the brain-inspired intelligence system stack. Specifically, interests include: 1) automation techniques and design tools that support complex system functionality, efficient management, and good programmability for neuromorphic computing; 2) automated machine learning techniques that optimize hardware- or system-friendly deployable models for better efficiency; and 3) framework and system innovations that coordinate the large-scale training of giant models on heterogeneous hardware implementations and infrastructures. In summary, contributions ranging from technically focused work on automated system management and run-time library optimization, to domain reviews, to perspectives on present and future trends in brain-inspired system innovation are all welcome. The topic is designed to be inclusive and to give voice to a broad variety of views in order to stimulate debate and cross-pollination between the systems and algorithms communities. To this purpose, we welcome articles addressing the following:
Topics of interest to this issue include, but are not limited to:
1) Automated machine learning techniques for neuromorphic algorithms under time/space/power constraints;
2) Automated optimization techniques of programming languages, design tools, and libraries for neuromorphic systems;
3) Reconfigurable energy-efficient deep learning and neuromorphic architectures;
4) Hardware models for large-scale biology-inspired learning;
5) Future interconnects for scalable neuromorphic architectures.