Emotion is fundamental to human experience, influencing daily activities including cognition, communication, learning, and decision-making. For centuries, psychologists have tried to understand and define emotions. The most widely recognized set of basic emotions, proposed by Ekman, comprises happiness, sadness, fear, anger, surprise, and disgust. Humans express and recognize emotions in multiple ways, of which facial expression is one of the most heavily studied. Recent neuroimaging studies show that a number of brain regions become active when viewing emotional facial expressions, which suggests the potential for computer-aided emotion recognition tools. Emotion recognition has broad application prospects in real life, such as human-computer interaction, virtual reality, intelligent robots, and home health care.
In the early days, researchers studied human emotional states through external manifestations such as speech or body language. As technology has improved, researchers can now study human emotions through physiological signals collected from sensor devices, including modalities such as electroencephalography (EEG), electrocardiography (ECG), and electromyography (EMG).
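As a concrete illustration of what working with such signals involves, the sketch below band-pass filters a single EEG channel into the frequency bands commonly analyzed in affective research. The sampling rate, band edges, and synthetic input are illustrative assumptions, not details taken from this text.

```python
# Minimal sketch of a common EEG preprocessing step: decomposing a raw
# channel into canonical frequency bands. All numeric values here are
# assumed for illustration only.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 256  # assumed sampling rate in Hz

# Canonical EEG bands (Hz); exact edges vary across studies.
BANDS = {
    "theta": (4, 8),
    "alpha": (8, 13),
    "beta": (13, 30),
    "gamma": (30, 45),
}

def bandpass(signal: np.ndarray, low: float, high: float, fs: int = FS) -> np.ndarray:
    """Zero-phase 4th-order Butterworth band-pass filter."""
    b, a = butter(4, [low, high], btype="bandpass", fs=fs)
    return filtfilt(b, a, signal)

# Example: decompose 10 s of synthetic single-channel "EEG" into bands.
raw = np.random.randn(10 * FS)  # stand-in for a recorded EEG channel
band_signals = {name: bandpass(raw, lo, hi) for name, (lo, hi) in BANDS.items()}
for name, sig in band_signals.items():
    print(f"{name}: mean power {np.mean(sig ** 2):.4f}")
```

Band power computed this way is one of the classic hand-crafted features used in traditional emotion recognition pipelines.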
At present, emotion recognition based on the brain-computer interface (BCI) is still in its infancy. Traditional artificial intelligence techniques depend heavily on hand-crafted features and do not adapt well to multimodal tasks. To overcome these limitations, advanced artificial intelligence technologies, such as deep learning, transfer learning, and multimodal information fusion, can be used to analyze and process complex data sets. Deep learning has powerful feature extraction capabilities and can transform low-level features into abstract high-level representations. By combining BCI technology with advanced AI, better emotion recognition results can be expected.
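As a minimal sketch of this low-level-to-high-level idea, the model below stacks temporal convolutions so that early layers capture local waveform patterns while later layers form more abstract features. It is an illustrative example rather than any specific published architecture; the channel count, window length, and four-class label set are assumptions.

```python
# Illustrative sketch: a small CNN that maps raw EEG windows to emotion
# classes, learning features end to end instead of hand-crafting them.
import torch
import torch.nn as nn

class EmotionCNN(nn.Module):
    def __init__(self, n_channels: int = 32, n_classes: int = 4):
        super().__init__()
        self.features = nn.Sequential(
            # Low-level stage: temporal convolutions learn local waveform patterns.
            nn.Conv1d(n_channels, 64, kernel_size=7, padding=3),
            nn.BatchNorm1d(64),
            nn.ReLU(),
            nn.MaxPool1d(4),
            # Higher-level stage: wider receptive field, more abstract features.
            nn.Conv1d(64, 128, kernel_size=7, padding=3),
            nn.BatchNorm1d(128),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # summarize the whole window into one vector
        )
        self.classifier = nn.Linear(128, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, time) raw or band-filtered EEG windows
        h = self.features(x).squeeze(-1)  # (batch, 128) learned representation
        return self.classifier(h)         # (batch, n_classes) emotion logits

# Example: a batch of eight 2-second windows at an assumed 256 Hz (512 samples).
model = EmotionCNN()
logits = model(torch.randn(8, 32, 512))
print(logits.shape)  # torch.Size([8, 4])
```

The same learned representation could also feed transfer learning (reusing the feature stage across subjects) or multimodal fusion (concatenating it with features from ECG or EMG streams).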
In this Research Topic, we aim to further explore human emotion recognition using new technologies that combine BCI and deep learning. Sub-topics include, but are not limited to, the following:
• Affective computing, automatic emotion detection, analysis, and recognition.
• Multimodal pattern analysis.
• Emotion recognition based on multimodal information fusion or multi-task learning.
• Online, offline, or hybrid BCI systems.
• Users’ interaction with AI and visual interaction interfaces.
• Applications of emotion recognition, e.g., distance education, human-computer interaction, virtual reality, safety monitoring, intelligent robots, and home health care.