Emotion AI, also known as affective computing or artificial emotional intelligence, refers to technologies for measuring, understanding, simulating, and reacting to human emotions, and for applying that knowledge to improve areas of social life such as healthcare and education. In academia, Emotion AI is an interdisciplinary field that draws researchers from psychology, computer science, neuroscience, and medical science. Human emotional behaviors, including facial expressions, body gestures, vocal inflections, and physiological signals, are explored and analyzed, in turn advancing both theory-building and technology development. In industry, Emotion AI has been widely employed in applications such as commercial marketing, mental health, automotive services, and wearable devices. However, despite significant progress and achievements, Emotion AI still faces major challenges in solidifying its principles, developing robust algorithms, creating trustworthy platforms, and setting ethical standards.
Since the beginning of the COVID-19 pandemic, the ‘remote +’ trend has spread across all walks of life. As a key technology for learning and recognizing human emotions, Emotion AI can contribute significantly to the performance of these remote applications.
This Research Topic seeks novel theories, methodologies, resources, and strategies for Emotion AI from interdisciplinary perspectives, with an emphasis on eHealth and digital education. Specifically, it has three main objectives:
•New findings in psychology, neuroscience, and cognitive science are expected. For example, which theories can guide the development of Emotion AI? Which emotional indicators signal physical and mental disorders? How is emotion regulated in interactions during learning?
•New resources, algorithms, and tools are expected. For example, what kinds of datasets are suitable for developing Emotion AI? How can multimodal data, including 2D/3D/depth images and videos, skeleton points, acoustic recordings, and physiological signals, facilitate Emotion AI methods? How can Emotion AI be deployed on handheld or wearable devices?
•New social surveys, impact analyses, and solutions are welcome. For example, what are the benefits and costs of Emotion AI technology? What is the promise, and what are the influences, of Emotion AI in healthcare and education? How can the security, trustworthiness, and transparency of Emotion AI be ensured?
Researchers are encouraged to submit contributions including, but not limited to:
•Original research, review, methods, hypothesis and theory, perspective, data report, and opinion papers related to Emotion AI.
•Emotional theories in psychology, physiology, and neuroscience, e.g., brain processing, cognitive mechanisms, perception of human behaviors, and developmental ecologies.
•Automatic 2D/3D facial affect analysis, e.g., facial expression recognition, action unit detection, and micro-expression recognition.
•Multimodal Emotion AI, including body gestures, physiological signals (e.g., rPPG, GSR, and ECG), neural signals (e.g., EEG, MEG, fNIRS, and fMRI), audio, and text.
•Studies of non-basic emotions, e.g., engagement, stress, pain, depression, fatigue, compound expressions, and valence & arousal.
•eHealth with Emotion AI, e.g., telemedicine, non-invasive diagnosis, real-time assessment, and nonverbal communication with patients with disabilities.
•Digital education with Emotion AI, e.g., quality evaluation of online courses, collaborative learning, interaction & regulation, and high-performing education ecosystems.
•Social investigations and technical solutions for the ethics, transparency, privacy, and trustworthiness of Emotion AI.