About this Research Topic
Since the beginning of the COVID-19 pandemic, the ‘remote +’ trend has taken hold in all walks of life. As a vital technology for learning and recognizing human emotions, Emotion AI can contribute significantly to the performance of these remote applications.
This Research Topic seeks novel theories, methodologies, resources, and strategies for Emotion AI from interdisciplinary perspectives, with an emphasis on eHealth and digital healthcare. Specifically, it has three main objectives:
•New findings in psychology, neuroscience, and cognitive science are expected. For example, which theories can be used to develop Emotion AI? Which emotional indicators point to physical and mental disorders? How are emotional interactions regulated during learning?
•New resources, algorithms, and tools are expected. For example, what kinds of datasets are suitable and helpful for developing Emotion AI? How can multimodal data, including 2D/3D/depth images and videos, skeleton points, acoustic recordings, and physiological signals, facilitate Emotion AI methods? How can Emotion AI be deployed on handheld or wearable devices?
•New social surveys, impact analyses, and solutions are welcome. For example, what are the benefits and costs of Emotion AI technology? What promise does Emotion AI hold for healthcare and education, and what influence will it have? How can the security, trustworthiness, and transparency of Emotion AI be ensured?
Researchers are encouraged to submit, but are not limited to, the following:
•Original research, review, methods, hypothesis and theory, perspective, data report, and opinion papers related to Emotion AI.
•Emotional theories in psychology, physiology, and neuroscience, e.g., brain processing, cognitive mechanisms, perception of human behaviors, and developmental ecologies.
•Automatic 2D/3D facial affect analysis, e.g., facial expression recognition, action unit detection, and micro-expression recognition.
•Multi-modal Emotion AI, including body gestures, physiological signals (e.g., rPPG, GSR, and ECG), neural signals (e.g., EEG, MEG, fNIRS, and fMRI), audio, and text.
•Studies of non-basic emotions, e.g., engagement, stress, pain, depression, fatigue, compound expressions, and valence & arousal.
•eHealth with Emotion AI, e.g., remote medicine, non-invasive diagnosis, real-time assessment, and nonverbal communication with disabled patients.
•Digital education with Emotion AI, e.g., quality evaluation of online courses, collaborative learning, interaction & regulation, and high-performing education ecosystems.
•Social investigations and technical solutions for the ethics, transparency, privacy, and trustworthiness of Emotion AI.
Keywords: Affective computing, eHealth, Digital education, Emotional processing, Cognitive mechanism, Human interaction, Trustworthy AI, Emotional AI
Important Note: All contributions to this Research Topic must be within the scope of the section and journal to which they are submitted, as defined in their mission statements. Frontiers reserves the right to guide an out-of-scope manuscript to a more suitable section or journal at any stage of peer review.