Recently, the use of interactive systems has increased dramatically, leading to growing interest in improving human–machine interaction (HMI). In particular, artificial intelligence-based systems have been developed to assist individuals in accomplishing challenging and demanding tasks and to realize a human-in-the-loop approach for reaching different goals. However, monitoring human cognitive and affective processes is crucial in order to optimize the HMI, adapt the machine’s behavior in real time according to the user’s neurophysiological condition, and preserve user safety. Specifically, monitoring mental workload (MWL) in HMI applications during the execution of demanding tasks and in challenging situations allows for the detection of states of mental overload, which result in stress and decreased performance. Moreover, affective computing techniques allow the user’s affective condition to be monitored, detecting possible situations of stress and anxiety, which can be detrimental to human performance, as well as states of boredom, which are often associated with lower effort and higher frustration, struggle, and fatigue.
Questionnaires and interviews are commonly used to investigate users’ brain states and affective reactions during HMI. Given the technological improvement of mobile and wearable sensors (e.g., wearable and portable EEG and fNIRS systems, smart devices for collecting physiological signals) and contactless devices (e.g., compact infrared thermal cameras, RGB-D cameras), it is now possible to automate this evaluation and provide a quantitative assessment. HMI research focuses on the continuous monitoring of neurophysiological signals to detect users’ cognitive and affective processes during their interaction with computers and robotic companions. Supported by artificial intelligence and machine learning algorithms, it is possible to classify the emotions or reactions evoked during human-robot interaction and provide them as input to control the robot’s behavior. The possibility of providing feedback on the user’s neurophysiological condition can foster the development of interactive computer and robotic systems in a user-oriented way. Specifically, neuroadaptive systems can be enabled to adjust the machine’s behavior and assistance in real time according to the user’s current cognitive and affective state, also with the aim of preserving human safety. Additionally, brain-computer interfaces (BCIs) with emotional feedback may provide an advantageous framework for affective human-computer interaction and emotional cognitive regulation.
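As a purely illustrative sketch of the closed loop described above, the fragment below derives a simple EEG-based mental workload index and maps it onto a level of machine assistance. The theta/alpha ratio, the band limits, the thresholds, and the names band_power, workload_index, and adapt_assistance are assumptions introduced for this example only; they do not refer to any specific system discussed in the Research Topic.

```python
import numpy as np

# Hypothetical neuroadaptive loop: an EEG-derived workload index drives the
# level of machine assistance. All bands, thresholds, and names are assumed.

FS = 256  # sampling rate in Hz (assumed)

def band_power(epoch, fs, low, high):
    """Average spectral power of `epoch` within the [low, high) Hz band."""
    freqs = np.fft.rfftfreq(len(epoch), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(epoch)) ** 2
    mask = (freqs >= low) & (freqs < high)
    return psd[mask].mean()

def workload_index(epoch, fs=FS):
    """Theta/alpha power ratio, a common proxy for mental workload."""
    theta = band_power(epoch, fs, 4.0, 8.0)
    alpha = band_power(epoch, fs, 8.0, 13.0)
    return theta / (alpha + 1e-12)

def adapt_assistance(index, low_thr=0.8, high_thr=1.5):
    """Map the workload index onto a discrete assistance level."""
    if index > high_thr:   # overload: increase machine support
        return "high"
    if index < low_thr:    # underload/boredom: hand control back to the user
        return "low"
    return "medium"

# Usage example on a single 2-second synthetic epoch
epoch = np.random.randn(2 * FS)
print("assistance level:", adapt_assistance(workload_index(epoch)))
```

In an actual neuroadaptive system, the returned assistance level would feed the machine’s control logic, and the fixed threshold rule would typically be replaced by a classifier trained on calibrated, subject-specific data.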
In this context, the current Research Topic aims to collect recent advances in monitoring cognitive and affective states, emphasizing the technological improvements that could enhance HMI in a user-oriented way. Multidisciplinary approaches to the topic are encouraged (e.g., neuroimaging, physiological, cognitive, psychological, morphological). Original Research, Systematic Review and Meta-analysis, Literature Review, Mini Review, Methods, and Perspective articles can be submitted to the Research Topic.
Submissions covering, but not limited to, the following domains are encouraged:
• Methods in Design and Evaluation of HMI
• Affective Computing for HMI
• Artificial Intelligence
• Physiological Process Modeling
• Improvements in Physiological Signal Data Analysis for HMI
• Wearable and Portable Neuroimaging Techniques for HMI
• Technological Advancements in Wearable Devices for Physiological Signal Acquisition
• Neuroadaptive Systems
• Machine Learning Algorithms
• Human-in-the-Loop Approaches
• Image Processing for Emotion Recognition
• Emotion Recognition Control Systems
• Brain-Computer Interfaces
• Mental Workload Assessment