Healthcare wearables enable researchers to develop systems that recognize and interpret human emotional experience. Previous research has indicated that machine learning classifiers, such as Support Vector Machine (SVM), K-Nearest Neighbors (KNN), and Decision Tree (DT), can improve the accuracy of physiological signal analysis and emotion recognition. However, different emotions alter physiological signals in distinct ways, so relying on a single type of physiological signal is insufficient for accurately recognizing and understanding human emotional experience.
Research on multi-modal emotion recognition systems (ERS) has commonly gathered physiological signals with expensive devices that require participants to remain in fixed positions in a laboratory setting. This limitation restricts the generalization of ERS technology to everyday use. Therefore, considering the convenience of collecting data from everyday devices, we propose a multi-modal ERS based on peripheral physiological signals, evaluated on the DEAP database. The signals selected for analysis are photoplethysmography (PPG), galvanic skin response (GSR), and skin temperature (SKT). Signal features were extracted using the “Toolbox for Emotional Feature Extraction from Physiological Signals” (TEAP) library and classified with SVM, KNN, and DT.
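As a rough illustration of the classification stage described above, the sketch below fuses per-modality features by concatenation and compares the three classifiers with scikit-learn. It assumes features have already been extracted per modality (e.g., with TEAP) and exported as NumPy arrays; all file names, label choices, and hyperparameters are hypothetical and not taken from this study.

```python
# Minimal sketch of multi-modal feature fusion and classification.
# Assumes TEAP features were extracted elsewhere and saved per modality;
# file names, shapes, labels, and hyperparameters are hypothetical.
import numpy as np
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

# Hypothetical pre-extracted feature matrices: one row per DEAP trial.
ppg = np.load("teap_features_ppg.npy")  # shape: (n_trials, n_ppg_features)
gsr = np.load("teap_features_gsr.npy")  # shape: (n_trials, n_gsr_features)
skt = np.load("teap_features_skt.npy")  # shape: (n_trials, n_skt_features)
y = np.load("labels_valence.npy")       # e.g., binary high/low valence

# Multi-modal fusion by simple feature-level concatenation.
X_multi = np.hstack([ppg, gsr, skt])

classifiers = {
    "SVM": SVC(kernel="rbf"),
    "KNN": KNeighborsClassifier(n_neighbors=5),
    "DT": DecisionTreeClassifier(max_depth=5),
}

for name, clf in classifiers.items():
    # Standardize features so distance- and margin-based models behave well.
    pipe = make_pipeline(StandardScaler(), clf)
    scores = cross_val_score(pipe, X_multi, y, cv=10)
    print(f"{name}: {scores.mean():.3f} +/- {scores.std():.3f}")
```

The same loop can be run on each modality's features alone (e.g., `ppg` only) to reproduce the single-modal baseline that the fused representation is compared against.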
The results showed that the proposed multi-modal system achieved higher accuracy than a single-modal ERS and also outperformed existing multi-modal ERS applications on DEAP.
This study sheds light on the potential of combining peripheral physiological signals, conveniently captured by smart devices, in a multi-modal ERS for ubiquitous applications in daily life.