AUTHOR=Na In-seop, Aldrees Asma, Hakeem Abeer, Mohaisen Linda, Umer Muhammad, AlHammadi Dina Abdulaziz, Alsubai Shtwai, Innab Nisreen, Ashraf Imran TITLE=FacialNet: facial emotion recognition for mental health analysis using UNet segmentation with transfer learning model JOURNAL=Frontiers in Computational Neuroscience VOLUME=18 YEAR=2024 URL=https://www.frontiersin.org/journals/computational-neuroscience/articles/10.3389/fncom.2024.1485121 DOI=10.3389/fncom.2024.1485121 ISSN=1662-5188 ABSTRACT=

Facial emotion recognition (FER) can serve as a valuable tool for assessing emotional states, which are often linked to mental health. However, mental health encompasses a broad range of factors that go beyond facial expressions. While FER provides insights into certain aspects of emotional well-being, it is best used in conjunction with other assessments to form a more comprehensive understanding of an individual's mental health. This work proposes FacialNet, a framework for human FER that combines UNet image segmentation with transfer learning on the EfficientNetB4 model. The proposed model demonstrates promising results, achieving an accuracy of 90% for six emotion classes (happy, sad, fear, pain, anger, and disgust) and 96.39% for binary classification (happy and sad). The significance of FacialNet is demonstrated through extensive experiments against various machine learning and deep learning models, as well as state-of-the-art prior research on FER. Its performance is further validated using cross-validation, ensuring reliable results across different data splits. The findings highlight the effectiveness of leveraging UNet image segmentation and EfficientNetB4 transfer learning for accurate and efficient human facial emotion recognition, offering promising avenues for real-world applications in emotion-aware systems and affective computing platforms. Experimental findings reveal that the proposed approach performs substantially better than existing works, improving accuracy to 96.39% from the 94.26% reported previously.
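
The abstract describes a two-stage pipeline: UNet-based segmentation of the face region followed by an EfficientNetB4 classifier fine-tuned via transfer learning over six emotion classes. The sketch below is not the authors' released code; it only illustrates the classification stage in Keras. The input resolution, classification head (global pooling, dropout rate, dense softmax layer), optimizer, and loss are assumptions, and the preceding UNet segmentation step is indicated only by a comment.

```python
# Minimal sketch of a FacialNet-style classifier: EfficientNetB4 backbone with
# ImageNet weights plus a small classification head for six emotion classes.
# In the paper, inputs are face regions produced by a UNet segmentation stage
# (not implemented here). All hyperparameters below are illustrative assumptions.
import tensorflow as tf
from tensorflow.keras import layers, Model
from tensorflow.keras.applications import EfficientNetB4

NUM_CLASSES = 6              # happy, sad, fear, pain, anger, disgust
INPUT_SHAPE = (224, 224, 3)  # assumed input resolution of the segmented faces

def build_classifier(input_shape=INPUT_SHAPE, num_classes=NUM_CLASSES):
    """EfficientNetB4 transfer-learning classifier (backbone frozen initially)."""
    backbone = EfficientNetB4(include_top=False, weights="imagenet",
                              input_shape=input_shape)
    backbone.trainable = False  # freeze pretrained weights; unfreeze later to fine-tune

    inputs = layers.Input(shape=input_shape)
    x = backbone(inputs, training=False)
    x = layers.GlobalAveragePooling2D()(x)
    x = layers.Dropout(0.3)(x)  # assumed regularization
    outputs = layers.Dense(num_classes, activation="softmax")(x)

    model = Model(inputs, outputs, name="facialnet_sketch")
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model

if __name__ == "__main__":
    model = build_classifier()
    model.summary()
    # model.fit(train_ds, validation_data=val_ds, epochs=...) on segmented face images
```

For the binary happy/sad setting reported in the abstract, the same sketch applies with `num_classes=2` (or a single sigmoid unit with binary cross-entropy).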