
ORIGINAL RESEARCH article

Front. Comput. Sci.

Sec. Human-Media Interaction

Volume 7 - 2025 | doi: 10.3389/fcomp.2025.1554320

This article is part of the Research Topic Emotional Intelligence AI in Mental Health.

The first look: A biometric analysis of emotion recognition using key facial features

Provisionally accepted
Ana M.S. Gonzalez-Acosta Ana M.S. Gonzalez-Acosta 1Marciano Vargas-Treviño Marciano Vargas-Treviño 1*Patricia Batres-Mendoza Patricia Batres-Mendoza 1Erick I. Guerra-Hernandez Erick I. Guerra-Hernandez 1Jaime Gutierrez-Gutierrez Jaime Gutierrez-Gutierrez 1Jose L. Cano-Perez Jose L. Cano-Perez 1Manuel A. Solis-Arrazola Manuel A. Solis-Arrazola 2Horacio Rostro-Gonzalez Horacio Rostro-Gonzalez 2,3*
  • 1 Benito Juárez Autonomous University of Oaxaca, Oaxaca, Oaxaca, Mexico
  • 2 Department of Electronic Engineering, University of Guanajuato, Guanajuato, Guanajuato, Mexico
  • 3 Ramon Llull University, Barcelona, Catalonia, Spain

The final, formatted version of the article will be published soon.

    Facial expressions are pivotal in human emotion recognition and social interaction. While prior studies have underscored the significance of the eyes and mouth, few have validated these findings with robust biometric evidence. This research explores the prioritization of facial features during emotion recognition, introducing a streamlined approach to landmark-based analysis with enhanced efficiency. An experiment was conducted with 30 participants, who evaluated images representing the emotions of anger, disgust, fear, neutrality, sadness, and happiness. Eye-tracking technology was employed to capture participants' gaze patterns, revealing the facial regions attended to during the recognition process. The findings confirmed a consistent prioritization of the eyes and mouth, with limited attention directed to other facial areas. Based on these insights, we propose a reduced facial landmark model that focuses on the most critical regions for emotion detection, reducing the traditional 68-point model to only 24 points without compromising accuracy, while enabling faster processing. This model has been validated using multiple classifiers, including a Multi-Layer Perceptron (MLP), a Random Decision Forest (RDF), and a Support Vector Machine (SVM), demonstrating its robustness across different machine learning approaches.
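    As a rough illustration of the pipeline described in the abstract, the Python sketch below selects a hypothetical 24-landmark subset covering the eye and mouth regions of the standard 68-point annotation scheme and evaluates the three classifier families named above (MLP, random forest, SVM) with scikit-learn. The specific landmark indices, the synthetic data, and the classifier settings are assumptions for demonstration only, not the authors' exact configuration.

    # Hypothetical sketch: reduce 68-point landmarks to an eye/mouth subset and
    # train the classifier families named in the abstract. The index ranges below
    # follow the common 68-point annotation (eyes: 36-47, outer mouth: 48-59) and
    # are an illustrative assumption, not the paper's reported 24-point selection.
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.svm import SVC

    # Assumed subset: 12 eye points + 12 outer-mouth points = 24 landmarks.
    REDUCED_IDX = list(range(36, 48)) + list(range(48, 60))

    def reduce_landmarks(landmarks_68):
        """Keep only the eye and mouth landmarks from a (68, 2) array and flatten."""
        return landmarks_68[REDUCED_IDX].reshape(-1)  # 24 points -> 48-dim feature vector

    # Toy data standing in for real landmark annotations over 6 emotion classes.
    rng = np.random.default_rng(0)
    X = np.stack([reduce_landmarks(rng.random((68, 2))) for _ in range(300)])
    y = rng.integers(0, 6, size=300)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

    for name, clf in [("MLP", MLPClassifier(max_iter=500)),
                      ("Random Forest", RandomForestClassifier()),
                      ("SVM", SVC())]:
        clf.fit(X_tr, y_tr)
        print(name, "accuracy:", clf.score(X_te, y_te))

    On real landmark data, the same reduced feature vector would simply replace the synthetic arrays above; the point of the sketch is only that a 24-point subset yields a compact 48-dimensional input usable by all three classifiers.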

    Keywords: emotion recognition, eye-tracking analysis, facial landmarks, biometric validation, machine learning and AI

    Received: 01 Jan 2025; Accepted: 28 Feb 2025.

    Copyright: © 2025 Gonzalez-Acosta, Vargas-Treviño, Batres-Mendoza, Guerra-Hernandez, Gutierrez-Gutierrez, Cano-Perez, Solis-Arrazola and Rostro-Gonzalez. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

    * Correspondence:
    Marciano Vargas-Treviño, Benito Juárez Autonomous University of Oaxaca, Oaxaca, Oaxaca, Mexico
    Horacio Rostro-Gonzalez, Department of Electronic Engineering, University of Guanajuato, Guanajuato, 36885, Guanajuato, Mexico

    Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.
