
BRIEF RESEARCH REPORT article

Front. Hum. Neurosci., 09 August 2024
Sec. Motor Neuroscience
This article is part of the Research Topic Postural Control Priorities and Effective Motor Learning.

Stepping into emotions: investigating the effect of angry and fearful faces on forward stepping and quiet stance

  • 1Unitat de Recerca en Neurociència Cognitiva, Departament de Psiquiatria i Medicina Legal, Universitat Autònoma de Barcelona, Barcelona, Spain
  • 2Hospital del Mar Research Institute, Barcelona, Spain
  • 3Department of Human Movement Sciences, Faculty of Behavioural and Movement Sciences, Vrije Universiteit Amsterdam, Amsterdam Movement Sciences, Amsterdam, Netherlands

Introduction: Facial expressions conveying an emotion may affect social interactions, such as approach- or avoidance-related behaviors. A specific facial feature is the gaze direction. An emotional facial expression such as anger will elicit distinct behavioral tendencies, depending on whether the angry gaze is directed toward the onlooker, or in a different direction. We tested whether facial expressions of anger and fear, combined with direct or averted gaze, elicit approach- or avoidance tendencies, using a go/no-go variant of the whole-body stepping task.

Method: Healthy adults stood on a force plate, recording the center of pressure (COP). Participants were presented with angry or fearful faces, either with direct or averted gaze. Participants had to identify the emotion and, depending on instructions, either make a single step forward or remain in quiet stance. From the COP of the forward steps, we derived parameters such as reaction time and step size. From the quiet standing trials we derived parameters of postural sway, indicative of postural “freeze.” We used analysis of variance to analyze the outcomes.

Results and discussion: First, we found that steps were initiated faster with angry faces than with fearful faces, in line with existing literature. Second, we did not observe a significant effect of gaze direction. Forward steps with direct and averted gaze had similar COP characteristics. Finally, we had expected to find freeze (postural immobility) with fearful faces, but this was also not observed. We discuss various explanations for the finding, and implications for research into the motoric grounding of social interactions.

1 Introduction

The ability to produce and recognize emotional facial expressions has a long evolutionary history, characterized by the contraction of specific facial muscles and their adaptive function in both humans and animals (Waller and Micheletta, 2013). Over time, these basic facial responses have evolved to be used in more complex nonverbal communication processes, such as interpreting the intentions and goals of others (Erickson and Schulkin, 2003). Emotional processing likely serves a beneficial evolutionary purpose by allowing an organism to quickly adapt to a changing environment (Frijda, 2010; Lang and Bradley, 2010; Roelofs, 2017). For example, an angry facial expression may involve displaying a set of sharp teeth. This may signal a potential threat and induce an avoidance reaction in the observer, motivating them to move away from danger. This triggering of goal-directed behavior in response to emotional faces suggests that perception and action have co-evolved and are thus tightly linked.

This connection can be understood in terms of behavioral affordances related to visual perception, which refer to opportunities for interaction or response to a visual stimulus (Gibson, 2014). Adams et al. (2017) propose that the response to facial perception can be understood in terms of behavioral affordances. These affordances are presumed to be innate but can be influenced by individuals' adjustments to their environment, which are related to their sensitivity to stimulus features and can vary significantly within an individual (i.e., personality) and across situations (i.e., context). In real life, emotions can occur simultaneously; for instance, when facing a threatening situation one may experience fear, anger, and disgust at the same time (Dailey et al., 2002). Furthermore, the importance of emotions in shaping human behavior and decision-making processes is well-established in the literature (de Gelder et al., 2006; Botta et al., 2022). Emotions play a crucial role in decision-making by triggering adaptive behavioral responses that have evolved to help individuals navigate complex and ever-changing social environments. These responses guide individuals to pursue appetitive goals while steering clear of potential threats (Frijda, 1986; Frijda et al., 1989; Lang, 1995; Lang et al., 2013). Numerous researchers have delved into the influence of emotional stimuli on approach- or avoidance-related motor responses, particularly examining their impact on directional arm movements, which has revealed varying effects across different emotional states. For instance, Wilkowski and Meier (2010) observed that angry expressions tend to facilitate approach behaviors.
More specifically they showed that participants consistently exhibited faster initiation of approach movements toward angry facial expressions, compared to neutral or fearful expressions, particularly when physical approach was perceived as an effective strategy for addressing the social challenge posed by angry facial expressions.

Within this line of research, a recent series of studies has shed light on a key insight: emotional facial expressions elicit consistent and replicable behavioral responses exclusively when they align with participants' specific objectives, indicating a context-dependent nature of emotional effects (Mirabella, 2018; Mancini et al., 2020, 2022; Calbi et al., 2022). Illustrating this point, Mirabella (2018), using two versions of a Go/No-go task, showed that emotional facial expressions (happy and fearful) affect motor readiness and accuracy of reaching arm movements only when the stimuli are task-relevant. More specifically, when participants focused on emotional content, reaction time and omission errors were higher for fearful faces compared to happy stimuli. In contrast, there was no difference when participants were instructed to react to a neutral stimulus component, in this case gender. Such results have been replicated and extended by including angry faces (Mancini et al., 2020). Mancini et al. (2022) and Calbi et al. (2022) showed that inhibitory control, a key executive function, is impacted by facial or body posture emotions, respectively, but only when they are task-relevant. The authors showed that the percentage of commission errors (i.e., instances in which participants moved although they should have withheld the response) was higher for happy faces than for fearful faces.

Additionally, as proposed by Said et al. (2011), emotional facial expressions are not processed independently of contextual cues provided by the face. Other facial cues such as the gaze direction of another person may provide additional assistance in identifying emotional facial expressions (Adams and Kleck, 2003). Simply put, it matters a great deal if an angry expression is directed toward me, or toward my neighbor. According to the shared signal hypothesis (SSH) proposed by Adams and Kleck (2005), emotion processing is enhanced when the gaze direction corresponds to the motivational orientation of the expressed emotion. For example, expressions such as happiness or anger would be perceived more quickly and intensely by the observer when associated with direct gaze (sharing an approach motivation; Adams and Kleck, 2003, 2005; Willis et al., 2011; Pönkänen and Hietanen, 2012). Indeed, an angry face combined with a direct gaze might be more threatening to the observer than one with an averted gaze, as the direct gaze would designate the observer as the target of the threat. Conversely, an averted gaze coupled with a fearful face may signal a danger in the immediate environment, potentially hazardous to the observer (Adams and Franklin, 2009; Adams et al., 2017). The ability to detect the source of a threat requires accurate interpretation of the emotional facial expression itself and the direction of gaze, as these two signals interrelate. In social interactions, both emotion and gaze direction allow observers to simultaneously assess others' intentions and identify external cues that may be favorable or unfavorable, or even represent a threat.
Consistent with previous findings, several authors observed that the perception of gaze direction was influenced by the emotional faces; participants were more likely to perceive angry faces as looking directly at them, while fearful faces were perceived as looking more toward the environment (Ewbank et al., 2009; Jun et al., 2013; Lebert et al., 2021).

In summary, based on the SSH, happy and angry faces would promote approach behaviors, while fearful, sad and disgusted faces would promote avoidance behaviors (Adams and Kleck, 2003, 2005; Sander et al., 2007; Willis et al., 2011; Pönkänen and Hietanen, 2012). A more recent, and potentially more ecological, way of assessing the relationship between emotional stimuli and motivated behavior (i.e., approach or avoid) involves the measurement of whole-body movements. This method considers the fact that moving the entire body toward (approach) or away from (avoid) an emotional stimulus alters the physical distance between the individual and the stimulus. It is worth noting, however, that the effects on behaviors are not always clear-cut, and there are discrepancies in the reported results. Several authors have shown that pleasant stimuli lead to faster initiation of a forward step compared to unpleasant stimuli (Gélat et al., 2011; Stins et al., 2011a; Yiou et al., 2014). Also, no effect of viewing unpleasant stimuli on the organization of backward steps has been reported in the literature (Stins et al., 2011a; Yiou et al., 2014; Bouman and Stins, 2018). This lack of effect may be explained by the fact that in everyday life, escaping danger does not involve taking a backward step, which arguably leads to slower and less stable movements compared to the more “natural” forward step. Finally, Mirabella et al. (2022) demonstrated that emotional facial expressions significantly influence gait parameters only when participants are explicitly instructed to move according to the valence of the picture, facilitating an approach toward pleasant faces compared to unpleasant ones. To our knowledge, only Mirabella et al. (2022) performed a whole-body go/no-go task involving emotional items. In that study, only the go trials were analyzed, using an optoelectronic movement registration system.
However, it can be argued that the no-go trials are also informative, because effects of emotion, such as postural “freeze” and spontaneous backward body displacements when faced with a threat, can also be demonstrated during quiet standing (Hillman et al., 2004; Azevedo et al., 2005; Facchinetti et al., 2006; Roelofs et al., 2010; Stins and Beek, 2011). In this regard, an insightful indicator for assessing the influence of an impending threat could be freezing behavior—a state in which the body undergoes physiological and somatic preparations for action when the threat is distant and fear levels are relatively low (Blanchard et al., 1986; Roelofs et al., 2010; Borgomaneri et al., 2015; Botta et al., 2022). During freezing, the body is temporarily immobile, which would be manifest as very little postural sway. Conversely, arousing items could lead to an increase in sway. In this regard, the study by Bouman and Stins (2018) (described above) observed an effect on the postural sway preceding the backward step, specifically an increase in postural sway for high-arousal stimuli, regardless of valence.

The objective of this study is to delve into the intricate interplay between social emotional cues and postural behavior. More specifically, we aim to explore how specific emotions (anger and fear) combined with different gaze directions (direct and deviated) influence posture. While both emotions share an unpleasant valence, they have distinct motivational properties: anger promotes approach behaviors (Carver and Harmon-Jones, 2009) while fear triggers freezing and avoidance behaviors. To this end, we used a force plate that allows recording of the body center of pressure, both during go trials (forward stepping) and no-go trials (quiet stance) in response to emotional faces of anger or fear, combined with either a direct or deviated gaze. Regarding the planning and organization of a forward step, we predict that reaction times will be shorter in response to angry faces relative to fearful faces. Moreover, this effect will be increased for conditions that are potentially threatening to the observer (angry faces with direct gaze and fearful faces with deviated gaze) compared to less threatening conditions (angry faces with deviated gaze and fearful faces with direct gaze). Concerning static posture, we expect to observe stronger freezing behavior (i.e., a greater reduction in sway) in response to fearful faces relative to angry faces, and even more so for conditions that are potentially threatening to the observer due to the gaze direction.

2 Method

2.1 Participants

Based on comparable studies investigating the effect of emotion on posture (Stins et al., 2011b; Eerland et al., 2012; Fawver et al., 2015), a sample of 22 adults (9 males, 13 females; mean age 26 y, SD 4.5 y) was deemed sufficient; no a priori power analysis was performed. All 22 participants completed the study. Participants had normal or corrected-to-normal vision and had no neurological disorders or injuries that prevented them from participating. Twenty-one participants had a self-declared right-foot preference to kick a ball; one participant had a self-declared left-foot preference. All participants signed an informed consent form. The study was approved by the local ethical committee of the Vrije Universiteit, Amsterdam.

2.2 Stimuli

Stimuli consisted of a set of Caucasian faces generated using FaceGen Modeler software. Hair and other markings such as jewelry or tattoos were absent so that only the central face area was visible. There were two identities for both genders (M/F), displaying either an angry or a fearful emotion. Importantly, the eye gaze was either directed forward (i.e., facing the viewer), deviated to the left, or deviated to the right. In the deviated gaze condition, the iris was shifted by nine pixels to the left or to the right, so that there was no ambiguity about the deviated direction of gaze. This resulted in a set of 24 unique facial stimuli: 2 (emotions: angry vs. fearful) * 2 (genders: female vs. male) * 2 (identities: first vs. second identity) * 3 (gaze directions: direct gaze vs. shifted to the right vs. shifted to the left). Each stimulus was repeated twice (instructions: go vs. no-go trials). Example face stimuli are shown in Figure 1. Similar faces (but expressing a broader range of emotions) were used in previous studies where a pre-test was initially conducted with an independent sample of fifty participants (Lebert et al., 2021, 2024). We ensured that participants had correctly identified the displayed emotions according to their instructed motor responses. In the study employing these faces for the first time (Lebert et al., 2021), the visual angles of the face measured 10.57° (height) x 8.01° (width), with a 9-pixel deviation (i.e., deviated gaze) corresponding to a visual angle of 0.139°; whereas in our current study, the visual angles of the face measured 16.37° x 12.47°, with a 9-pixel deviation corresponding to a visual angle of 0.217°.
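The visual angles above follow from the standard relation θ = 2·arctan(s / 2d) between on-screen size s and viewing distance d. As an illustrative sketch (the ~1.5 m viewing distance is taken from the Apparatus section; the derived on-screen face height is our back-calculation, not a value reported by the authors):

```python
import math

def visual_angle_deg(size_m: float, distance_m: float) -> float:
    """Visual angle (degrees) subtended by an object of a given size
    viewed at a given distance, measured perpendicular to the screen."""
    return math.degrees(2 * math.atan(size_m / (2 * distance_m)))

# Inverting the formula: a face subtending 16.37 degrees vertically at
# ~1.5 m corresponds to an on-screen height of roughly 0.43 m.
face_height_m = 2 * 1.5 * math.tan(math.radians(16.37 / 2))
```

Under these assumptions the round trip reproduces the reported 16.37°, which is a useful consistency check when replicating stimulus geometry on a different monitor.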


Figure 1. Example stimuli of three fearful faces (left) and three angry faces (right). The center face has a direct gaze, facing the observer. The center faces are flanked by faces with averted gaze, deviated 9 pixels leftward and 9 pixels rightward.

2.3 Apparatus

Postural sway data were collected using a custom-made 1 x 1 m force plate, measuring forces in the x-, y-, and z-direction. From the forces we calculated the COP time series. Stimuli were shown at eye-level on a 55″ monitor, placed ~1.5 m from the starting position of the participant. In order to synchronize the onset of the visual stimuli with the continuous force plate measurement, we used a small light sensor which was attached to the bottom left corner of the screen. Together with each visual stimulus, a small white square (not visible to the participant) was presented. At stimulus offset the white square disappeared again. This resulted in a square wave pattern timed exactly with the onset and offset of each stimulus. Data from the light sensor were sampled simultaneously with the force plate channels, enabling offline identification of stimulus events.
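The offline identification of stimulus events from such a square-wave channel can be sketched as a simple threshold-crossing detection (a minimal illustration; the threshold value and the assumption that the sensor reads high while the square is shown are ours, not the authors' implementation):

```python
import numpy as np

def stimulus_events(light: np.ndarray, thresh: float):
    """Return sample indices of stimulus onsets and offsets from a
    square-wave light-sensor channel (high while the white square is shown)."""
    high = light > thresh                      # boolean: square visible?
    edges = np.diff(high.astype(int))          # +1 at onsets, -1 at offsets
    onsets = np.flatnonzero(edges == 1) + 1    # low -> high transitions
    offsets = np.flatnonzero(edges == -1) + 1  # high -> low transitions
    return onsets, offsets
```

Because the sensor channel is sampled on the same clock as the force plate, these indices segment the COP time series without any cross-device alignment step.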

Initially, we adopted a very high sampling frequency of 1,000 Hz for the sole purpose of accurately identifying the onset of the white square (due to the monitor's inevitable flickering). However, during the course of the experiment the measurement PC sometimes exhibited memory problems, resulting in a premature termination of data collection (see below). Consequently, we later adjusted the sampling frequency to 500 Hz. In the actual data analysis, we downsampled all data to 100 Hz, as postural frequencies exceeding this value are unlikely to have a biomechanical origin and instead reflect noise.
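A reduction from 1,000 Hz (or 500 Hz) to 100 Hz can be sketched as block-average decimation, followed by the 5-point moving average mentioned in the Data analysis section (an illustrative sketch; the authors' exact resampling filter is not specified in the text):

```python
import numpy as np

def downsample_block(x: np.ndarray, factor: int) -> np.ndarray:
    """Reduce the sampling rate by averaging non-overlapping blocks of
    `factor` samples (e.g. factor=10 for 1,000 Hz -> 100 Hz)."""
    n = len(x) - len(x) % factor               # drop incomplete trailing block
    return x[:n].reshape(-1, factor).mean(axis=1)

def moving_average(x: np.ndarray, width: int = 5) -> np.ndarray:
    """Centered moving average; output has the same length as the input
    (edge samples are averaged against implicit zero padding)."""
    kernel = np.ones(width) / width
    return np.convolve(x, kernel, mode="same")
```

Block averaging also acts as a crude anti-aliasing filter, which matters here because frequencies above ~50 Hz in the raw force signal would otherwise fold into the 100 Hz series.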

2.4 Procedure

We implemented a whole-body version of the go/no-go task. Participants were all positioned slightly at the rear of the force plate and were instructed to execute a single forward step as fast and accurately as possible upon the appearance of a “go” signal, while remaining motionless on the force plate when the “no-go” signal was presented. In other words, performance of the go/no-go task required accurate identification of the emotional expression (regardless of gaze direction). Participants were instructed to initiate all steps with their right leg. In one block of trials the go signal was the angry face while the no-go signal corresponded to the fearful face. Subsequently, after the block of trials was completed, the instruction was reversed, and the same 24 facial stimuli were presented again. Stimulus presentation was randomized and instruction order was counterbalanced between participants. We thus have a complete within-subjects design where all stimuli and all stimulus-response assignments are present. Preceding each trial block, participants underwent five practice trials.

The timing of events was as follows (see Figure 2): (1) a screen containing the instructions, shown prior to each trial. The instructions were displayed either in Dutch or in English (depending on the preferred language of the participant) and remained on the screen for 5 s. (2) A 500 ms small white fixation cross in the center of the screen. (3) A black screen, lasting a random 2–4 s. (4) Presentation of the face stimulus for 5 s. In the case of a go stimulus, participants had to take a forward step and remain still in the new (anterior) position on the force plate. In the case of a no-go stimulus participants had to maintain a quiet stance. (5) The monitor displayed the message “step back” for 3 s. (6) Finally, a black screen for another 3 s. The entire experiment lasted about 40 min. Prior to the experiment, participants received written and verbal instructions. The required movement pattern (i.e., the step) was demonstrated by the experimenter.


Figure 2. Timeline of experimental events, including instruction display (5 s), fixation cross (500 ms), black screen (random 2–4 s), face stimulus presentation (5 s), participant responses to go and no-go stimuli, and instruction to “step back” (displayed for 3 s) before a final black screen (3 s).

2.5 Data analysis

Prior to data analysis we had to remove a total of five error trials: four in which participants failed to step on a go trial, and one in which a participant stepped on a no-go trial. This demonstrates that the emotions were generally easily identifiable.

COP time series were downsampled to 100 Hz and smoothed with a 5-point moving average. We used custom-made Matlab scripts to identify the peaks in the light sensor signal, to identify the corresponding COP segments, and to extract relevant postural parameters. For the go trials we analyzed a number of classical gait initiation (GI) parameters: (1) reaction time, i.e., the time interval between stimulus onset and movement initiation. Following previous studies (e.g., Bouman et al., 2021), we identified movement onset as the moment at which the force in the anterior direction exceeded 5 N. (2) Step size: the difference between the initial standing position and the final position. (3) Anticipatory postural adjustment (APA): the distance along the AP axis between the initial COP position and the most posterior COP position, which for biomechanical reasons is related to the build-up of speed in the forward direction (see Figure 3).
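The extraction of these three parameters from a single go trial can be sketched as follows. This is a minimal illustration, not the authors' Matlab code: the 5 N threshold comes from the text, but the array layout, sampling rate, and sign convention (positive = anterior) are our assumptions.

```python
import numpy as np

FS = 100          # Hz, after downsampling
THRESH_N = 5.0    # anterior force threshold for movement onset (from the text)

def gait_initiation_params(fy: np.ndarray, cop_ap: np.ndarray, onset_idx: int):
    """Reaction time (s), step size, and APA amplitude for one go trial.
    fy: anterior-directed force; cop_ap: anterior-posterior COP position
    (positive = forward); onset_idx: sample index of stimulus onset."""
    after = fy[onset_idx:]
    move = np.flatnonzero(after > THRESH_N)
    if move.size == 0:
        return None                             # no step detected on this trial
    rt = move[0] / FS                           # (1) reaction time
    start_pos = cop_ap[onset_idx]
    step_size = cop_ap[-1] - start_pos          # (2) final minus initial position
    apa = start_pos - cop_ap[onset_idx:].min()  # (3) backward COP excursion
    return rt, step_size, apa
```

The APA here is the posterior COP shift that precedes the forward acceleration, consistent with the backward excursion visible in the example COP profile of Figure 3.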


Figure 3. Example COP profile of a forward step (angry face; averted gaze). Each dot represents one sample (time interval 10 ms). The red circle denotes the reaction time for this particular step, i.e., the first discernible weight shift from quiet standing. The initial leftward displacement of the COP represents the unloading of the (right) swing leg and the loading of the left stance leg. This is followed by the forward acceleration, powered by the left foot. In this example the RT was 430 ms, and the step size 55 cm.

For the no-go trials we analyzed several parameters indicative of quiet standing performance: the standard deviation of the COP time series in the AP and ML direction, and the sway path length. We performed analyses of variance on the postural outcome measures, separately for the go and the no-go trials. The factors were emotion (angry/fearful) and gaze (direct or averted, i.e., we collapsed over left and right gaze deviation).
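The three quiet-stance outcomes can be sketched as follows (an illustration using common conventions: sample standard deviation, and path length as the summed point-to-point distances of the 2-D COP trajectory; the text does not spell out these exact definitions):

```python
import numpy as np

def sway_parameters(cop_ap: np.ndarray, cop_ml: np.ndarray):
    """Quiet-stance outcomes for one no-go trial: SD of the AP and ML COP
    time series, and the total sway path length (sum of point-to-point
    Euclidean distances of the 2-D COP trajectory)."""
    sd_ap = cop_ap.std(ddof=1)                          # sample SD, AP direction
    sd_ml = cop_ml.std(ddof=1)                          # sample SD, ML direction
    steps = np.hypot(np.diff(cop_ap), np.diff(cop_ml))  # per-sample displacements
    return sd_ap, sd_ml, steps.sum()
```

Note that path length, unlike the SDs, grows with recording duration and sampling rate, so comparisons are only meaningful across trials of equal length, as is the case here (5 s stimulus windows at 100 Hz).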

3 Results

Due to technical errors the data of one (female) participant had to be discarded. In addition, due to occasional premature termination of the data collection, 15 trials were lost.

3.1 Gait initiation

Prior to analysis we removed steps that were initiated too soon (< 200 ms) or too late (> 1,500 ms). This resulted in removal of 30 trials, but it never resulted in empty cells. Analysis of the RTs only revealed a main effect of emotion; F(1, 20) = 12.21, p = 0.002, ηp² = 0.38. This effect was due to the considerably faster step initiation with angry faces (607 ms) compared to fearful faces (691 ms; see Figure 4). The effect of gaze was not significant; F(1, 20) = 0.73, p = 0.404, ηp² = 0.03, nor was the interaction of emotion and gaze; F(1, 20) = 0.67, p = 0.79, ηp² = 0.00. Analysis of APA and of step size (with the same trials removed) yielded no significant main or interaction effects. The average step size was about 40 cm, ranging between 20 and 60 cm.
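A main effect in a 2 (emotion) × 2 (gaze) within-subjects ANOVA of this kind, with its F(1, n−1) statistic and partial eta squared, can be sketched with numpy (an illustrative implementation, not the authors' analysis code; it assumes per-condition means have already been computed per subject, giving a subjects × 2 × 2 array):

```python
import numpy as np

def rm_main_effect(data: np.ndarray, factor_axis: int):
    """Repeated-measures F and partial eta squared for one two-level
    within-subject factor of a (subjects x 2 x 2) array of condition means.
    factor_axis=1 tests the first factor, factor_axis=2 the second."""
    n = data.shape[0]
    other = 3 - factor_axis                  # axis of the factor collapsed over
    g = data.mean()                          # grand mean
    lvl = data.mean(axis=(0, other))         # marginal means of tested factor
    subj = data.mean(axis=(1, 2))            # per-subject means
    cell = data.mean(axis=other)             # subjects x levels means
    ss_eff = n * 2 * np.sum((lvl - g) ** 2)
    resid = cell - lvl[None, :] - subj[:, None] + g
    ss_err = 2 * np.sum(resid ** 2)          # factor-by-subject interaction
    if ss_err == 0:
        return float("nan"), float("nan")
    f = (ss_eff / 1.0) / (ss_err / (n - 1))  # df = (1, n - 1)
    eta_p2 = ss_eff / (ss_eff + ss_err)
    return f, eta_p2
```

With n = 21 analyzed participants this yields the df = (1, 20) reported above; for a two-level factor the F value equals the square of a paired t-test on the per-subject marginal means.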


Figure 4. Box plots of reaction times (in ms) in response to emotional faces (angry vs. fearful) associated with either a direct or an averted gaze. Each dot represents the mean of a single participant; the “x” represents the condition mean, and the horizontal line is the median.

3.2 Quiet standing

Analysis of quiet standing yielded no significant main or interaction effects. The average standard deviation of sway in the AP direction was 3.2 mm; in the ML direction 3.5 mm. The average sway path length was 89.5 mm.

4 Discussion

In this study, we aimed to investigate the effect of angry and fearful faces on the control of forward stepping and quiet stance. In our predictions, we posited that anger and fear have distinct motivational properties of approach and avoidance, respectively, and that the combination of these emotions with gaze direction would further enhance these motivational tendencies, thus differentially affecting motor correlates (Adams and Kleck, 2005). Using a go/no-go paradigm, we measured both gait initiation and quiet standing parameters in response to emotional faces with either a direct or deviated gaze.

First, we predicted that gait initiation would be faster in response to angry faces compared to fearful faces, which is indeed what we found. This finding aligns with previous research showing that angry expressions tend to facilitate approach behaviors (Wilkowski and Meier, 2010; Mancini et al., 2020). These shorter reaction times when initiating forward steps in response to angry faces could reflect a faster motor readiness to approach potentially challenging social situations. It is essential to note that the distinction between fear and anger in terms of their impact on motor responses is not clear-cut in the existing literature. While fear and anger are often considered interchangeable negative emotions, our study highlights that they convey distinct social signals and, consequently, modulate actions differently. Specifically, angry faces, in contrast to fearful ones, are associated with a direct threat in the form of aggression toward the observer, thus eliciting an immediate need for action (Adams and Franklin, 2009; Adams et al., 2017).

We initially hypothesized that the combination of gaze direction and emotional facial expression would have a differential effect on forward stepping. Contrary to our expectations, we did not observe a significant impact of gaze direction on GI. One explanation for this unexpected finding could be that emotional facial expressions have a more dominant influence on approach behavior than gaze direction. Especially in threatening situations, where efficient detection of danger is crucial, emotional expressions may determine motor responses more than gaze direction does. Alternatively, gaze direction (which, after all, was task-irrelevant in this study) may simply not affect performance, since the task was to identify the emotion regardless of gaze and execute the assigned motor response. Note that in previous studies, gaze direction was a task-relevant parameter and did seem to affect emotional responses (Ewbank et al., 2009; Jun et al., 2013; Lebert et al., 2021) and postural control (Lebert et al., 2021).

Lastly, we focused on postural sway during no-go trials. We hypothesized that participants would exhibit stronger freezing behavior, characterized by a greater reduction in sway, in response to fearful faces compared to angry faces, especially in potentially threatening conditions (Azevedo et al., 2005; Facchinetti et al., 2006; Roelofs et al., 2010; Stins and Beek, 2011). However, this prediction was not confirmed, as we did not observe significant differences in postural variability between emotional conditions during quiet standing. Based on these results, we conclude that the freeze response, typically associated with threat detection in the initial moments, may not have been elicited to a significant extent. A potential reason could be that the experiment consisted of a mixture of quiet standing trials and gait initiation trials. Traditional studies on the freeze response only involve quiet standing recordings. This novel go/no-go paradigm may not be suitable to elicit a freezing response. The go/no-go paradigm is often used to measure sustained attention over a long period of time, whereas our paradigm aims to unravel spontaneous postural adjustments in response to emotional faces with either a direct or averted gaze.

In future studies a wider range of emotions should be employed. For present purposes we focused on two emotions (fear and anger) with differing motivational properties, but other emotions such as happiness and disgust, combined with various gaze angles, can be used to obtain more insight into how, and to what extent, they prime the motor system. Another limitation is that we did not collect data on personality type. The results of Lebert et al. (2020) suggest that postural effects are modulated by personality.

To our knowledge, this is the first study implementing a go/no-go paradigm to investigate the influence of emotional expressions combined with direct or deviated gaze on both forward stepping and quiet standing. Our findings suggest that facial cues do not carry equal weight in our social interactions and, more broadly, in perception-action links. Exploring how emotions and gaze direction modulate these responses in contexts more closely resembling real-life social situations can offer valuable insights into comprehending human social interactions.

Data availability statement

The raw data supporting the conclusions of this article will be made available by the authors, without undue reservation.

Ethics statement

The studies involving humans were approved by Vaste Commissie Wetenschap en Ethiek (The Scientific and Ethical Review Board; VCWE). The studies were conducted in accordance with the local legislation and institutional requirements. The participants provided their written informed consent to participate in this study.

Author contributions

AL: Conceptualization, Formal analysis, Investigation, Methodology, Writing – original draft, Writing – review & editing. OV: Writing – review & editing. JS: Conceptualization, Data curation, Formal analysis, Methodology, Software, Supervision, Writing – original draft, Writing – review & editing.

Funding

The author(s) declare financial support was received for the research, authorship, and/or publication of this article. The research was made possible through the support of the Eole grant, provided by the Franco-Dutch network.

Acknowledgments

We would like to thank Dorine Vergilino-Perez and Laurence Chaby for their support for this project during the Ph.D. of AL, and Cynthia Buijs for her valuable assistance in planning and organizing the experimental sessions.

Conflict of interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

The author(s) declared that they were an editorial board member of Frontiers, at the time of submission. This had no impact on the peer review process and the final decision.

Publisher's note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

References

Adams, R. B., Albohn, D. N., and Kveraga, K. (2017). Social vision: applying a social-functional approach to face and expression perception. Curr. Direct. Psychol. Sci. 26, 243–248. doi: 10.1177/0963721417706392

Adams, R. B., and Franklin, R. G. (2009). Influence of emotional expression on the processing of gaze direction. Motivat. Emot. 33, 106–112. doi: 10.1007/s11031-009-9121-9

Adams, R. B., and Kleck, R. E. (2003). Perceived gaze direction and the processing of facial displays of emotion. Psychol. Sci. 14, 644–647. doi: 10.1046/j.0956-7976.2003.psci1479.x

Adams, R. B., and Kleck, R. E. (2005). Effects of direct and averted gaze on the perception of facially communicated emotion. Emotion 5, 3–11. doi: 10.1037/1528-3542.5.1.3

Azevedo, T. M., Volchan, E., Imbiriba, L. A., Rodrigues, E. C., Oliveira, J. M., Oliveira, L. F., et al. (2005). A freezing-like posture to pictures of mutilation. Psychophysiology 42, 255–260. doi: 10.1111/j.1469-8986.2005.00287.x

Blanchard, R. J., Flannelly, K. J., and Blanchard, D. C. (1986). Defensive behaviors of laboratory and wild rattus norvegicus.. J. Comparat. Psychol. 100:101.

Google Scholar

Borgomaneri, S., Vitale, F., Gazzola, V., and Avenanti, A. (2015). Seeing fearful body language rapidly freezes the observer's motor cortex. Cortex 65, 232–245. doi: 10.1016/j.cortex.2015.01.014

PubMed Abstract | Crossref Full Text | Google Scholar

Botta, A., Lagravinese, G., Bove, M., Pelosin, E., Bonassi, G., Avenanti, A., et al. (2022). Sensorimotor inhibition during emotional processing. Sci. Rep. 12:6998. doi: 10.1038/s41598-022-10981-8

PubMed Abstract | Crossref Full Text | Google Scholar

Bouman, D., and Stins, J. F. (2018). Back off! the effect of emotion on backward step initiation. Hum. Mov. Sci. 57, 280–290. doi: 10.1016/j.humov.2017.09.006

PubMed Abstract | Crossref Full Text | Google Scholar

Bouman, D., Stins, J. F., and Beek, P. J. (2021). Both distance change and movement goal affect whole-body approach-avoidance behavior. Psychol. Neurosci. 14:156. doi: 10.1037/pne0000255

PubMed Abstract | Crossref Full Text | Google Scholar

Calbi, M., Montalti, M., Pederzani, C., Arcuri, E., Umiltá, M. A., Gallese, V., et al. (2022). Emotional body postures affect inhibitory control only when task-relevant. Front. Psychol. 13:1035328. doi: 10.3389/fpsyg.2022.1035328

PubMed Abstract | Crossref Full Text | Google Scholar

Carver, C. S., and Harmon-Jones, E. (2009). Anger is an approach-related affect: evidence and implications. Psychol. Bullet. 135, 183–204. doi: 10.1037/a0013965

PubMed Abstract | Crossref Full Text | Google Scholar

Dailey, M. N., Cottrell, G. W., Padgett, C., and Adolphs, R. (2002). Empath: a neural network that categorizes facial expressions. J. Cogn. Neurosci. 14, 1158–1173. doi: 10.1162/089892902760807177

PubMed Abstract | Crossref Full Text | Google Scholar

de Gelder, B., Meeren, H. K., Righart, R., Van den Stock, J., Van de Riet, W. A., and Tamietto, M. (2006). Beyond the face: exploring rapid influences of context on face processing. Progr. Brain Res. 155, 37–48. doi: 10.1016/S0079-6123(06)55003-4

PubMed Abstract | Crossref Full Text | Google Scholar

Eerland, A., Guadalupe, T. M., Franken, I. H., and Zwaan, R. A. (2012). Posture as index for approach-avoidance behavior. PLoS ONE 7:e31291. doi: 10.1371/journal.pone.0031291

PubMed Abstract | Crossref Full Text | Google Scholar

Erickson, K., and Schulkin, J. (2003). Facial expressions of emotion: a cognitive neuroscience perspective. Brain Cogn. 52, 52–60. doi: 10.1016/S0278-2626(03)00008-3

PubMed Abstract | Crossref Full Text | Google Scholar

Ewbank, M. P., Jennings, C., and Calder, A. J. (2009). Why are you angry with me? facial expressions of threat influence perception of gaze direction. J. Vis. 9, 16–16. doi: 10.1167/9.12.16

PubMed Abstract | Crossref Full Text | Google Scholar

Facchinetti, L. D., Imbiriba, L. A., Azevedo, T. M., Vargas, C. D., and Volchan, E. (2006). Postural modulation induced by pictures depicting prosocial or dangerous contexts. Neurosci. Lett. 410, 52–56. doi: 10.1016/j.neulet.2006.09.063

PubMed Abstract | Crossref Full Text | Google Scholar

Fawver, B., Beatty, G. F., Naugle, K. M., Hass, C. J., and Janelle, C. M. (2015). Emotional state impacts center of pressure displacement before forward gait initiation. J. Appl. Biomech. 31, 35–40. doi: 10.1123/jab.2013-0306

PubMed Abstract | Crossref Full Text | Google Scholar

Frijda, N. H. (1986). The Emotions. Cambridge: Cambridge University Press.

Google Scholar

Frijda, N. H. (2010). Impulsive action and motivation. Biol. Psychol. 84, 570–579. doi: 10.1016/j.biopsycho.2010.01.005

PubMed Abstract | Crossref Full Text | Google Scholar

Frijda, N. H., Kuipers, P., and Ter Schure, E. (1989). Relations among emotion, appraisal, and emotional action readiness. J. Personal. Soc. Psychol. 57:212.

Google Scholar

Gélat, T., Coudrat, L., and Le Pellec, A. (2011). Gait initiation is affected during emotional conflict. Neurosci. Lett. 497, 64–67. doi: 10.1016/j.neulet.2011.04.030

PubMed Abstract | Crossref Full Text | Google Scholar

Gibson, J. J. (2014). The Ecological Approach to Visual Perception: Classic Edition. London: Psychology Press.

Google Scholar

Hillman, C. H., Rosengren, K. S., and Smith, D. P. (2004). Emotion and motivated behavior: postural adjustments to affective picture viewing. Biol. Psychol. 66, 51–62. doi: 10.1016/j.biopsycho.2003.07.005

PubMed Abstract | Crossref Full Text | Google Scholar

Jun, Y. Y., Mareschal, I., Clifford, C. W., and Dadds, M. R. (2013). Cone of direct gaze as a marker of social anxiety in males. Psychiat. Res. 210, 193–198. doi: 10.1016/j.psychres.2013.05.020

PubMed Abstract | Crossref Full Text | Google Scholar

Lang, P. J. (1995). The emotion probe: studies of motivation and attention. Am. Psychol. 50:372.

Google Scholar

Lang, P. J., and Bradley, M. M. (2010). Emotion and the motivational brain. Biol. Psychol. 84, 437–450. doi: 10.1016/j.biopsycho.2009.10.007

PubMed Abstract | Crossref Full Text | Google Scholar

Lang, P. J., Simons, R. F., Balaban, M., and Simons, R. (2013). Attention and Orienting: Sensory and Motivational Processes. London: Psychology Press.

Google Scholar

Lebert, A., Chaby, L., Guillin, A., Chekroun, S., and Vergilino-Perez, D. (2021). Are you “gazing” at me? how others' gaze direction and facial expression influence gaze perception and postural control. Front. Psychol. 12:730953. doi: 10.3389/fpsyg.2021.730953

PubMed Abstract | Crossref Full Text | Google Scholar

Lebert, A., Vergilino-Perez, D., and Chaby, L. (2024). Keeping distance or getting closer: how others' emotions shape approach-avoidance postural behaviors and preferred interpersonal distance. PLoS ONE 19:e0298069. doi: 10.1371/journal.pone.0298069

PubMed Abstract | Crossref Full Text | Google Scholar

Lebert, A., Vergillino-Perez, D., and Chaby, L. (2020). Pour une meilleure compréhension des liens réciproques entre émotion et posture. Revue de neuropsychologie 1, 11–18. doi: 10.1684/nrp.2019.0524

PubMed Abstract | Crossref Full Text | Google Scholar

Mancini, C., Falciati, L., Maioli, C., and Mirabella, G. (2020). Threatening facial expressions impact goal-directed actions only if task-relevant. Brain Sci. 10:794. doi: 10.3390/brainsci10110794

PubMed Abstract | Crossref Full Text | Google Scholar

Mancini, C., Falciati, L., Maioli, C., and Mirabella, G. (2022). Happy facial expressions impair inhibitory control with respect to fearful facial expressions but only when task-relevant. Emotion 22:142. doi: 10.1037/emo0001058

PubMed Abstract | Crossref Full Text | Google Scholar

Mirabella, G. (2018). The weight of emotions in decision-making: how fearful and happy facial stimuli modulate action readiness of goal-directed actions. Front. Psychol. 9:1334. doi: 10.3389/fpsyg.2018.01334

PubMed Abstract | Crossref Full Text | Google Scholar

Mirabella, G., Grassi, M., Mezzarobba, S., and Bernardis, P. (2022). Angry and happy expressions affect forward gait initiation only when task relevant. Emotion 23, 387–399. doi: 10.1037/emo0001112

PubMed Abstract | Crossref Full Text | Google Scholar

Pönkänen, L. M., and Hietanen, J. K. (2012). Eye contact with neutral and smiling faces: effects on autonomic responses and frontal EEG asymmetry. Front. Hum. Neurosci. 6:122. doi: 10.3389/fnhum.2012.00122

PubMed Abstract | Crossref Full Text | Google Scholar

Roelofs, K. (2017). Freeze for action: neurobiological mechanisms in animal and human freezing. Philos. Trans. Royal Soc. B 372:20160206. doi: 10.1098/rstb.2016.0206

PubMed Abstract | Crossref Full Text | Google Scholar

Roelofs, K., Hagenaars, M. A., and Stins, J. (2010). Facing freeze: social threat induces bodily freeze in humans. Psychol. Sci. 21, 1575–1581. doi: 10.1177/0956797610384746

PubMed Abstract | Crossref Full Text | Google Scholar

Said, C. P., Haxby, J. V., and Todorov, A. (2011). Brain systems for assessing the affective value of faces. Philos. Trans. Royal Soc. B 366, 1660–1670. doi: 10.1098/rstb.2010.0351

PubMed Abstract | Crossref Full Text | Google Scholar

Sander, D., Grandjean, D., Kaiser, S., Wehrle, T., and Scherer, K. R. (2007). Interaction effects of perceived gaze direction and dynamic facial expression: evidence for appraisal theories of emotion. Eur. J. Cogn. Psychol. 19, 470–480. doi: 10.1080/09541440600757426

Crossref Full Text | Google Scholar

Stins, J. F., and Beek, P. J. (2011). Organization of voluntary stepping in response to emotion-inducing pictures. Gait Post. 34, 164–168. doi: 10.1016/j.gaitpost.2011.04.002

PubMed Abstract | Crossref Full Text | Google Scholar

Stins, J. F., Roelofs, K., Villan, J., Kooijman, K., Hagenaars, M. A., and Beek, P. J. (2011a). Walk to me when i smile, step back when i'm angry: emotional faces modulate whole-body approach–avoidance behaviors. Exp. Brain Res. 212, 603–611. doi: 10.1007/s00221-011-2767-z

PubMed Abstract | Crossref Full Text | Google Scholar

Stins, J. F., Roerdink, M., and Beek, P. J. (2011b). To freeze or not to freeze? affective and cognitive perturbations have markedly different effects on postural control. Hum. Mov. Sci. 30, 190–202.

Google Scholar

Waller, B. M., and Micheletta, J. (2013). Facial expression in nonhuman animals. Emot. Rev. 5, 54–59.

Google Scholar

Wilkowski, B. M., and Meier, B. P. (2010). Bring it on: angry facial expressions potentiate approach-motivated motor behavior. J. Personal. Soc. Psychol. 98, 201–210. doi: 10.1037/a0017992

PubMed Abstract | Crossref Full Text | Google Scholar

Willis, M. L., Palermo, R., and Burke, D. (2011). Social judgments are influenced by both facial expression and direction of eye gaze. Soc. Cogn. 29, 415–429.

Google Scholar

Yiou, E., Gendre, M., Deroche, T., and Bozec, S. L. (2014). Influence of emotion on the biomechanical organization of backward and forward step initiation. Motor Contr. 18, 368–382. doi: 10.1123/mc.2013-0015

PubMed Abstract | Crossref Full Text | Google Scholar

Keywords: emotion, faces, gaze direction, gait initiation, quiet stance, posture, approach-avoidance motivation

Citation: Lebert A, Vilarroya O and Stins J (2024) Stepping into emotions: investigating the effect of angry and fearful faces on forward stepping and quiet stance. Front. Hum. Neurosci. 18:1411246. doi: 10.3389/fnhum.2024.1411246

Received: 02 April 2024; Accepted: 15 July 2024;
Published: 09 August 2024.

Edited by:

Naoya Hasegawa, Hokkaido University, Japan

Reviewed by:

Alessandro Botta, San Martino Hospital (IRCCS), Italy
Paolo Bernardis, University of Trieste, Italy

Copyright © 2024 Lebert, Vilarroya and Stins. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Angélique Lebert, angelique.lebert1@gmail.com