- Experimental Psychology and Cognitive Science, Department of Psychology, Justus Liebig University, Gießen, Germany
1 Introduction
In contemporary cognitive science, the interplay between emotions and information processing is crucial, especially in the context of media and misinformation. Previous research has extensively explored socio-affective factors influencing information processing and suggests that our cognitive and emotional capacities are not solely internal but are scaffolded by dynamic interactions with our material and social environments (Sterelny, 2010; Colombetti and Krueger, 2015). Recent research has placed particular emphasis on the cognitive drivers of the acceptance of false information (Ecker et al., 2022). Dual-process accounts, such as motivated cognition (Kahan, 2017) and classical reasoning, provide foundational frameworks for understanding the role of analytical thinking in shaping beliefs (Pennycook and Rand, 2019). Additionally, the influence of emotions and of news framing on information processing has been a focal point, with studies showing the significant impact of emotional language on belief formation and the sharing of news (Martel et al., 2020; Roozenbeek et al., 2022). Building upon the existing literature, this opinion article expands on the role of emotional cues and their influence on cognitive mechanisms. Within this framework, the paper introduces the concept of “Inaccuracy Prompts”: prompts that resemble their accuracy counterparts in form but are posited to sway individuals toward less critical thinking, thereby contributing to the acceptance of misinformation.
2 The interplay of emotions and rational decision-making systems
The traditional view of rational decision-making as separate from emotion is deeply rooted in Western philosophy and was advocated by figures such as Aristotle, who argued for a strict separation between emotion and rational action. Decision-making involves weighing alternatives, beliefs about outcomes, and values, yet conventional research often simplifies the subjective, emotional experiences of decision-makers, ignoring the complexity inherent in real-world decisions (Strle, 2016). Modern cognitive research refutes the strict separation of emotion and rationality, demonstrating that emotions are essential to rational processes and that no decision can occur without emotional involvement (Damásio, 2001; Bonansinga, 2020). Empirical evidence suggests that effective decision-making is not merely a product of logical deliberation but also relies on the intricate interplay of emotional insights, as emotions emerge from the interplay of more basic psychological ingredients such as core affect and conceptualization, which integrate bodily sensations and context-dependent interpretations into discrete emotions (Lindquist, 2013). Kahneman (2011) was one of the most popular proponents of dual-process theory. He divided our cognitive processes into two systems with regard to how we process and evaluate information and subsequently make decisions: System 1, which is fast, intuitive, emotion-based, and unconscious, and System 2, which is slow, deliberate, controlled, and conscious (i.e., rational). Dual-process theories tend to view emotion and intuition on the one hand, and logic and reason on the other, as dichotomous opposites. This view is now controversial and is challenged by the enactive approach, which emphasizes that perception and action are deeply integrated and that the mind does not merely react to its environment but is actively shaped by its interactions within it (Colombetti, 2007). The affect-as-information account likewise assumes an interdependent connection between cognition and emotion (Clore et al., 2001; Clore and Huntsinger, 2007). Gigerenzer (2008) further enriches this perspective by illustrating that simple heuristics, often stemming from intuitive processes, can outperform complex cognitive operations, particularly under uncertainty. The theory of Affective Intelligence postulates that emotions, particularly feelings of (in)security, give us feedback about unconscious processes and play a role in both intuitive and deliberative judgment (Marcus et al., 2019). It further posits that increased anxiety favors explicit reasoning in uncertain situations, while the absence of anxiety suggests reliance on habitual decisions in familiar contexts (MacKuen et al., 2007; Marcus et al., 2019). The concept of the scaffolded mind likewise posits that our mental processes are supported not just internally but through our continuous interaction with the physical and social environment (Sterelny, 2010; Colombetti and Krueger, 2015). It highlights how emotions and rational systems are interwoven and the significant role our surroundings play in shaping cognitive and emotional outputs.
In short, human judgments can easily be influenced by emotional factors, even when those factors are unrelated to the situation at hand.
3 Emotional language and framing in news media and political discourse
The following section examines the pivotal role of emotional language and framing in news media and political discourse, and how these elements influence public perception and the spread of political misinformation. Emotions are not merely reactive; they are instrumental in shaping our judgments and perceptions, particularly in the context of political information, which often employs emotional triggers to influence and mislead (Martel et al., 2020).
Rather than consciously querying their feelings, individuals inherently integrate affectivity into decision-making, as it naturally influences the evaluation of information (Damásio, 2001; Schwarz, 2012). Affective feelings, both positive and negative, have a variable influence on which cognitive processing style is adopted (e.g., heuristic versus systematic; Huntsinger and Ray, 2016). Drawing on the concept of the scaffolded mind (Colombetti and Krueger, 2015), the notion of environmental scaffolding can be applied to understand how news media use emotional language to create affective niches that manipulate public perception. These niches strategically amplify specific emotional responses and can thereby facilitate the spread of misinformation. Such scaffolding of the affective mind by media channels serves not only to inform but to evoke and sustain particular emotional states that can skew rational decision-making processes. According to Roozenbeek et al. (2022), emotional language significantly increases the likelihood of misinformation being shared and believed, which is why such language is identified as a key manipulation technique in the spread of false information. Compared to neutral content, emotionally charged information attracts our attention more strongly and can distort our perception (Zajonc, 1984; Schwarz, 2012; Ecker et al., 2022). A popular example is short-form video platforms such as TikTok, which capture attention and immediately expose viewers to provocative content that heightens emotional responses and engagement (Cheng and Li, 2024). Anger in particular spreads faster than any other emotion on social media platforms (SMPs) and contributes to the virality of fake news (Fan et al., 2014; Corbu et al., 2021; Michel and Gandon, 2024), and owing to the negativity bias, negative information leaves a stronger memory trace than positive information (Courbet et al., 2014). In combination with recommendation algorithms on SMPs, negative emotions thus provide particularly fertile ground for the spread of false information (Roozenbeek et al., 2022; Michel and Gandon, 2024).
In the political realm, disinformation campaigns use emotionally charged language to capture attention, reinforce messages, and provoke specific reactions, aiming to deepen emotional engagement and to influence perceptions of and responses to content more intensely (Corbu et al., 2021). One tool for achieving this is framing information differently. Frames can roughly be divided into two categories: thematic and episodic (Gross, 2008; Aarøe, 2011). Thematic frames present political issues within a broader context, offering abstract and general evidence. In contrast, episodic frames illuminate issues through concrete events and particular cases, providing specific characters at which emotional reactions can be directed (Gross, 2008; Aarøe, 2011). Episodic frames not only trigger stronger emotional responses; they also channel the impact of these emotions into support for the advocated policy position (Aarøe, 2011).
Research demonstrates the effectiveness of this kind of emotional manipulation in spreading misinformation. For instance, Peters et al. (2009) found that participants were more willing to share, with an unspecified audience, social anecdotes that aroused interest, surprise, disgust, and happiness. Conversely, people with low emotional reactivity were more inclined to overlook false information or to distance themselves from its propagation (Horner et al., 2021). These effects are particularly pronounced for politically charged topics traditionally associated with emotive media language, which often spark public debates and controversies.
4 Call for research on the risk of active consideration as an Inaccuracy Prompt
Influential information sources such as news media and politicians play a crucial role in shaping public perception. In their debates, politicians often demand of each other that they “be aware” of certain things, meaning that they “be attentive,” “engage,” and “actively consider.” We will therefore refer to active consideration as a form of attentiveness, engagement, and critical evaluation. The call to focus our awareness and attention on certain situations and information seems sensible, since political disinformation campaigns are an increasing concern (Lewandowsky et al., 2020). In fact, prompting people to stop and think about the issue at hand is a popular approach in current research aimed at counteracting the spread of misinformation.
One way to achieve this is through so-called Accuracy Prompts (APs; Pennycook and Rand, 2021). APs are a nudge intervention technique aimed at making individuals aware of the concept of accuracy in news headlines, thereby boosting their ability to differentiate true from false information and reducing their susceptibility to believing and sharing false information (Pennycook et al., 2023). APs are thus cues for users to enhance their critical thinking (e.g., to assess the accuracy of information; Pennycook and Rand, 2022; Capraro and Celadin, 2023) and are intended to shift readers from superficial, intuitive System 1 thinking to critically reflective System 2 thinking (Evans and Stanovich, 2013; Pennycook and Rand, 2019). Put another way, APs encourage people to be more aware of the issue at hand. Although the general effectiveness of APs has been widely documented (Bhardwaj et al., 2023; Pennycook et al., 2023), we found no explicit research on whether and how the impact of APs is influenced by the framing of messages and the associated emotional context.
In general, APs aim to guide people from System 1 toward deliberate System 2 thinking. However, we theorize that expressions like “we need to be aware of the fact that...” might reverse this effect and instead draw individuals' attention to the emotional backdrop of the presented information, thereby turning an AP into what we term an Inaccuracy Prompt (IAP). By our definition, IAPs share the same goal as APs but fail to guide readers' attention to the concept of accuracy; instead, they heighten readers' awareness of their own affective responses to news, shaped by the emotional context and framing in which the information is presented. In contrast to APs, IAPs are not always designed as explicit interventions and can occur by accident. In summary, prompts aimed at enhancing accuracy could function as either accuracy or inaccuracy cues, depending on framing and the emotions elicited. Specifically, when APs explicitly call for active consideration in an emotionally charged, episodically framed context, they may amplify emotional salience, potentially promoting the spread of disinformation. Emotions, especially anger and enthusiasm, mediate framing effects and foster reliance on heuristics (Bodenhausen et al., 1994; Marcus et al., 2000; Lecheler et al., 2013). Anger in particular promotes the tendency to believe politically motivated misinformation (Weeks, 2015).
The challenge is that while people are encouraged to think critically about what is being said, the AP might instead draw their attention to the emotionally loaded context in which these statements are embedded, that is, to how it is being said (Figure 1). This reflects how our emotional and cognitive capacities are not merely internal but are shaped by our interactions with both material culture and social relationships.
Figure 1. Whether the evaluation or response to information relies more on facts or emotions depends on the framing in which the relevant information is embedded.
We posit that this might have two possible effects: (1) masking the strength of emotional influences or, given the activating properties of anger (which populists, for example, like to exploit), (2) acting as a kind of magnifying glass that causes people to double down on their opinions. In any case, these concerns motivate our call for research on possible interactions between APs, framing, and discrete emotions.1
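To make this call for research concrete, the following sketch simulates one way the hypothesized effect could be tested: a 2 (prompt: none vs. accuracy prompt) × 2 (framing: thematic vs. episodic) between-subjects design in which the signature of an Inaccuracy Prompt would be a crossover interaction on truth discernment. All condition labels, effect sizes, and the simulated discernment measure are illustrative assumptions of ours, not designs or findings from the cited studies.

```python
# Hypothetical sketch of the proposed prompt x framing study; all numbers are
# illustrative assumptions, not empirical results.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n_per_cell = 200

rows = []
for prompt in ("none", "accuracy_prompt"):
    for framing in ("thematic", "episodic"):
        mu = 0.20                              # assumed baseline truth discernment
        if prompt == "accuracy_prompt":
            mu += 0.10                         # assumed benefit of the prompt
        if framing == "episodic":
            mu -= 0.05                         # assumed cost of emotional, episodic framing
        if prompt == "accuracy_prompt" and framing == "episodic":
            mu -= 0.12                         # hypothesized reversal: the prompt acts as an IAP
        scores = rng.normal(loc=mu, scale=0.15, size=n_per_cell)
        rows += [{"prompt": prompt, "framing": framing, "discernment": s} for s in scores]

df = pd.DataFrame(rows)

# A reliable prompt x framing interaction would be the pattern the article calls to investigate.
model = smf.ols("discernment ~ C(prompt) * C(framing)", data=df).fit()
print(df.groupby(["prompt", "framing"])["discernment"].mean().round(3))
print(model.summary().tables[1])
```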
Another reason for our assumption is that urging individuals to engage in more active or considered thought may paradoxically result in lower-quality decision-making (Dijksterhuis and Nordgren, 2006). Researchers have long recognized that the limited capacity of conscious processing poses challenges for decision-making and can lead to suboptimal choices (Tversky and Kahneman, 1974; Kahneman, 2003). Considered thought, intricately linked to attention and to the resulting decisions, is constrained by the limited capacity of verbal working memory (Baddeley, 1992), which can only temporarily “store” approximately four units of information (Wilhelm et al., 2013). Especially in view of the growing “infodemic” facilitated by the internet (Corbu et al., 2021; Bortolotti, 2023), considered thought can therefore focus only on a subset of the information that should be taken into account, potentially leading to suboptimal decisions (Simon, 1955; Tversky and Kahneman, 1974; Kahneman, 2003). This assumption is supported by experimental studies that investigated the quality of the decisions people made when deciding about complex issues either consciously or unconsciously. Compared with people who deliberated less, people who engaged in conscious deliberation (Dijksterhuis and Nordgren, 2006) made less accurate evaluations, suggesting that conscious (i.e., active or considered) thought led them to focus on a limited number of attributes at the expense of other relevant attributes.
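The capacity argument can be illustrated with a simple toy model (our own illustration, not a model taken from Dijksterhuis and Nordgren, 2006, or Wilhelm et al., 2013): if deliberation can weigh only about four of an option's attributes, a choice based on that subset identifies the objectively best option markedly less often than a choice based on all attributes would.

```python
# Toy illustration of the working-memory bottleneck; all parameters are assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_trials, n_options, n_attributes, capacity = 5000, 4, 12, 4

hits = 0
for _ in range(n_trials):
    # Each option's quality on each attribute; the "true" value sums over all attributes.
    attributes = rng.normal(size=(n_options, n_attributes))
    best = attributes.sum(axis=1).argmax()

    # Capacity-limited deliberation: only a random subset of `capacity` attributes is weighed.
    subset = rng.choice(n_attributes, size=capacity, replace=False)
    hits += attributes[:, subset].sum(axis=1).argmax() == best

print(f"Chose the objectively best option in {hits / n_trials:.0%} of trials "
      f"when weighing only {capacity} of {n_attributes} attributes "
      f"(a full-information chooser would succeed every time).")
```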
The possible negative influence of conscious thought on decisions does not necessarily contradict the AP approach. Rather, these findings illustrate the importance of focusing our limited awareness capacities on the core aspects (i.e., the facts) of controversial issues rather than on the emotional load in which they may be embedded. In summary, both the influence of emotions and framing effects on news perception and the dissemination of misinformation, and APs as an intervention against the latter, are subjects of current research. In our view, however, there is a data gap regarding whether and how an emotional context could influence, negate, or even invert the effect of accuracy prompts (i.e., turn them into IAPs); a gap we would like to address in future research.
5 Quo Vadis?
Examining emotional language and framing in news media and political discourse highlights the powerful role emotions play in shaping perceptions, especially in the context of political fake news and disinformation campaigns, and challenges the traditional dichotomy between emotion and rationality. Given the rampant misinformation and fake news in today's information landscape, understanding the intricate interactions between accuracy prompts, framing, and emotions becomes paramount. Negative emotions like anger not only attract attention but also reinforce cognitive biases, contributing to the spread of fake news (Martel et al., 2020; Corbu et al., 2021). The construction of filter bubbles by recommendation systems further amplifies this effect, trapping users in echo chambers of emotional states (Fan et al., 2014; Corbu et al., 2021; Chuai et al., 2023; Michel and Gandon, 2024). While APs aim to direct attention toward accuracy, the influence of emotions on engagement and credibility assessment remains a critical research gap (Martel et al., 2020). Understanding these dynamics is essential for addressing the spread of misinformation and promoting critical thinking.
Building on previous research, we suggest investigating the notion of IAPs. Specifically, we theorize that calling for active consideration, when immersed in an emotionally charged environment, might unintentionally steer individuals toward less critical thinking, potentially fostering the acceptance of misinformation.
The discussion of the limited capacity of information processing, with working memory as the “bottleneck,” highlights the need to focus on the essential aspects of issues embedded in emotional contexts (Tversky and Kahneman, 1974; Baddeley, 1992; Kahneman, 2003; Wilhelm et al., 2013). Complementing this, Oblak et al. (2024) suggest that understanding background feelings and transmodal dynamics during working memory tasks can elucidate variations in performance and overall conscious experience. Investigating how emotional cues, consciousness, and information prompts interact may provide essential insights into information dissemination and user engagement on SMPs, and thereby into how public perception and decision-making are shaped.
The potential dual function of prompts, contingent upon contextual factors, introduces complexity to information dissemination strategies. Ultimately, this exploration could contribute to the ongoing discourse on promoting critical thinking and improving decision-making quality within emotionally charged information environments.
Author contributions
SJ: Visualization, Writing – original draft, Writing – review & editing. KH: Supervision, Writing – review & editing.
Funding
The author(s) declare financial support was received for the research, authorship, and/or publication of this article. This manuscript was supported by the open access fund of the Justus Liebig University.
Acknowledgments
We thank the reviewers and the editor for their valuable comments and ideas on the manuscript.
Conflict of interest
The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
Publisher's note
All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.
Footnotes
1. ^In the context of this paper, we define discrete emotions as the basic emotions described by Ekman and Cordaro (2011), which encompass happiness, sadness, fear, anger, disgust, contempt, and surprise.
References
Aarøe, L. (2011). Investigating frame strength: The case of episodic and thematic frames. Polit. Commun. 28, 207–226. doi: 10.1080/10584609.2011.568041
Bhardwaj, V., Martel, C., and Rand, D. G. (2023). Examining Accuracy-Prompt Efficacy in Combination With Using Colored Borders to Differentiate News and Social Content Online. Harvard, MA: Harvard Kennedy School Misinformation Review.
Bodenhausen, G. V., Sheppard, L. A., and Kramer, G. P. (1994). Negative affect and social judgment: the differential impact of anger and sadness. Eur. J. Soc. Psychol. 24, 45–62. doi: 10.1002/ejsp.2420240104
Bonansinga, D. (2020). Who thinks, feels. The relationship between emotions, politics and populism. Partecipazione e Conflitto 13, 83–106. doi: 10.1285/i20356609v13i1p83
Capraro, V., and Celadin, T. (2023). “I think this news is accurate”: endorsing accuracy decreases the sharing of fake news and increases the sharing of real news. Pers. Soc. Psychol. Bull. 49, 1635–1645. doi: 10.1177/01461672221117691
Cheng, Z., and Li, Y. (2024). Like, comment, and share on TikTok: Exploring the effect of sentiment and second-person view on the user engagement with TikTok news videos. Soc. Sci. Comput. Rev. 42, 201–223. doi: 10.1177/08944393231178603
Chuai, Y., Rossi, A., and Lenzini, G. (2023). “Using emotions and topics to understand online misinformation,” in Int. Conf. Web Eng. (Cham: Springer), 395–400. doi: 10.1007/978-3-031-34444-2_34
Clore, G. L., Gasper, K., and Garvin, E. (2001). Affect as information. Handb. Affect Soc. Cogn. 22, 121–144.
Clore, G. L., and Huntsinger, J. R. (2007). How emotions inform judgment and regulate thought. TiCS 11, 393–399. doi: 10.1016/j.tics.2007.08.005
Colombetti, G. (2007). Enactive appraisal. Phenomenol. Cogn. Sci. 6, 527–546. doi: 10.1007/s11097-007-9077-8
Colombetti, G., and Krueger, J. (2015). Scaffoldings of the affective mind. Philos. Psychol. 28, 1157–1176. doi: 10.1080/09515089.2014.976334
Corbu, N., Bârgăoanu, A., Durach, F., and Udrea, G. (2021). Fake news going viral: the mediating effect of negative emotions. Media Lit. Acad. Res. 4, 58–87.
Courbet, D., Fourquet-Courbet, M. P., Kazan, R., and Intartaglia, J. (2014). The long-term effects of e-advertising: the influence of internet pop-ups viewed at a low level of attention in implicit memory. J. Comput. Mediat. Commun. 19, 274–293. doi: 10.1111/jcc4.12035
Damásio, A. R. (2001). Emotion and the human brain. Ann. N. Y. Acad. Sci. 935, 101–106. doi: 10.1111/j.1749-6632.2001.tb03475.x
Dijksterhuis, A., and Nordgren, L. F. (2006). A theory of unconscious thought. Perspect. Psychol. Sci. 1, 95–109. doi: 10.1111/j.1745-6916.2006.00007.x
Ecker, U. K., Lewandowsky, S., Cook, J., Schmid, P., Fazio, L. K., Brashier, N., et al. (2022). The psychological drivers of misinformation belief and its resistance to correction. Nat. Rev. Psychol. 1, 13–29. doi: 10.1038/s44159-021-00006-y
Ekman, P., and Cordaro, D. (2011). What is meant by calling emotions basic. Emot. Rev. 3, 364–370. doi: 10.1177/1754073911410740
Evans, J., and Stanovich, K. E. (2013). Dual-process theories of higher cognition: advancing the debate. Perspect. Psychol. Sci. 8, 223–241. doi: 10.1177/1745691612460685
Fan, R., Zhao, J., Chen, Y., and Xu, K. (2014). Anger is more influential than joy: Sentiment correlation in Weibo. PloS ONE 9:e110184. doi: 10.1371/journal.pone.0110184
Gross, K. (2008). Framing persuasive appeals: episodic and thematic framing, emotional response, and policy opinion. Polit. Psychol. 29, 169–192. doi: 10.1111/j.1467-9221.2008.00622.x
Horner, C. G., Galletta, D., Crawford, J., and Shirsat, A. (2021). Emotions: the unexplored fuel of fake news on social media. J. Manag. Inf. Syst. 38, 1039–1066. doi: 10.1080/07421222.2021.1990610
Huntsinger, J. R., and Ray, C. (2016). A flexible influence of affective feelings on creative and analytic performance. Emotion 16, 826–837. doi: 10.1037/emo0000188
Kahan, D. M. (2017). Misconceptions, misinformation, and the logic of identity-protective cognition. Cult. Cogn. Proj. Work. Pap. Ser., No. 164. Yale Law Sch., Pub. Law Res. Pap., No. 605. Yale Law & Econ. Res. Pap., No. 575. doi: 10.2139/ssrn.2973067
Kahneman, D. (2003). A perspective on judgment and choice: mapping bounded rationality. Am. Psychol. 58, 697–720. doi: 10.1037/0003-066X.58.9.697
Lecheler, S., Schuck, A. R., and De Vreese, C. H. (2013). Dealing with feelings: Positive and negative discrete emotions as mediators of news framing effects. Commun.-Eur. J. Commun. Res. 38, 189–209. doi: 10.1515/commun-2013-0011
Lewandowsky, S., Cook, J., Ecker, U., Albarracín, D., Kendeou, P., Newman, E. J., et al. (2020). The Debunking Handbook 2020, 19. doi: 10.17910/b7.1182
Lindquist, K. A. (2013). Emotions emerge from more basic psychological ingredients: a modern psychological constructionist model. Emot. Rev. 5, 356–368. doi: 10.1177/1754073913489750
MacKuen, M., Marcus, G. E., Neuman, W. R., and Keele, L. (2007). “The third way: the theory of affective intelligence and American democracy,” in The Affect Effect: Dynamics of Emotion in Political Thinking and Behavior, 124–151. doi: 10.7208/chicago/9780226574431.003.0006
Marcus, G. E., Neuman, W. R., and MacKuen, M. (2000). Affective Intelligence and Political Judgment. Chicago, IL: University of Chicago Press.
Marcus, G. E., Valentino, N. A., Vasilopoulos, P., and Foucault, M. (2019). Applying the theory of affective intelligence to support for authoritarian policies and parties. Polit. Psychol. 40, 109–139. doi: 10.1111/pops.12571
Martel, C., Pennycook, G., and Rand, D. G. (2020). Reliance on emotion promotes belief in fake news. Cogn. Res. Princ. Implic. 5, 120. doi: 10.1186/s41235-020-00252-3
Michel, F., and Gandon, F. (2024). Pay attention: a call to regulate the attention market and prevent algorithmic emotional governance. arXiv preprint arXiv:2402.16670.
Oblak, A., Dragan, O., Slana Ozimič, A., Kordeš, U., Purg, N., Bon, J., et al. (2024). What is it like to do a visuo-spatial working memory task: a qualitative phenomenological study of the visual span task. Conscious. Cogn. 118:103628. doi: 10.1016/j.concog.2023.103628
Pennycook, G., Bhargava, P., Cole, R., Goldberg, B., Lewandowsky, S., Rand, D., et al. (2023). Misinformation inoculations must be boosted by accuracy prompts to improve judgments of truth. PsyArxiv. doi: 10.31234/osf.io/5a9xq
Pennycook, G., and Rand, D. G. (2019). Lazy, not biased: Susceptibility to partisan fake news is better explained by lack of reasoning than by motivated reasoning. Cognition 188, 39–50. doi: 10.1016/j.cognition.2018.06.011
Pennycook, G., and Rand, D. G. (2021). The psychology of fake news. Trends Cog. Sci. 25, 388–402. doi: 10.1016/j.tics.2021.02.007
Pennycook, G., and Rand, D. G. (2022). Accuracy prompts are a replicable and generalizable approach for reducing the spread of misinformation. Nat. Commun. 13:2333. doi: 10.1038/s41467-022-30073-5
Peters, K., Kashima, Y., and Clark, A. (2009). Talking about others: emotionality and the dissemination of social information. Eur. J. Soc. Psychol. 39, 207–222. doi: 10.1002/ejsp.523
Roozenbeek, J., Van Der Linden, S., Goldberg, B., Rathje, S., and Lewandowsky, S. (2022). Psychological inoculation improves resilience against misinformation on social media. Sci. Adv. 8:eabo6254. doi: 10.1126/sciadv.abo6254
Schwarz, N. (2012). Feelings-as-information theory. Handb. Theor. Soc. Psychol. 1, 289–308. doi: 10.4135/9781446249215.n15
Simon, H. A. (1955). A behavioral model of rational choice. Q. J. Econ. 69, 99–118. doi: 10.2307/1884852
Sterelny, K. (2010). Minds: extended or scaffolded?. Phenomenol. Cogn. Sci. 9, 465–481. doi: 10.1007/s11097-010-9174-y
Tversky, A., and Kahneman, D. (1974). Judgment under uncertainty: heuristics and biases. Science 185, 1124–1131. doi: 10.1126/science.185.4157.1124
Weeks, B. E. (2015). Emotions, partisanship, and misperceptions: How anger and anxiety moderate the effect of partisan bias on susceptibility to political misinformation. J. Commun. 65, 699–719. doi: 10.1111/jcom.12164
Wilhelm, O., Hildebrandt, A., and Oberauer, K. (2013). What is working memory capacity, and how can we measure it? Front. Psychol. 4:433. doi: 10.3389/fpsyg.2013.00433
Keywords: accuracy prompts, Inaccuracy Prompts, news framing, disinformation, emotional cues, fake news, consciousness, dual process theory
Citation: Jakob S and Hamburger K (2024) Active consideration in an emotional context: implications for information processing. Front. Psychol. 15:1367714. doi: 10.3389/fpsyg.2024.1367714
Received: 09 January 2024; Accepted: 05 June 2024;
Published: 20 June 2024.
Edited by: Kevin Gluck, Florida Institute for Human and Machine Cognition, United States
Reviewed by: Aleš Oblak, University Psychiatric Clinic, Slovenia
Copyright © 2024 Jakob and Hamburger. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
*Correspondence: Kai Hamburger, kai.hamburger@psychol.uni-giessen.de; Sophie Jakob, sophie.h.jakob@psychol.uni-giessen.de