
EDITORIAL article
Front. Hum. Dyn., 08 April 2025
Sec. Digital Impacts
Volume 7 - 2025 | https://doi.org/10.3389/fhumd.2025.1585355
This article is part of the Research Topic: #breakthebias: Working Towards Alternative Ways of Being in a Digital World Through Conversations With Critical Friends, Texts, and Technologies.
Editorial on the Research Topic
#breakthebias: working towards alternative ways of being in a digital world through conversations with critical friends, texts, and technologies
This Research Topic drew inspiration from the International Women's Day 2022 motto #breakthebias. The campaign's mission is to “advance gender parity in technology and celebrate the women forging innovation” by combating bias, stereotypes, and discrimination and by creating a more diverse, equitable, and inclusive (tech) world in which difference is valued. Scholars have long noted the androcentric bias in technology, which often marginalizes women and non-binary individuals. Technology remains associated with heteronormative masculinity, contributing to political, social, and economic gender inequities. Although digital technologies have the potential to address these inequities, they can also reinforce existing stereotypes.
This Research Topic engages with these critical issues through dialogical and intersectional approaches. It brings together contributions from feminist researchers, designers, and technologists who critically examine their relation to digital worlds. We invited contributors to engage with these themes, to explore ways in which we can be different in our digital or post-digital world, and to examine multiple alternatives to existing technological and societal structures. The contributions explore both the promises and perils of technology, ultimately seeking to build pathways toward more inclusive futures.
The first article, Empathy and exclusion in the design process by Marsden and Wittwer, discusses the increasing emphasis on empathy as a central component of human-computer interaction (HCI) design. The authors argue that while empathy is celebrated as a way to enhance user-centered design, it often reinforces gender stereotypes and exclusionary practices: “Empathy will not bring the desired benefit to the design process if it is naively construed and understood as a feminine trait, if shortcuts are used to allegedly take the effort out of the empathic process, or if the social situation in which empathy is taking place is not considered.” The article emphasizes the importance of critically evaluating empathy-driven methodologies to ensure that they do not unintentionally marginalize users. This thought-provoking perspective calls for a nuanced approach to embedding empathy in design processes, one that recognizes and challenges gendered assumptions.
In the second article, Heterogeneity in making: Findings, approaches, and reflections on inclusivity in making and makerspaces by Fuchsberger et al., the authors examine the exclusionary practices prevalent in makerspaces. Despite their purported openness, these spaces are often dominated by young, white, educated men, leaving women and other under-represented groups feeling unwelcome. The authors share personal insights from a multi-year research project that involved interventions such as women-only workshops and redesigning makerspaces to be more inclusive. Their reflections challenge the maker community to confront its biases and embrace heterogeneity as a core principle.
The third contribution, Navigating gender dynamics: A male researcher's experiences on conducting feminist HCI research by one of the editors, Ahmadi, offers a personal and reflexive account of conducting feminist research as a cisgender man. He reflects on his evolving understanding of gender and power dynamics through his participation in a feminist HCI project. The article highlights the importance of reflexivity, positionality, and the transformative potential of engaging with feminist literature. By sharing his challenges and growth, the author invites readers to critically consider how researchers' identities shape their engagement with feminist methodologies. His narrative contributes to ongoing dialogues about inclusivity and the role of men in feminist research.
The final article, A perspective on gender bias in generated text data by Hupperich, examines how artificial intelligence (AI) perpetuates gender stereotypes through biased text data and algorithms. The author explores methods for detecting and mitigating bias in generative models, emphasizing the societal implications of biased AI systems. By presenting case studies and proposing actionable solutions, the article bridges technical and ethical considerations. It calls for collaborative efforts among technologists, ethicists, and policymakers to ensure that AI systems reflect diverse perspectives and promote equity. This contribution underscores the urgent need to address bias in AI as part of the broader movement toward an inclusive digital future.
Taken together, these articles illuminate the complex interplay between technology, gender, and power across different fields of research. They highlight how digital technologies can both challenge and reinforce inequities within sociotechnical environments, depending on how they are researched, designed, and implemented. By focusing on critical reflections, innovative interventions, and personal narratives, this Research Topic contributes to a growing body of work dedicated to disrupting bias in technology.
MA: Writing – original draft. DC: Writing – review & editing. NH: Writing – review & editing.
We thank our co-editors Angelika Strohmayer and Maryam Mustafa. We thank all contributing authors for their insightful work and the reviewers for their constructive feedback.
MA is co-founder and co-CEO of Turn2Talents GmbH (since September 2024).
The remaining authors declare no commercial or financial relationships that could be construed as a potential conflict of interest.
The author(s) declare that Gen AI was used in the creation of this manuscript. ChatGPT was used for assistance in generating some of the written content.
All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.
Keywords: gender equity, gendered technologies, feminist technologies, bias, stereotypes
Citation: Ahmadi M, de Castro Leal D and da Hora N (2025) Editorial: #breakthebias: working towards alternative ways of being in a digital world through conversations with critical friends, texts, and technologies. Front. Hum. Dyn. 7:1585355. doi: 10.3389/fhumd.2025.1585355
Received: 28 February 2025; Accepted: 25 March 2025;
Published: 08 April 2025.
Edited and reviewed by: Claire M. Mason, Commonwealth Scientific and Industrial Research Organisation (CSIRO), Australia
Copyright © 2025 Ahmadi, de Castro Leal and da Hora. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
*Correspondence: Michael Ahmadi, micha.ahmadi@gmail.com