
EDITORIAL article
Front. Hum. Dyn.
Sec. Digital Impacts
Volume 7 - 2025 | doi: 10.3389/fhumd.2025.1585355
This article is part of the Research Topic #breakthebias: Working Towards Alternative Ways of Being in a Digital World Through Conversations With Critical Friends, Texts, and Technologies.
Editorial: #breakthebias: Working Towards Alternative Ways of Being in a Digital World Through Conversations With Critical Friends, Texts, and Technologies

Michael Ahmadi¹, Débora de Castro Leal², Nina da Hora³

¹ Turn2Talents GmbH, Cologne, Germany; formerly University of Siegen, Siegen, Germany
² Federal University of Pará, Pará, Brazil; formerly University of Siegen, Siegen, Germany
³ State University of Campinas, Campinas, Brazil

Keywords: gender equity, gendered technologies, feminist technologies, bias, stereotypes

This Research Topic drew inspiration from the motto of International Women's Day 2022: #breakthebias. Its mission is to "advance gender parity in technology and celebrate the women forging innovation" by tackling bias, stereotypes, and discrimination, and by creating a more diverse, equitable, and inclusive (tech) world in which difference is valued. Indeed, scholars have long noted the androcentric bias in technology, which often marginalizes women and non-binary individuals. Technology remains associated with heteronormative masculinity, contributing to political, social, and economic gender inequities. While digital technologies have the potential to address these inequities, they can also reinforce existing stereotypes. This Research Topic engages with these critical issues through dialogical and intersectional approaches. It brings together contributions from feminist researchers, designers, and technologists to critically explore their relation to digital worlds. We invited contributors to engage with these themes critically, to explore ways in which we can be different in our digital or post-digital world, and to examine multiple alternatives to existing technological and societal structures. The contributions explore both the promises and perils of technology, ultimately seeking to build pathways toward more inclusive futures.
The first article, Empathy and exclusion in the design process by Nicola Marsden and Alexander Wittwer, discusses the increasing emphasis on empathy as a central component of human-computer interaction (HCI) design. The authors argue that while empathy is celebrated as a way to enhance user-centered design, it often reinforces gender stereotypes and exclusionary practices: "Empathy will not bring the desired benefit to the design process if it is naively construed and understood as a feminine trait, if shortcuts are used to allegedly take the effort out of the empathic process, or if the social situation in which empathy is taking place is not considered." The article emphasizes the importance of critically evaluating empathy-driven methodologies to ensure they do not unintentionally marginalize users. This thought-provoking perspective calls for a nuanced approach to embedding empathy in design processes, one that recognizes and challenges gendered assumptions.

In the second article, Heterogeneity in making: Findings, approaches, and reflections on inclusivity in making and makerspaces by Verena Fuchsberger et al., the authors examine the exclusionary practices prevalent in makerspaces. Despite their purported openness, these spaces are often dominated by young, white, educated men, leaving women and other underrepresented groups feeling unwelcome. The authors share personal insights from a multi-year research project that involved interventions such as women-only workshops and the redesign of makerspaces to be more inclusive. Their reports challenge the maker community to confront its biases and embrace heterogeneity as a core principle.

The third contribution, Navigating gender dynamics: A male researcher's experiences on conducting feminist HCI research by one of the editors, Michael Ahmadi, offers a personal and reflexive account of conducting feminist research as a cisgender man. He reflects on his evolving understanding of gender and power dynamics through his participation in a feminist HCI project. The article highlights the importance of reflexivity, positionality, and the transformative potential of engaging with feminist literature. By sharing his challenges and growth, the author invites readers to critically consider how researchers' identities shape their engagement with feminist methodologies. His narrative contributes to ongoing dialogues about inclusivity and the role of men in feminist research.

The final article, A perspective on gender bias in generated text data by Thomas Hupperich, examines how artificial intelligence (AI) perpetuates gender stereotypes through biased text data and algorithms. The author explores methods for detecting and mitigating bias in generative models, emphasizing the societal implications of biased AI systems. By presenting case studies and proposing actionable solutions, the article bridges technical and ethical considerations. It calls for a collaborative effort among technologists, ethicists, and policymakers to ensure that AI systems reflect diverse perspectives and promote equity. This contribution underscores the urgent need to address bias in AI as part of the broader movement toward inclusive digital futures.

Together, these articles illuminate the complex interplay between technology, gender, and power across different research fields. They highlight how digital technologies, within sociotechnical environments, can both challenge and reinforce inequities, depending on how they are researched, designed, and implemented. By focusing on critical reflections, innovative interventions, and personal narratives, this Research Topic contributes to a growing body of work dedicated to breaking the bias in technology.

Author Contributions and Acknowledgments

Michael Ahmadi: Conceptualization and writing of the original draft. Débora de Castro Leal and Nina da Hora: Review, feedback, and final approval.

We thank our co-editors Angelika Strohmayer and Maryam Mustafa. ChatGPT was used to assist in generating some of the written content. We thank all contributing authors for their insightful work and the reviewers for their constructive feedback.

Conflict of Interest Statement

The authors declare no commercial or financial relationships that could be construed as a potential conflict of interest.
Received: 28 Feb 2025; Accepted: 25 Mar 2025.
Copyright: © 2025 Ahmadi, De Castro Leal and Da Hora. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
* Correspondence:
Michael Ahmadi, University of Siegen, Siegen, Germany
Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.