
PERSPECTIVE article

Front. Commun.
Sec. Culture and Communication
Volume 9 - 2024 | doi: 10.3389/fcomm.2024.1385869
This article is part of the Research Topic Feminist Fabulations in Algorithmic Empires

Algorithmic agency and 'fighting back' against discriminatory Instagram content moderation: #IWantToSeeNyome

Provisionally accepted
  • Marissa Willcox, University of Amsterdam, Amsterdam, Netherlands

The final, formatted version of the article will be published soon.

    Instagram influencers with marginalized identities and subjectivities, for example those who are plus-sized or people of colour, often report that their content is moderated more heavily, and they sometimes blame 'the algorithm' for their experiences of discrimination. Though online biases reflect discrimination in society at large, they are co-constituted through algorithmic and human processes, and the entanglement of these processes in enacting discriminatory content removals should be taken seriously. Influencers whose content is more likely to be removed must learn to play 'the algorithm game' (Cotter, 2019) to remain visible, creating a conflicted negotiation of agentic flows that dictates not only their Instagram use but, more broadly, how creators might feel about their bodies in relation to societal standards of 'acceptability'. In this paper I present the #IWantToSeeNyome campaign as a case study that contextualizes the experiences of marginalized influencers who feel content moderation affects their attachments to their content. Through a lens of algorithmic agency, I think through the contrasting alignments between freedom of expression and normative representation of bodies in public space. The Instagram assemblage of content moderation offers a lens on this issue, highlighting the tension between content making, user agency, and the ways more-than-human processes can shape human feelings about bodies and where they do and don't belong.

    Keywords: content moderation, feminism, gender, race, algorithms, agency, Instagram, social media

    Received: 13 Feb 2024; Accepted: 26 Aug 2024.

    Copyright: © 2024 Willcox. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

    * Correspondence: Marissa Willcox, University of Amsterdam, Amsterdam, Netherlands

    Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.