AUTHOR=Willcox Marissa TITLE=Algorithmic agency and “fighting back” against discriminatory Instagram content moderation: #IWantToSeeNyome JOURNAL=Frontiers in Communication VOLUME=9 YEAR=2025 URL=https://www.frontiersin.org/journals/communication/articles/10.3389/fcomm.2024.1385869 DOI=10.3389/fcomm.2024.1385869 ISSN=2297-900X ABSTRACT=
Instagram influencers of marginalized identities and subjectivities, for example those who are plus-sized or people of color, often express that their content is moderated more heavily and sometimes place blame on "the algorithm" for their feelings of discrimination. Though biases online are reflective of discrimination in society at large, these biases are co-constituted through algorithmic and human processes, and the entanglement of these processes in enacting discriminatory content removals should be taken seriously. These influencers, who are more likely to have their content removed, have to learn how to play "the algorithm game" to remain visible, creating a conflicting discussion around agentic flows which dictates not only their Instagram use but also, more broadly, how creators might feel about their bodies in relation to societal standards of "acceptability." In this paper I present the #IWantToSeeNyome campaign as a case study that contextualizes some of the experiences of marginalized influencers who feel content moderation affects their attachments to their content. Through a lens of algorithmic agency, I think through the contrasting alignments between freedom of expression and normative representation of bodies in public space. The Instagram assemblage of content moderation presents a lens through which to view this issue and highlights the contrast between content making, user agency, and the ways more-than-human processes can affect human feelings about bodies and where they do and do not belong.