AUTHOR=Robinson Jo, Thorn Pinar, McKay Samuel, Richards Hannah, Battersby-Coulter Rikki, Lamblin Michelle, Hemming Laura, La Sala Louise TITLE=The steps that young people and suicide prevention professionals think the social media industry and policymakers should take to improve online safety. A nested cross-sectional study within a Delphi consensus approach JOURNAL=Frontiers in Child and Adolescent Psychiatry VOLUME=2 YEAR=2023 URL=https://www.frontiersin.org/journals/child-and-adolescent-psychiatry/articles/10.3389/frcha.2023.1274263 DOI=10.3389/frcha.2023.1274263 ISSN=2813-4540 ABSTRACT=Introduction

Concerns exist about the relationship between social media and youth self-harm and suicide. The study aims were to examine the extent to which young people and suicide prevention professionals agreed on: (1) the utility of actions that social media companies currently take in response to self-harm and suicide-related content; and (2) further steps that the social media industry and policymakers could take to improve online safety.

Methods

This was a cross-sectional survey study nested within a larger Delphi expert consensus study. A systematic search of peer-reviewed and grey literature, together with roundtables with social media companies, policymakers, and young people, informed the questionnaire development. Two expert panels were developed to participate in the overarching Delphi study, one of young people and one of suicide prevention experts; of these, 43 young people and 23 professionals participated in the current study. The proportions of participants "strongly agreeing", "somewhat agreeing", "neither agreeing nor disagreeing", and "somewhat disagreeing" or "strongly disagreeing" with each item were calculated; items that achieved ≥80% agreement from both panels were considered strongly endorsed.

Results

There was limited consensus across the two groups regarding the utility of the safety strategies currently employed by companies. However, both groups largely agreed that self-harm and suicide-related content should be restricted. Both groups also agreed that companies should have clear policies covering content promoting self-harm or suicide, graphic depictions of self-harm or suicide, and games, pacts and hoaxes. There was moderate agreement that companies should use artificial intelligence to send resources to users at risk. Just over half of professionals and just under half of young people agreed that social media companies should be regulated by government. There was strong support for governments to require schools to educate students on safe online communication. There was also strong support for international collaboration to better coordinate efforts.

Discussion

Study findings reflect the complexity associated with trying to minimise the risks of communicating online about self-harm or suicide whilst capitalising on the benefits. However, a clear message was the need for better collaboration between policymakers and the social media industry, and between government and its international counterparts.