A report I wrote as part of my postdoc in collaboration with The World Wide Web Foundation and Superbloom, featuring social media policy recommendations I co-designed with marginalised and/or impacted users, is coming out today.
About the report
What would platform governance look like if users affected by it could get a seat at the table to design the policies that will govern their social media experience? As a user with personal experiences of online abuse and de-platforming and as a platform governance researcher, I was interested in re-designing platform policies surrounding de-platforming, also known as account deletion, and malicious flagging, or the misuse of the reporting tool to silence specific accounts.
The report that I’m launching today features policy recommendations to tackle platform governance inequalities in the enforcement of flagging and de-platforming on social media. These recommendations are co-designed with users affected by these forms of governance – users from diverse communities of sex workers, nude content creators, sex educators, LGBTQIA+ accounts, brands, activists and journalists from different corners of the world.
Today’s report, which will later become an academic paper, represents the third stage of the postdoctoral research project I am leading at the Centre for Digital Citizens (CDC). It is backed up by a qualitative survey and interviews that you, my followers, either participated in or helped me share, just like you shared my call for participation in the workshops that generated these policies. THANK YOU for all your help, work and support!
This report provides social media platforms with user-centred and research-informed recommendations to improve the design and effectiveness of their flagging and appeals tools. Research has shown these affordances to be inadequate at tackling online abuse, and to provide malicious actors with opportunities to exploit strict platform governance to de-platform users with whom they disagree. This has disproportionately affected marginalised users such as sex workers, LGBTQIA+ and BIPOC users, nude and body-positive content creators and pole dancers, as well as journalists and activists.
My research, and in particular a recent study with my boss Prof. Pam Briggs, has found that being de-platformed from social media often leaves users unable to access work opportunities, information, education, and their communities and networks – with devastating impacts on mental health and wellbeing. Yet, in my research and personal interactions with social media companies, I have learnt that platforms too often award only sparing time, resources and attention to engaging with the stakeholders who are directly affected by their technology.
Contextualising the report
The idea underpinning this report is that content moderation often prioritises speed and platform interests over the human experience, lacking the necessary empathy for users who are experiencing abuse, censorship, emotional distress, and the loss of livelihoods and networks. As a result, this report is a free resource both for users, to feel seen in a governance process that often erases them, and, crucially, for platform workers, so that stakeholder engagement is not sidestepped in the drafting of their policies.
I had planned to release this report weeks before the deletion of swathes of sex-working, nude, kink and sex-positive accounts from Instagram in late June, and before my agent Helena from Lover Management started the #stopdeletingus protest with Klub Verboten. Nonetheless, given that users still know very little about those deletions, this report comes at an important time to continue asking social media platforms for fairness and transparency towards marginalised communities.
Study participants and policy co-designers
The policy recommendations in the report were co-designed in three workshops, structured according to The World Wide Web Foundation’s Tech Policy Design Labs (TPDLs) playbook, with 45 end-users who are too often ignored in the drafting of the rules governing the spaces they depend on for their social and work lives.
As in previously documented experiences of censorship, women were over-represented among those who joined the workshops. Participants included both cis and transgender women and men, as well as non-binary people. They were both heterosexual and from the LGBTQIA+ community, and were located in and/or from a diverse set of locations: Europe (the United Kingdom, France, Germany); the United States; South America (Chile and Brazil); Asia (India, China, Hong Kong); and Africa (Ghana, Kenya, Nigeria, South Africa, Uganda).
With these communities, I wanted to reimagine content moderation policies outside of Big Tech and challenge social media platforms to instil empathy into content moderation. Rather than being treated as passive research subjects, these users became co-designers and co-rule-makers, answering the questions:
- How might we protect users who think they are being censored and/or de-platformed because of mass reports?
- How might we help users after they experience censorship and/or de-platforming?
Policy recommendations
The solutions participants proposed to tackle malicious flagging and de-platforming focus on improving transparency, safety and equality, education, design, fairness, due process, contextual knowledge, community support and signposting in platform governance. Many of these recommendations require action by platforms, but participants also shared recommendations for governments and civil society organisations to step in and help users affected by governance inequalities. These can be found in more detail in the full report, and largely revolve around the following three changes:
- Communication: Transparency and detailed information about content moderation educates and empowers users, improving due process.
- Recognising vulnerability: Allowing users to report themselves as belonging to protected categories can mitigate current platform governance inequalities, fast-tracking appeals and helping in situations where users face abuse.
- Granular content controls: Making ALL content visible to everyone causes more governance issues than it solves. Empowering users to choose what they see through more granular content controls, and through periodic check-ins about the ability to opt in to or out of certain posts, can help mitigate this.
However, to achieve real, systemic change, users pushed for radical transparency and, crucially, for a duty of care from platform conglomerates, demanding information, workers’ rights, and compensation when platforms fall short of protecting their users from censorship and/or abuse. Indeed, they found that current legislation and societal approaches fall short of protecting them: without recognising content creation and platform work as labour, and without recognising users’ right to express themselves sexually and work through their bodies, platform governance will continue to replicate the offline inequalities it is marred by.
Download the full report
This report is a first step towards ensuring that the needs of users who are too often designed out of Big Tech spaces are heard and make it into the rooms where decisions about their online lives are made. Please share it far and wide.
For more information, contact: Carolina.are@northumbria.ac.uk.