Petition update: sharing community experiences of moderation with big tech

After Instagram announced updates to their Terms of Use in November 2020, a petition I started with a variety of accounts posting content related to nudity, sexuality and performance was signed by over 100,000 people. As a result of the petition and of connections via academic circles, I was able to speak to members of Facebook/Instagram’s policy teams to relay users’ concerns about the moderation of nudity and sexuality on the platform. This blog post is an update on the petition and moderation of nudity and sexuality since these conversations started.

Communications with Facebook / Instagram following the ToS Petition

This update on the state of moderation of nudity and sexuality and on conversations with FB/IG following the petition has been a long time coming. I launched the change.org petition at the end of November, and by the week the ToS were going to come into force I ended up speaking with someone at Facebook policy – the day before my PhD thesis defense!

In December, my contact at Facebook policy reiterated the importance of engaging different communities to determine the effectiveness of, and issues with, the platform’s moderation. They were interested in learning more about the concerns and issues of users posting nudity and sexuality on Instagram.

As a result, @thequeenofsexy created a survey to gather these experiences, with my input and the input of @samantha.ssun, @fladdison, @the.nicky.ninedoors, @avahennessyy, @riotsandcrows, @jordankensley and @novacainedances_llc. I communicated the outcome of this survey to Facebook/Instagram in February.

It took a long time to have any news because my contact went on maternity leave. Recently, though, I’ve been able to speak to policy employees across the United States and European Union teams about the possibility of holding an online community event for FB to clarify their policies and hear communities’ concerns about posting content featuring nudity and sexuality.

A community event to discuss moderation policies?

The shape and ‘rules’ of the event are something FB/IG are still trying to figure out. They’re hoping to involve different team members who deal with different aspects of nudity policies, and they’re hoping to make the event accessible online to different user communities across different time zones.

The aim will be two-fold: giving users clarity about FB/IG’s moderation policies and their implementation, and allowing users and their concerns to be heard. For the event to take place, I have to identify the different themes for FB/IG to address and work out what shape the discussion should take.

Themes already identified through the petition and survey include:

  • Content recommendation policy (shadowban);
  • Solicitation policy;
  • Inconsistencies in the moderation of content by celebrity accounts;
  • What to do when accounts are wrongly deleted and/or shadowbanned;
  • Inconsistency between the moderation of accounts sending harassment via DMs or comments and the moderation of people banned for nudity;
  • Account verification.

Specific, personal concerns about the moderation of individual accounts will not be addressed, because Facebook would have to review each account for that to happen. But I’d love for everyone to comment and get involved with their GENERAL concerns, so I can identify more moderation themes to communicate to Facebook.

Post ToS Petition survey results

The survey created by @thequeenofsexy, which was shared and filled out by 250 accounts posting nudity and sexuality on Instagram, proved to be a very informative (and depressing) snapshot of IG’s moderation of this type of content. The survey already highlighted some key themes that are worth discussing further in a possible community event.

This survey, filled out by a majority of women (73%), revealed that 66% of respondents felt discriminated against by IG/FB. Respondents included an overlapping set of: lingerie/apparel accounts (65%), sex workers (46%), LGBTQIA+ accounts (46%), activists (43%), photographers/artists (43%), models (42%), pole dancers (36%), disabled people (15%), fat people (21%) and BIPOC accounts (9%).

The biggest challenges we faced with the survey were getting people to trust a form whose results would be communicated to Facebook, and reaching BIPOC communities – 82% of our respondents were white.

What the survey really helped with was highlighting the moderation challenges and inconsistencies faced by users. 15% of the users surveyed had their account deleted without warning. Many claimed to have lost money due to FB/IG’s moderation: 33% of respondents claimed to have lost hundreds in their respective currencies, while 12% reported losses nearing a thousand and 9% losses in the thousands. Issues users mentioned included:

  • Loss of engagement resulting in missed brand partnerships or loss of earnings
  • Loss of opportunities – e.g. TV series development, performing opportunities or networking – due to account deletion and/or loss of visibility
  • Deletion of content and/or accounts that comply with community guidelines – e.g. fine art
  • Loss of contact with local networks and communities resulting in loss of information on food, taxes, medical and housing assistance
  • Deletion of sex education content previously featured in and endorsed by mainstream media outlets.

The main moderation challenges highlighted by the survey were:

  • Inconsistency in deletion of content that seems to comply with community guidelines
  • Inability to be verified, resulting in exposure to scam artists and impersonation
  • Vague feedback on why accounts or content were deleted
  • Inconsistency in moderation of content by celebrities or large companies like Playboy – with users lamenting that simply reposting a Playboy post would result in their reposts being taken down for violation of community guidelines
  • Lack of access to FB/IG to ask for explanations in order to better comply with community guidelines.

These challenges had a major impact on users, who said they were so disheartened by Instagram’s moderation that they no longer wanted to post, or felt that the powerlessness of working on the platform while facing heavy moderation resulted in depression, anxiety and a sense of being gaslit. This is consistent with Hacking/Hustling’s brilliant “Posting Into The Void” report, which is a must-read.

The inability to interact, advertise and keep in touch with customers also had an economic impact on many users, resulting in the worsening of their mental health due to the stress of not making ends meet.

My contacts at FB/IG were horrified when they heard about some of the effects their moderation policies had on accounts posting content featuring nudity and sexuality. The survey highlighted the effects of the lack of access to FB/IG and of poor communication with users affected by policies, showing that users’ inability to reach out to the platform has been a key moderation challenge on both sides.

Allowing users to voice their concerns and to have at least a partially direct line with FB/IG to understand how moderation policies are applied to accounts will hopefully address some of the issues highlighted in the survey.

What we know so far following communications with Facebook / Instagram’s policy teams

Aside from conversations about the potential to hold an event, relaying communities’ concerns and survey results to FB/IG after the petition provided some insights on the implementation of their moderation policies. Those insights relate to the shadowban, the moderation of celebrities’ accounts and Facebook/Instagram’s actions following mass account flagging.

It appears that what we knew as the ‘shadowban’ – a term that, FB/IG reiterated, they do not use – has changed. They now remove content from the Explore page even if it isn’t ‘borderline’ and doesn’t violate community guidelines, simply as an editorial choice.

As we suspected, though, moderation of content isn’t equal: celebrities’ accounts get a double review of content, which is why they hardly ever get banned. It wasn’t clear whether the double review is doubly algorithmic, or starts with an algorithm followed by a human moderator. Either way, everyday users don’t have the luxury of a double review – so even if their content is wrongly banned or censored by an algorithm, it takes longer for it to be restored… if it is restored at all.

Finally, when I mentioned accounts being taken down due to mass flagging – e.g. the incel campaign which resulted in Exotic Cancer’s account suspension in 2019 – FB/IG said they don’t tend to take down accounts in cases of mass flagging, and only do so when the accounts themselves violate policies. This, however, contradicts various accounts’ past experiences.

All of the above raises important questions about the lack of platform-to-user communication that characterises our experience with Facebook and Instagram. It’s also one more reason to advocate for more investment in human moderation: while it’s fair enough for celebrities to have a double review, it’s only fair that users who are often targeted with account or content deletion should have access to something similar, or at least to a more streamlined and direct communication process about the fate of their own content.

Where can moderation of nudity go from here?

News about the implementation of these policies, and these conversations, comes at a time when Facebook seems to want to be seen to be doing something about the human rights impact of their moderation.

In mid-March, the company announced they’ll be creating a “corporate human rights policy” based on the United Nations’ Guiding Principles on Business and Human Rights. On the announcement page, Facebook claim this policy will mean “changing key content policies, including creating a new policy to remove verified misinformation and unverifiable rumors that may put people at risk for imminent physical harm” and helping journalists tackle harassment; according to Bloomberg, however, they are not making any changes to current community standards, privacy policies or code of conduct. The policy seems to be more of a framework for how FB/IG plan to handle issues in the future and, potentially, another way to hold the company accountable if it fails. The immediate results of this new direction seem to be an annual report on Facebook’s impact on human rights and a fund for human rights defenders.

This is a welcome change, and it sounds like a major one: as I argue in a recent academic paper, if social media spaces are akin to shopping malls, they have to recognise the rights of the customers they host. Yet, to make sure these changes aren’t merely cosmetic, I am hoping that events engaging different communities – including accounts posting nudity and sexuality content – will help make Facebook more human rights compliant.

While we can’t be sure whether discussing concerns and questions at a potential event will mean Facebook/Instagram’s policies will change, guidelines have in the past been informed by and updated through community engagement.

The fact that Facebook/Instagram are open to hearing from and interacting with users posting nudity and sexuality at all – even if the shape of this interaction hasn’t been decided and will most likely not please everyone – is a major step forward. Here’s to hoping.

Share and invite your followers to comment on this post to help me identify more issues with moderation
