Instagram Refuses To Meet About Fairer Moderation

Yesterday, a group of artists, pole dancers, sex workers, allies, LGBTQIA+ people and more gathered outside Instagram’s London HQ for a dance party/protest – only to be ignored by Insta. In the past month or so, more news has come out about Instagram and Facebook’s moderation techniques, EveryBODYVisible has come back from the Christmas holidays, and I have spoken at an algorithm bias conference at Somerset House, where I presented experiences from our campaign and my recommendations for fairer moderation. The Algorithms For Her conference really opened my mind to fantastic studies and to huge issues with algorithmic moderation and regulation, so I thought I’d share some of this news and these resources with you, together with others gathered over the past few months, to raise some important questions that, sadly, the platform still refuses to answer.

Picture by: Angela Christofilou

Instagram Dance Party By @katsandcrows

I’m writing this while still reeling from yesterday’s dance party outside Facebook and Instagram’s London HQ, organised by sex worker, performer and activist Rebecca Crow – aka @katsandcrows – bringing together sex workers, pole dancers, athletes, performers, LGBTQIA+ people, people of colour, allies and many more.

The dance party was an occasion to protest against IG’s unfair moderation the only way we know how: getting our heels on and dancing to some hits. We had some very interesting covers…

Side To Side – The Dance Party Version

It was so fantastic to meet like-minded people from all walks of life and to hear their incredible stories. We all came from different backgrounds, but we were united by the fact that, until now, Instagram had been a tool for empowerment – an empowerment that is slowly being taken away by IG’s current moderation process and community guidelines. I heard heart-warming stories of how people incorporate sex positivity and body positivity into their parenting, and stories of wheelchair users who came to support sex workers because of the positive impact they have had on their lives.

What we didn’t hear, unfortunately, was Instagram’s voice. They didn’t show up at all, they didn’t let us in for a meeting and, apparently, they snuck out the back door to avoid being locked in like the last time Rebecca organised a protest. Back then, apparently, security barricaded IG employees into the building and they weren’t allowed to leave even after working hours. This time, they just remained silent and ignored us. Kinda like they do when we ask for clarity on how their algorithms work.

Picture by: Maria Evrenos

I was totally mind-blown to see people I didn’t know carrying EveryBODYVisible signs and shouting “EveryBODYVisible” among the protest chants. We’ve amassed a large following and plenty of media coverage in the past few months, but seeing that offline, with my own eyes, really was something.

Picture by: Angela Christofilou

Here’s Rebecca talking to EveryBODYVisible about the current state of Instagram’s moderation and how it affects a variety of communities. She shares some worrying facts about how celebrities’ accounts are completely whitelisted – even if their aesthetic is what Instagram would ban on a non-famous profile – and about how hackers have been exploiting ‘sexy’ IG accounts that have been banned.


@katsandcrows on sworker rights, hacking scams & FB/IG DANCE PARTY PROTEST 30/1! @everybodyvisible is 𝐩𝐫𝐨𝐮𝐝 𝐭𝐨 𝐬𝐮𝐩𝐩𝐨𝐫𝐭 @Katsandcrows 𝐑𝐞𝐛𝐞𝐜𝐜𝐚 𝐂𝐫𝐨𝐰 𝐩𝐫𝐨𝐭𝐞𝐬𝐭𝐢𝐧𝐠 𝐈𝐧𝐬𝐭𝐚𝐠𝐫𝐚𝐦 𝐓𝐡𝐮𝐫𝐬𝐝𝐚𝐲 𝟑𝟎𝐭𝐡 𝐉𝐚𝐧𝐮𝐚𝐫𝐲 𝟏𝐩𝐦 𝐰𝐢𝐭𝐡 𝐚 𝐝𝐚𝐧𝐜𝐞 𝐩𝐚𝐫𝐭𝐲 𝐨𝐮𝐭𝐬𝐢𝐝𝐞 𝐭𝐡𝐞 𝐋𝐨𝐧𝐝𝐨𝐧 𝐈𝐧𝐬𝐭𝐚𝐠𝐫𝐚𝐦 𝐨𝐟𝐟𝐢𝐜𝐞 𝐍𝗪𝟏 𝟑𝐅𝐆!✊⬇️⬇️ READ ON…

‘I’m Rebecca, I joined Instagram about 8 years ago, after joining @suicidegirls and seeing a whole bunch of the girls on the platform. At the time I was sharing photos of my food and cats. I’ve built my brand on Instagram now as a way to promote my work as a model and performer, plus trying to make the world a better place!

What pisses me off about Instagram? They’ve enabled several companies to sweep in and take advantage of peoples account deactivations by offering reactivations for a cost. They don’t address the blatantly obvious protection that celebrities and large corporations have from the Terms of Service, when regular people are left in the dark regarding their account deactivation. Instagram must take responsibility for the huge influence they have on public opinion and listen to the people it’s creating harmful stigma against.

After organising a protest outside the Instagram offices in London last year, and similar protests took place in the USA, we were hopeful for some communication. This has not been the case…

The unjust deactivations have been felt across so many communities and the impact is being felt by everyone, including myself. I had my Instagram deactivated between Christmas Day and NYE – my main sales period.

I chose a dance party protest because I wanted to bring a positive vibe to this next protest. There’s a whole load of shit going on in the world right now and I felt like we needed it.

At the last protest they hired extra security and blocked the doors in and out of the building, so I sure hope they can schedule a meeting before the end of their work day!

All bodies, all humans and all allies are invited!

(@katsandcrows CONTINUES IN COMMENTS)’

A post shared by EveryBODYVisible (@everybodyvisible)

The Latest In Instagram Moderation World

This month, former deputy PM Sir Nick Clegg – now Facebook’s head of global affairs – said Facebook is failing to eradicate harmful content because it’s too big, prompting charities to call for the Government to “urgently” set up an online watchdog to protect children.

At around the same time, Facebook announced it will be hiring 1,000 more people in the UK to deal with harmful content. The BBC wrote: “Pressure has been growing on social media firms to remove posts promoting self-harm and political extremism.” It remains to be seen how the new hires will approach nudity.

January was a whirlwind in the Internet moderation world. As XBIZ reports, porn performer Alana Evans’ IG profile was removed in error after a user posted harassing comments on her posts. The user has since been removed from the platform, Evans’ account has been restored and Instagram issued an official apology to Evans.

In the course of this investigation, we mistakenly disabled Alana’s Instagram account. We sincerely apologise to Alana for this error, and we’ve taken steps to protect her account from future mistakes like this.

Instagram Communications Officer Stephanie Otway

Less related to nudity, but still along the lines of Instagram ghosting its users: a Vice article found that hacked Instagram influencers have had to rely on hackers to recover their accounts. Referring to internal documents obtained by the Motherboard team, they write:

Victims say that Instagram’s process for recovering accounts is so cumbersome that they’ve had to rely on third-party social media experts and, in some cases, white-hat hackers to help them regain access while Instagram itself was largely silent.

Vice

It seems like Alana has made some progress on the other side of the Atlantic, though, as this Adult Performers Actors Guild tweet seems to show. Hopefully IG will begin listening to other people affected as well.

Academics and Moderation: Slaves To The Algorithm?

But news outlets aren’t the only ones examining Instagram’s unresponsiveness towards its audiences. On January 17th, I spoke at Algorithms For Her, a conference at King’s College London focused on algorithmic bias, presenting the work we have done through EveryBODYVisible and sharing my recommendations for fairer moderation. Thanks to the event, I met a variety of inspiring academics who shared some of their work with me. Sadly, their work proves our point: we know almost nothing about Facebook and Instagram’s moderation techniques, and that lack of clarity is dangerous and damaging to their users.

In a new article published in the UNSW Law Journal, researchers Alice Witt, Nicolas Suzor and Anna Huggins sampled 4,944 images depicting women’s bodies to see how they were moderated on Instagram. They found that up to 22 per cent of the images were potential false positives – “images that do not appear to violate Instagram’s content policies and were removed from the platform” – and argue that this result is a “significant cause for concern” for users on the platform and for moderation in general.

The researchers write:

Content moderation refers to the processes through which platform executives and their moderators – whether humans, artificial intelligence systems or both – set, maintain and enforce the bounds of ‘appropriate’ content based on many factors, including platform-specific rules, cultural norms or legal obligations. We argue that decisions around the appropriateness of content are ultimately regulatory decisions in the way that they attempt to influence or control the types of content we see and how and when we see it.

Alice Witt, Nicolas Suzor and Anna Huggins

As if this were not enough, Sophie Bishop’s article shows how constant changes to social media algorithms leave users relying on ‘algorithm experts’ who share unscientific information and, ultimately, can profit from users’ need to reach new audiences. The article focuses on YouTube’s algorithm, but it can easily be applied to Instagram, further proving how damaging nebulous algorithms can be to social media users whose livelihoods depend on engagement on these platforms.

Instagram Declined To Comment On Pretty Much Anything

I sent my usual contacts at Instagram’s press office some very specific questions about how their algorithms work, following on from recent news stories and announcements. They declined to comment – not sure if I’ve pissed them right off with the work we’ve been doing at EveryBODYVisible (which would be sad, as dialogue is what we’re asking for) or if they legit don’t know the answers. Here were my questions – guess they will go unanswered for a while…

  1. You are recruiting thousands more people to deal with harmful content online here in London. How will that affect women, sex workers, educators, performers, people of colour, LGBTQIA+ folks and athletes? 
  2. How will the new hires work with content containing nudity on Facebook and Instagram?
  3. How many people / reports does it take for a profile to be taken down?
  4. Incel Omid claimed to have taken down a porn star’s account this month, while dancers, performers and pole instructors are having a variety of their videos removed. What can users do when they feel they are being targeted on the platform?
  5. Can you clearly define harmful content in terms of nudity?
  6. Are you planning on revisiting your female nipple ban?
  7. How is age-gating working on Instagram and what are your experiences with this new feature so far?

What Can We Do About It?

  • Keep protesting.
  • Keep spreading the word about the lack of equality on social media platforms.
  • Keep ‘stalking’ your favourite creators, activists, athletes, sex workers etc and like their posts, comment on them, save them, boost their engagement.
  • If you’ve got a platform, a voice, some leverage, use it to support people who are being silenced. Sometimes, even a social media post can go a long way. When it’s not censored, that is.

Here are some insightful and helpful posts by EveryBODYVisible, in case you’re trying to understand or hack the algorithm.


Swipe through to see exactly what is and isn’t allowed. 3 will make your 🤯

In general, both Facebook and Instagram now share policies, meaning that if content or behavior is considered violating on Facebook, it is also considered violating on Instagram.

‘FB/IG state: We restrict the display of nudity and sexual activity on Facebook. We make some exceptions when it is clear that the content is being shared in the context of a protest, for educational or medical reasons, or a similar reason. On the other hand, we default to removing sexual imagery to prevent non-consensual or underage content from being shared’ – Facebook Community Standards Enforcement report November 2019 (which included data on IG for the first time).

New developments: 2020 will see the roll out of a FB/IG ‘Oversight Board’ employing 20 experts (eventually 40 experts). Funded by FB with a 130 million 💵 investment – but fully independent of Facebook, it will review cases of deactivated accounts and deleted posts, and advise and guide on fairer policy making (such as The Great Instagram Nipple Controversy). Cases to be heard will be selected by committee, and the results made public. We expect the queue to have cases heard to be long…

You may have missed the December introduction of an option to restrict your content to an 18+ audience on IG. So far this option is only available to Creator/Business accounts. The limited feedback we have heard from people who’ve tried it is ‘a small growth in followers (including fake ones)’ and ‘some messages not showing up in group chat’.

Finally – thanks for your patience whilst we took a holiday break from posting. More shadowbanned user stories and ban-beating tips coming soon. Look forward to your comments – here’s hoping we all keep happy, healthy, creative and visible in 2020!

🛠 #instagramtips #instagramtipsandtricks #instagramcensorship #biasedbanning #feminism #sexism #smashthepatriarchy

A post shared by EveryBODYVisible (@everybodyvisible)


This post covers FB/IG new-ish Sexual Solicitation rules because they’re currently catching out some artists, sworkers, performers, etc who haven’t realized they are inadvertently violating these guidelines.

Basically, IF you have ‘suggestive’ (but IG-censored e.g: pixilated/stickered/scantily-clad but not fully nude etc) sexy content on your feed/stories

AND you OFFER or ASK viewers to contact you to see/buy the UNCENSORED version on your Pætrë0n/ÕñlyFãñs page –

THEN you can be banned, blocked or deactivated for ‘sexual solicitation’.

To qualify as breaching the sexual solicitation guidelines, 𝐲𝐨𝐮𝐫 𝐜𝐨𝐧𝐭𝐞𝐧𝐭 𝐦𝐮𝐬𝐭 𝐛𝐫𝐞𝐚𝐜𝐡 𝐁𝐎𝐓𝐇 𝐨𝐟 𝐓𝗪𝐎 𝐜𝐫𝐢𝐭𝐞𝐫𝐢𝐚.

1. Implicitly or indirectly 𝐎𝐅𝐅𝐄𝐑 𝐨𝐫 𝐀𝐒𝐊 for nudes, sex or sex chat & provide a method of getting in contact

𝐏𝐋𝐔𝐒

2. 𝐂𝐨𝐧𝐭𝐚𝐢𝐧 ‘𝐒𝐮𝐠𝐠𝐞𝐬𝐭𝐢𝐯𝐞’ 𝐞𝐥𝐞𝐦𝐞𝐧𝐭𝐬. FB explain: ‘Imagery of real individuals with nudity covered by human parts, objects or digital obstruction, including long shots of fully nude bottoms.’

And sexy emojis (👅peaches, eggplants, 💦 etc) are a big no-no, along with ‘Regional sexualised slang’

Mentions/depictions of sexual activity (including hand-drawn, digital or real world art) such as: sexual roles, sex positions, fetish scenarios, state of arousal, sexual intercourse or activity (sexual penetration or self-pleasuring).’

You also can’t share links to external ‘pornographic’ sites (FB often seems to view Patr€0n/OF as porny) or use detailed sexually explicit language.

To lower your risk: Don’t use a sexy profile picture. Don’t directly link to PatreOn or ÖnlyFañ$ and obscure them in captions. Use linktree or better still, your own site and ensure the landing page isn’t explicit. Remember IG can scan for skin & body hair as well as trigger words, links, DMs, stories, comments, hashtags and posts. Check your hashtags aren’t spammy, banned or restricted (no recent posts showing).

The sexual solicitation rules are clearly designed to make IG-life VERY hard for SW and are causing problems for other IG users as well.

Please share your thoughts & tips in comments. 💜 & thanks. Stay safe.

A post shared by EveryBODYVisible (@everybodyvisible)

For now, over and out.
