Updates on shadowbanning, nudity and ads on Instagram

Instagram has recently clarified its policies on shadowbanning, visibility and content recommendation, both in a public post and in my communications with its policy team. This post sheds some light on those policies, questions some of their elements and discusses what they may mean for accounts posting content featuring nudity and sexuality (including pole dance).

History of shadowbanning on Instagram

If you read this blog you’re probably already aware that content featuring nudity and sexuality – from sex education to sex work – isn’t always welcome on Instagram. Since 2019, the platform has reportedly been applying a practice known to users as shadowbanning: a form of covert censorship where content isn’t shown to users outside your following, limiting your reach and the searchability of your account. Shadowbanning first hit sex workers, who are targeted by social media platforms left, right and centre, before trickling down to a variety of content depicting nudity and sexuality, including pole dance.

After the shadowbanning of pole dancing hashtags, IG apologised to pole dancers through my blog in 2019. They denied the existence of shadowbanning until June 2020, when, following the #BlackLivesMatter protests, they acknowledged it was affecting Black creators.

Nudity and sexuality remain controversial for Instagram, with the platform often choosing to delete content depicting them outright or, at the very least, to limit its reach. IG’s new Terms of Use, published in December, cemented the belief that Instagram shouldn’t be a place for nudity and sexuality.

As a result of both my academic work (see my latest paper on the shadowban here) and of a petition I created together with various other accounts (supported by over 100,000 people), I have been able to communicate with Facebook/Instagram’s policy team to convey the issues and concerns of users who face heavy moderation on Instagram for posting content featuring nudity and sexuality.

One of the main features of my experience as a censored user – and one many people I’ve spoken with share – has been Instagram’s lack of clarity about the discrepancy between their policies and the way they implement them. Now, it seems, the platform is striving to provide more clarity to its users, both through public posts and through its communications with me. Here’s what I’ve been able to find out about how these recent updates will affect users like me.

Instagram clarified how shadowbanning works in 2021

After stating that users can’t trust what they don’t understand, last week Instagram addressed shadowbanning in a public post. They wrote:

We recognize that we haven’t always done enough to explain why we take down content when we do, what is recommendable and what isn’t, and how Instagram works more broadly. As a result, we understand people are inevitably going to come to their own conclusions about why something happened, and that those conclusions may leave people feeling confused or victimized.

Shedding More Light on How Instagram Works

As I argue in my recent paper, shadowbanning was once used by the American alt-right as a censorship conspiracy theory. The fact that platforms do not clarify their internal mechanisms makes this inevitable, turning shadowbanning and “the algorithm” into evil, mythical creatures that get blamed even for what is simply poor engagement or content that doesn’t perform well. This isn’t users’ fault: when faced with uncertainty, it’s only natural to look for an explanation. It’s encouraging to see Instagram acknowledge this. They wrote:

We can’t promise you that you’ll consistently reach the same amount of people when you post. The truth is most of your followers won’t see what you share, because most look at less than half of their Feed. But we can be more transparent about why we take things down when we do, work to make fewer mistakes, and fix them quickly when we do, and better explain how our systems work. We’re developing better in-app notifications so people know in the moment why, for instance, their post was taken down, and exploring ways to let people know when what they post goes against our Recommendations Guidelines.

Shedding More Light on How Instagram Works

In short, more clarity has been promised but not yet delivered. On top of IG’s own comms, I’m really hoping that an online community event – something I’ve been discussing with the policy team this year – where users posting nudity and sexuality can ask the platform direct questions about how their content is moderated will happen later this year.

Content recommendation policies (aka the new shadowbanning)

Content recommendations on Instagram are essentially the policies governing which posts the platform chooses to surface and show to people beyond your existing followers. They are closely related to shadowbanning: if your content is shadowbanned, it’s deemed not worth recommending and falls flat.

In last week’s post, IG explained that the platform doesn’t have a single algorithm overseeing content on the app; rather, there are a variety of algorithms, classifiers and processes, each with its own purpose. Feed, Explore and Reels each have tailored algorithms that make predictions – educated guesses – based on your activity.
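To make “predictions and educated guesses” a little more concrete, here is a minimal, purely illustrative Python sketch of how a recommendation score could be built from predicted interactions. The signal names, weights and numbers are my own assumptions for illustration – this is not Instagram’s actual model or code.

```python
# Toy illustration of ranking by predicted interactions.
# Signals, weights and values are invented for illustration only –
# they are NOT Instagram's real model or parameters.

def recommendation_score(predicted: dict[str, float], weights: dict[str, float]) -> float:
    """Combine predicted interaction probabilities into a single ranking score."""
    return sum(weight * predicted.get(signal, 0.0) for signal, weight in weights.items())

# Hypothetical predictions for one post, given a viewer's past activity.
predicted_interactions = {"like": 0.40, "comment": 0.05, "save": 0.10, "share": 0.02}

# Hypothetical importance of each interaction type on a given surface (e.g. Feed).
signal_weights = {"like": 1.0, "comment": 2.0, "save": 3.0, "share": 4.0}

print(f"score = {recommendation_score(predicted_interactions, signal_weights):.2f}")
```

In a setup like this, Feed, Explore and Reels would simply use different predictions and weights, which is consistent with the claim that there isn’t one single algorithm.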

However, accounts posting content related to nudity and sexuality are still affected by Instagram’s content recommendation policies, even if the platform doesn’t call this “shadowbanning”. IG’s recently published content recommendation guidelines state they “avoid making recommendations” of content that isn’t outright deleted but “that could be low-quality, objectionable, or sensitive” or “that may be inappropriate for younger viewers.”

“Content that may be sexually explicit or suggestive, such as pictures of people in see-through clothing” falls within the realm of “content that impedes our ability to foster a safe community.” While it sounds like common sense to class much of this content as unsafe – e.g. discussions of self-harm, suicide and eating disorders, content depicting violence, content promoting the use of unregulated products (e.g. tobacco, drugs, adult services) – the fact that nudity is lumped in with eating disorders, violence and suicide is striking.

Is your account non-recommendable?

Despite what a variety of users I’ve spoken with say, Instagram’s policy team have told me that an account’s chosen gender doesn’t affect how they enforce their policies. This contradicts the engagement boosts that a variety of accounts have reported after changing their listed gender to male. However, other aspects of an account’s behaviour do affect how and whether it is recommended.

In yet another interesting use of language – almost a throwback to the “borderline” or “vaguely inappropriate” days – Instagram also avoid recommending posts by “non-recommendable accounts”. Non-recommendable accounts are accounts that:

“1. Recently violated Instagram’s Community Guidelines. (This does not include accounts that we otherwise remove from our platforms for violating Instagram’s Community Guidelines.)

2. Repeatedly and/or recently shared content we try not to recommend.

3. Repeatedly posted vaccine-related misinformation that has been widely debunked by leading global health organizations.

4. Repeatedly engaged in misleading practices to build followings, such as purchasing likes.

5. Have been banned from running ads on our platforms.

6. Recently and repeatedly posted false information as determined by independent third party fact-checkers or certain expert organizations.

7. Are associated with offline movements or organizations that are tied to violence.”

What are recommendations on Instagram?

In short, if you’ve done any of the above – and this includes engagement pods, giveaways, or being put on the naughty step for nudity – you are non-recommendable, aka shadowbanned.

Policies governing pole dancing, sex work and online moderation gray areas

Instagram’s Adult Nudity guidelines ban sexual intercourse and the display of genitalia. Female nipples are also banned, unless they are shown in an act of protest, during breastfeeding, or in images depicting illness and/or cancer.

As already discussed in my post from December, under the guidelines governing sexual solicitation, content has to meet a double strike of suggestion and solicitation (together) for the platform to take action. However, according to user feedback relayed to me by accounts in my communities, content that doesn’t meet either of these criteria has often been flagged and/or deleted.

Instagram/Facebook’s “strikes” system, which governs whether your account can be taken down, has also been clarified. The platform stated that they won’t count strikes on violating content posted over 90 days ago for most violations, or over four years ago for more severe violations. They also won’t count strikes for certain policy violations, but which ones isn’t specified in the guidelines.

All strikes on Facebook or Instagram expire after one year. Restrictions escalate according to the following process (a small illustrative sketch follows the quoted list below):

“One strike: Warning and no further restrictions.

Two strikes: One-day restriction from creating content, such as posting, commenting, using Facebook Live or creating a Page.

Three strikes: 3-day restriction from creating content.

Four strikes: 7-day restriction from creating content.

Five or more strikes: 30-day restriction from creating content.”

Restricting accounts
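To show how the quoted escalation reads in practice, here’s a small, purely illustrative Python sketch that maps a strike count to the restrictions listed above. It is only a reading aid based on the published list, not Facebook/Instagram’s actual implementation, and it ignores the expiry and severity rules mentioned earlier.

```python
# Illustrative mapping of strike counts to restrictions, following the quoted
# guidelines above. Not Facebook/Instagram's real code or complete policy.

def restriction_for(strikes: int) -> str:
    """Return the restriction the published strikes policy describes."""
    if strikes <= 0:
        return "no restriction"
    if strikes == 1:
        return "warning, no further restrictions"
    if strikes == 2:
        return "1-day restriction from creating content"
    if strikes == 3:
        return "3-day restriction from creating content"
    if strikes == 4:
        return "7-day restriction from creating content"
    return "30-day restriction from creating content"  # 5 or more strikes

for count in range(7):
    print(count, "strike(s):", restriction_for(count))
```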

The guidelines become less clear – and more concerning – when it comes to certain grey areas. For example, Instagram ban the “presence of by-products of sexual activity,” but what these are isn’t clarified. The use of sex toys, whether above or under clothing, isn’t permitted – which may affect sex toy brands and educators using sex toys in a non-sexual manner.

And then of course, there’s the amusing but appalling question of how we can or should grab our breasts. IG ban: “Squeezing female breasts, defined as a grabbing motion with curved fingers that shows both marks and clear shape change of the breasts.”

Why you may not be able to advertise your sexy brand on Instagram

A recurring complaint I hear from pole dance and sexy brands is that they can’t advertise on Facebook and Instagram.

Facebook/Instagram’s ads guidelines regarding adult nudity and sexual activity are even more conservative than their content recommendation policies, and unlike with shadowbanning, your ad may be refused outright: “Ads must not contain adult content. This includes nudity, depictions of people in explicit or suggestive positions, or activities that are overly suggestive or sexually provocative.”

The screenshot of the guidelines page below gives you an idea of the type of content Facebook/Instagram consider suggestive:

This is confusing and disappointing, considering that some of these images are artistic and some would most likely be needed to advertise such things as underwear, polewear or fitness classes.

I asked the policy team for more information about the criteria behind these images and guidelines, and while they’re still looking into “implied nudity” – something which already affects TikTok users – they told me that pole dance and sexy brands’ ads should be allowed, but with some caveats.

They said that while the account that posted the content shouldn’t be grounds for discrimination, the ad’s content itself might be: ads that zoom in on buttocks and breasts, or that depict grinding, could be found in violation of the Adult Content policy. In short, once again, nudity becomes a target on Instagram through ads.

The new Victorians

It’s once again reassuring that Instagram is sharing more resources that clarify content moderation and the practices known as shadowbanning. However, even if clarity is an improvement, it doesn’t mean that creators posting nudity and sexuality will have an easier life on the platform.

In short, shadowbanning hasn’t changed – it’s just been moderately clarified. A notification for when your content isn’t recommended will go a long way, and I really welcome that change and hope it will come soon.

Still, clarity about internal policies doesn’t change Instagram’s conservative, puritan and blanket moderation approach when it comes to nudity and sexuality. Instagram’s Adult Nudity guidelines state that:

We restrict the display of nudity or sexual activity because some people in our community may be sensitive to this type of content. Additionally, we default to removing sexual imagery to prevent the sharing of non-consensual or underage content. Restrictions on the display of sexual activity also apply to digitally created content unless it is posted for educational, humorous, or satirical purposes.

Uncovered female nipples except in the context of breastfeeding, birth giving and after-birth moments, health-related situations (for example, post-mastectomy, breast cancer awareness or gender confirmation surgery) or an act of protest.

Instagram’s Adult Nudity guidelines

The above essentially means that because certain users are sensitive to nudity, or because some posts may not be consensual, ALL or large portions of nudity and sexuality on Instagram are restricted. This is like saying that, just because children may be going to the cinema, there should be no nudity or sex scenes in any film. It’s a lazy, blanket approach to moderation that still doesn’t consider the importance of nudity and sexuality as forms of expression, education or work.

I’ve always had a problem with the guidelines regarding nipples, too, because they imply that women’s breasts stop being sexual objects only when they are used in protest, affected by cancer or feeding children. Yet many of us see our breasts every day and do not view them as an invitation to sex.

On top of this, the fact that these guidelines – call them recommendation guidelines, shadowbanning, adult nudity guidelines or what have you – aren’t applied evenly to accounts with different levels of celebrity and notoriety still makes creators’ experience of Instagram frustrating.

In short, although I welcome the fact that Instagram have never been this clear – both in their public-facing communications and in their exchanges with me – the platform’s moderation of content featuring nudity and sexuality remains considerably more conservative than much of the content its users actually post. As Jillian C. York argues in her brilliant book, Silicon Values, social media platforms seem stuck in the Victorian era when it comes to governing nudity and sex.

Recommendations for fairer moderation

It’s reassuring to see a tech giant like Instagram take the critiques they have received into consideration. On top of the clarifications mentioned above, the head of Instagram, Adam Mosseri, has also recently published a post detailing the creation of a new Equity Team for Instagram. The team will look into fairer forms of account verification, tackle harassment and hate, focus on algorithmic fairness and do right by marginalised communities on the platform – all issues previously raised on this blog and by many users and researchers.

Mosseri wrote:

“We’ve created a dedicated product group, the Instagram Equity team, that will focus on better understanding and addressing bias in our product development and people’s experiences on Instagram. The Equity team will focus on creating fair and equitable products. This includes working with Facebook’s Responsible AI team to ensure algorithmic fairness. In addition, they’ll create new features that respond to the needs of underserved communities.”

Equity Work

This is, once again, reassuring, but it may not go far enough in terms of successfully governing something as divisive as nudity and sexuality. As an online moderation expert with a PhD in this field, I’ll offer my two cents on how to make it better.

In my First Monday paper, where I reimagine the definition of social media as “corpo-civic spaces,” I argue that international human rights standards should be at the core of online moderation. In short, if content doesn’t violate someone’s human rights (e.g. abuse or harassment, racism, misogyny, ableism) or local or international law (e.g. terrorism), it should stay up, even if it’s shocking or offensive to some.

In a corpo-civic system, platforms would need to take responsibility for the content posted on them, and governments should hold them to account to make sure they do so. As corporate entities ruling over what has become our public square, they can’t simply decide to ban ALL potentially risky content. That’s just lazy moderation. Instead, platforms need to make considerable investments in better machine learning, more (and more qualified) human moderators, and more efficient, direct appeals systems.

Finally, I believe a complete overhaul of content recommendation systems needs to happen to prevent both platform- and user-enforced censorship. Platforms can’t ban nudity just because some of their users are uncomfortable with it or shouldn’t see it. They should invest in a moderation system that is curated by users and isn’t just based on educated guesses – e.g. if a user isn’t ok with seeing nudity but wants to see moderate violence, they should be able to say so when creating their profile and throughout their experience on the platform. Users shouldn’t have the power to rid the whole platform of nudity, or to take content down through mass flagging just because they don’t like it (as is happening on TikTok).
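As a purely hypothetical sketch of what such user-curated preferences could look like, the snippet below hides a post only for the individual user who opted out of a category, rather than removing it from the platform. The category names and data structures are mine, invented for illustration.

```python
# Hypothetical user-curated content preferences – an illustration of the
# proposal above, not any platform's actual feature.

from dataclasses import dataclass, field

@dataclass
class ContentPreferences:
    """Categories this user has chosen not to see (set at sign-up, editable any time)."""
    hidden_categories: set[str] = field(default_factory=set)

    def wants_to_see(self, post_categories: set[str]) -> bool:
        # The post is hidden for THIS user only; it remains visible to everyone else.
        return not (post_categories & self.hidden_categories)

# Example: a user who opts out of nudity but is fine with moderate violence.
prefs = ContentPreferences(hidden_categories={"nudity"})

print(prefs.wants_to_see({"nudity", "pole dance"}))   # False – hidden for this user
print(prefs.wants_to_see({"moderate violence"}))      # True  – still shown
```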

In short, Instagram is putting considerable effort into clarifying its policies, but the input from users posting nudity and sexuality is still very much needed, and I will keep fighting so that our voices are heard.


Buy Me A Coffee

Like my work? Support me on Buy Me A Coffee.
