What are our rights as users in the sexy social media Wild West? In case you missed it, I went viral and my ass got seriously banned on TikTok a few weeks ago. The irony of a social media moderation and censorship researcher being censored isn’t lost on me, and it’s one of the reasons I began campaigning against censorship in the first place. But after an equally viral interview about my experience with TikTok, I wanted to share a more in-depth look at what it feels like to be a social media user sharing content featuring nudity, sexuality and bodies. Why? Because my experience is yet another example of why, as a social media user, you have very few rights. And that should worry us all.
My experience with social media moderation and censorship
I am a Dr in Criminology, with a PhD in online abuse, conspiracy theories, their moderation and the subcultures that engage with this type of content. I was in my second year of my PhD when, in early 2019, rumours of the Instagram shadowban started sweeping the pole dancing and stripper online communities.
At that time, due to my previous PR/journalism background, I thought it was only fair that, if I slagged off IG in a blog post, I’d have to give them the right to reply – and strangely enough, their comms team did reply. Following widespread hashtag censorship, in the summer of 2019 I got an apology from Instagram about the shadowban of pole dance hashtags. In the same year, I was one of the founders of the global #EveryBODYVisible campaign and have since then continued to fight against censorship through petitions, academic papers and blog posts.
While up until 2020 Instagram was probably THE platform for pole dancers, and fights against censorship were mainly IG-related, the Internet landscape during Covid-19 has changed dramatically. Yes, I am still often shadowbanned on Instagram. But there are huge new players in the game, and they’ve hit the mainstream: from OnlyFans to TikTok, there are more platforms to link to and engage with. And this shows that censorship and the lack of user rights are an Internet-wide problem.
While I don’t have an OnlyFans, I joined TikTok in early 2020, frustrated with the censorship and slow growth I was experiencing on Instagram, and started documenting my experience on the Chinese-owned app in this blog post. I found growing on TikTok easier – a viral video took me to about 8,000 followers in under a couple of months – but slowly, censorship and trolling started catching up with me. That made me drop TikTok for quite a while, but when I became more serious about social media growth again I went back to it. And I went viral again, this time massively, with a short video about being a pole dancer WAP – With A PhD – shortly after finishing my doctorate this February.
@bloggeronpole This week I became a #dr in #criminology & a #poledancer #WAP – With A #PhD! You can learn this #polechoreo & more via my @buymeacoffee in bio 🥰
♬ original sound – bloggeronpole
The video hit 2M views in less than a week, making my following jump to nearly 70,000… and this is where the problems started, and where the sheer lack of user rights on social media became even more apparent to me. Because of the increase in following, I became the target of never-ending online abuse and nasty comments.
Don’t get me wrong: I have a whole PhD in this. I know these are scripts. I know people comment on something because they see a large following, not a human. I also know that when something hits the “wrong” audience – i.e. people who disagree with you on values, aesthetics and the like – they abuse you not because they enjoy it, but most likely because they want to impose the values and beliefs they are passionate about on you. So I wasn’t hurt by the comments themselves – what got to me was their intensity, and the idea of playing catch-up with swathes of nasty, misogynist bullshit, often coming from teens who don’t know better and from actual Karens (that’s what they called themselves on their profiles, so I feel authorised to use the Karen meme even if I hate it) as much as from men.
The (ghost) rights of social media users
What started to really affect me was the inability to post because of the continuous reporting of my videos. As I wrote in my previous blog post about TikTok, to get banned on the platform you’d have to post genitalia or sex acts. Given that I posted neither, the fact that many of my videos were taken down for violating the adult nudity and sexual activity community guidelines can only mean they were flagged by other users. The issue here is: how many flags does it take for content to be taken down?
My situation, and stories from pole dancers and sex workers whose profiles were deleted outright, seem to show that the mechanisms meant to protect users, like content flagging, can be used against them if enough people decide they don’t like some type of content. This isn’t because the content violates guidelines, but because the TikTok algorithm misjudges what users do or don’t want to see. And this isn’t just happening to pole dancers – I was featured in a recent Business Insider story about the same thing happening to transgender creators, whose videos were being shown to transphobic audiences.
Should we really let misogynists decide what content is or isn’t allowed on social media platforms? The obvious answer is no, yet once one of your videos is banned twice on TikTok – as happened to me, with videos banned, restored on appeal, then banned again – you can’t appeal anymore. This is a clear example of TikTok’s infrastructure working against its users and, to be honest, operating nonsensically: why would you ban a video that another moderator has already restored? Shouldn’t there be a way to stop this from happening?
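TikTok doesn’t publish how many reports trigger a takedown, so here is a purely hypothetical Python sketch of what a naive, threshold-based auto-moderation rule could look like – the threshold, the Video class and the report function are all my own illustrative assumptions, not TikTok’s actual system. It shows why brigading works, and why a video a human moderator has already restored can go straight back down:

```python
# Hypothetical sketch of threshold-based auto-moderation.
# TikTok does not publish its real logic; the threshold, the Video
# class and the report() function are assumptions, here only to
# illustrate why mass flagging works as a weapon against creators.

from dataclasses import dataclass

FLAG_THRESHOLD = 10  # assumed: enough reports trigger an automatic takedown


@dataclass
class Video:
    video_id: str
    flags: int = 0
    removed: bool = False
    restored_on_appeal: bool = False  # outcome of a past human review


def report(video: Video) -> None:
    """One user flags the video; enough flags remove it automatically."""
    video.flags += 1
    if video.flags >= FLAG_THRESHOLD:
        # The flaw this post describes: the system never checks
        # restored_on_appeal, so a video a human moderator already
        # cleared gets taken down again by the same counter.
        video.removed = True


# A brigade of ten accounts is enough, regardless of the content:
video = Video("example-viral-video", restored_on_appeal=True)
for _ in range(10):
    report(video)
print(video.removed)  # True: restored or not, it is down again
```

In a sketch like this, the takedown depends only on the number of flags, never on what the content actually is or what a previous appeal decided – which is exactly the failure mode mass reporting exploits.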
Either way, this really started to affect me: I was banned from posting for a while, and even though I still had my profile, I was afraid of being deleted. I was afraid that the trolls who had ganged up on me would try to get me banned on Instagram too, where my main network lies. I was meant to post a video for a big brand partnership, but I was banned from posting – meaning this could hit my earnings. So I started complaining on Twitter, and the wonderful Chris Stokel-Walker, who writes for the BBC, the New York Times and Wired among others, picked up on the irony of a social media moderation researcher being banned and wrote the Input story, which helped me get my posting rights back and showcase my research.
I wish that had been the end of it, but a week after the story I woke up to find my profile fully deleted, only for it to reappear once Chris retweeted my thread about it.
Oh man. @tiktok_uk really hates me. Even following @stokel’s @inputmag story, and admissions from community managers that they got it wrong last week, today I logged in to find that my account was permanently banned. What up? https://t.co/oXUtwmIiy1 pic.twitter.com/UZS0zxhtoh
— Dr Carolina Are / Blogger On Pole (@bloggeronpole) March 2, 2021
What my TikTok drama shows is that, thanks to access to the media, I could get my profile back. Yet, if you don’t have access to journalists, if your story isn’t as ‘ironic’ as mine, you may stay banned. Why? Because it’s bloody hard to communicate with social media platforms like TikTok and Instagram when you’ve been unfairly banned.
Who are social media open to?
On March 8th 2021, I organised a collaborative session for the online MozFest, in the Openness forum of Mozilla’s Internet festival. Aimed at finding ways to moderate social media without affecting users’ rights and freedom of speech, the session featured different stakeholders and communities coming together to find a more equal, more helpful framework for moderation.
I wanted my session to fit into the Openness space because social media moderation raises important questions about who these platforms are open to. Are they really a space for everybody? In December, Instagram essentially told me they don’t believe their app is a space for nudity and sexuality, and that users wanting to post that content should take it elsewhere. But where is this ‘elsewhere’, when sex workers – and nudity and sexuality in general – are being kicked out of all mainstream social media, from TikTok to OnlyFans, which was essentially built by sex workers?
Governing the Internet and social media from a perspective of “if you don’t like this space, go elsewhere” ignores the fact that users should have basic rights when interacting with these platforms. Whether it’s knowing what happens to their content and why, or being able to appeal and speak to actual humans at tech companies when something goes wrong, users’ rights are currently being failed by social media companies. Not all users can go elsewhere. And the space for nudity and sexuality on the Internet is shrinking, while social media platforms act like governments, dictating acceptability while showing evident bias.
Why discourse about censorship is unhelpful
To me, censorship matters for a variety of reasons. Having a journalism background, I was always fascinated by the fact that the press could justify reporting on someone’s mistakes by saying those mistakes were at odds with the image that person was trying to portray. Well, censorship is at odds with the commitment to freedom of speech all platforms claim to make. Plus, personally, it was also thanks to social media that I came out of my shell as a pole dancer, finding communities like @twerkologynation and @pdfilthyfriday to hype me up and turn me into the dancer and person I am today. As an abuse survivor who thrived in that supportive network, it saddens me that people who may need it might not have access to it in the future simply because social media hate ass and titties.
The “won’t somebody think of the children” argument is getting very old. Yes, a child may not understand nudity and may not be ready to see it on social media. But social media aren’t just for children. Placing social media moderation solely under a child-friendly ethical system hides the real reasons behind censorship: money, advertising, fear of legal retaliation and, crucially, lazy moderation.
The growth in online content around sex and body positivity, sex education, sex work, sensual performance, erotic art and the like shows there’s a demand for it. Yet social media currently operate under a one-size-fits-all moderation approach, trying to keep every culture happy and, as a result, creating general dissatisfaction with their moderation.
Participants in my MozFest session, who came from all walks of life (e.g. research, tech, civil society), argued for better filtering systems, so that people wishing or not wishing to see specific content are empowered to rule over their own newsfeeds, rather than letting platforms do it for them. We also need more human moderators, trained to understand the nuances, cultural differences and relevance of specific content, to determine whether or not it’s a violation.
We need better filtering of newsfeeds to cater both to users who don’t want to (or shouldn’t) see sexual content, and to users who are interested in that type of post. Excluding one or the other shouldn’t be an option. Yet social media companies are so far making their own laws, with governments only intervening to regulate ad-hoc harms or issues… resulting in laws that make social media platforms look after number one, pulling any content that might get them into trouble.
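To make the idea concrete, here is a minimal, purely illustrative Python sketch of the kind of preference-based filtering participants described – the Post and UserPrefs types and the content labels are my own assumptions, not any platform’s actual API. Content stays up, and each user decides what their own feed hides:

```python
# Hypothetical sketch of preference-based feed filtering, the kind of
# system the MozFest participants argued for. The label names and the
# Post/UserPrefs types are illustrative assumptions, not a real API.

from dataclasses import dataclass, field


@dataclass
class Post:
    post_id: str
    labels: set  # e.g. {"nudity"}, {"pole-dance"}, set by classifiers or creators


@dataclass
class UserPrefs:
    hidden_labels: set = field(default_factory=set)  # what this user opts out of


def filter_feed(feed: list, prefs: UserPrefs) -> list:
    """Hide only what a user opted out of, instead of deleting it for everyone."""
    return [post for post in feed if not (post.labels & prefs.hidden_labels)]


feed = [Post("1", {"pole-dance"}), Post("2", {"nudity", "art"})]
minor_safe = UserPrefs(hidden_labels={"nudity"})  # e.g. a child's account
adult_opt_in = UserPrefs()                        # an adult who opted in

print([p.post_id for p in filter_feed(feed, minor_safe)])    # ['1']
print([p.post_id for p in filter_feed(feed, adult_opt_in)])  # ['1', '2']
```

The point of a design like this is that the decision moves from the platform’s deletion queue to each user’s own preferences: nobody who opted out sees the content, and nobody who wants it loses it.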
This has already happened with the approval of FOSTA/SESTA, the law meant to target online sex trafficking, which resulted in platforms getting rid of all sorts of nudity for fear of being seen to facilitate said trafficking. Similarly, Facebook reacted to an Australian law asking it to pay publishers for their content by pulling all news from the platform altogether. Neither reaction protects the rights and safety of vulnerable users – both are examples of platforms doing damage control and protecting their own earnings. So how can we put social media users’ rights first?
A framework for human rights-based social media moderation
In my paper “A corpo-civic space: A notion to address social media’s corporate/civic hybridity”, published in First Monday, I argue that social media governance needs to address platforms’ hybrid nature: that of being both a corporate and a civic space. Social media may have been created and owned by private companies, but they perform civic space functions too – like a shopping mall, or like the mainstream media. As a result, the freedoms and rights of those who exist and interact in those spaces need to be protected.
As I wrote in a recent piece for The Conversation, seeing social media as “corpo-civic” spaces would mean applying international human rights standards to content moderation, putting the protection of people above the protection of profits. This is not unlike what we’d expect from shopping centres, which may have their own private security policies but which must nevertheless abide by state law. Because social media are global platforms, and because many states agree to respect international human rights standards at least on paper, it will be important for the laws that govern social networks to be transnational.
While my experience of censorship is very specific, and while people may disagree with my governance frameworks, a problem remains: currently, social media are not open to everyone. Some accounts and some content are being unfairly banished, without explanation or the possibility of appeal. Some of the mechanisms meant to protect users are being used against them. All of the above highlights the lack of users’ rights in social media moderation, which is why a social network governance based on human rights and on platforms’ corpo-civic status is at least the start of a conversation we need to be having.