A joint effort among pole dancers across the world, a petition with nearly 20,000 signatures and a request for clarity about algorithms and the shadowban resulted in Facebook and Instagram apologising to pole dancers today for censoring their posts. This post contains that apology and, following the platform’s answers, hopes to shine some light on how the Instagram algorithm works.
The Petition
Last week, some of the world’s most famous pole dancers, pole studios and pole dance Instagram accounts started a petition asking Instagram to stop censoring pole dance. The petition, which reached over 17,000 signatures as of this morning, came after hashtags pole dancers use to find moves and express themselves were shadowbanned, or hidden from the wider community, because they were deemed inappropriate.
The Change.org petition was started by Rachel Osborne. Since then, and since the posts I wrote last week, the #femalefitness hashtag, together with many pole-dance-related hashtags, has become visible again on Instagram.
Signed “Sincerely, the Pole Dancers of Instagram”, the minds behind the petition and the press release that made the rounds in the mainstream media and on social media were:
- United Pole Artists – 183k followers, info/marketing/consulting for pole industry. @upartists
- Pole Dance Nation – 238k followers, by Nikki St John @poledancenation
- Michelle Shimmy – 187k followers, international pole artist, instructor and businesswoman. @michelleshimmy
- Elizabeth Blanchard – 75k followers, international instructor, aerialist and kinesiologist. @Elizabeth_BFit
- Anna-Maija Nyman – 90k followers, athlete, Nordic & 5x Swedish National Pole Champion. @annamaijanyman
- Dan Rosen – 56k followers, competition athlete, instructor, UK Male Pole Champion. @danrosenpole
- Laura Arbios – Owner Sadie’s Pole Studio, Founder Pole Dancers’ Vote. @sadiespolestudio
- Meg Lee – 24k followers, competition athlete, instructor, studio owner. @megthesupernova
- Makeda Smith – 3.4k followers, publicist, instructor, Pole priestess and the PR mastermind who distributed the release to media. @flyingover50
- Rachel Osborne – 11k followers. Housewife. Mum. Pole dancer. @TropicalVertical
I later joined the team to speak to Instagram – with whom I had already been in contact for my previous posts – to get some clarity about how their algorithm works.
Our Questions To Instagram
As the petition started to pick up, we asked our followers and the pole community as a whole to comment with their questions and concerns about the shadowban and the Instagram algorithm.
As a result, we asked Instagram these questions via press@instagram.com.
- How does the algorithm detect inappropriate content?
- For how long are hashtags removed from the “Recent” tab? Is there a way to speed up the process of reinstating hashtags and removing the shadowban from a person’s profile?
- How does the “reporting” process work? E.g. how many people should report a user or a hashtag for them to be shadowbanned / for their account to be disabled?
- Will the “inappropriate content” policy evolve? Some users are suggesting we should have a warning about content being inappropriate / shadowbanned before posting, so we can take it down before the shadowban starts if that is the case.
- Is the algorithm trained to notice hashtags first, or does it also detect nudity? How does nudity detection work?
- Do you recommend posting hashtags in comments or is that viewed as spam by the algorithm?
- How does the algorithm distinguish between artistic and sexual nudity?
- Is there a difference between bikini / polewear and lingerie content?
- Does being affected by banned hashtags mean you are also blacklisted for your account to be disabled?
- Some users are lamenting that they find their accounts have automatically unfollowed accounts they didn’t want to unfollow, while others are noticing huge drops in following overnight. Does Instagram unfollow ‘inappropriate’ accounts for users?
- Why are ‘risqué’ celebrities’ accounts not disabled or shadowbanned?
- Users who report issues to Instagram – e.g. accounts being shadowbanned, disabled etc – lament that they are not getting any responses from the team. How long does the average answer take?
- Does the algorithm rank business accounts lower than personal ones in terms of views?
- Is nudity ranked as more problematic than, say, hunting posts, gun posts or harassment? Users have lamented that those are not deleted when reported.
- How are moderators trained to flag inappropriate content, i.e. whose ethos do they follow other than the Community Guidelines?
Instagram’s (and Facebook’s) Apology
In their response to our questions, Instagram told me via email that determining what a global community may find offensive or inappropriate is subjective, and that they’re working hard to get this right.
“A number of hashtags, including #poledancenation and #polemaniabr, were blocked in error and have now been restored. We apologise for the mistake. Over a billion people use Instagram every month, and operating at that size means mistakes are made – it is never our intention to silence members of our community.”
Facebook company spokesperson
This apology was sent to me via email on July 31st 2019. If you don’t believe it, CTV News Canada verified it with Instagram here.
While Instagram told me that their content reviewers receive what they call extensive training to moderate, they have still given us no insight into what that training involves. They admitted, however, that they won’t always get moderation right. Specifically, in the case of pole dance, they found that a number of hashtags had been mistakenly removed. These have since been restored – probably thanks to our community’s commitment to boycotting the ban and to the constant posts and news about it. Go team!
So… after our team efforts we got one of the world’s biggest tech giants to admit they were wrong. Considering I’ve been speaking to them since March and got nowhere near an apology, this is pretty huge. Especially because all press comments I’ve received from Instagram have had to be quoted “on background” – i.e. paraphrased without attributing them to a spokesperson. This is what I’ve been doing since March, but this time I was given the direct quote above by a Facebook company spokesperson.
Yet, what I would LOVE is if they could also apologise to sex workers, sex-positive educators, sexy brands, performers and the like who have felt wrongly discriminated against by their censorship in the past year or so.
Of course, the apology doesn’t mean things will change. The remaining paragraphs will attempt to translate Instagram’s answers to our questions to provide some clarity about using the platform to still share your pole progress.
Instagram’s New “Nudity Appeal” Feature
Instagram have acknowledged the mistakes above and want to give the community a way to flag such errors so they can correct them. Because of this, they told me they are introducing an appeals process for nudity too – not unlike the one for copyright.
Soon we will be able to appeal when our posts are taken down – basically request a second review of posts containing nudity. The process will be as follows:
- If your photo, video or post has been removed for violating Instagram’s Community Guidelines, you will be given the option to “Request a Review”.
- This sends the post for another review by a different reviewer. If that reviewer finds a mistake was made, Instagram will restore the content.
- We will get a notification about the decision made regardless of the outcome.
Whether adding an appeals process can help accounts like ours remains to be seen, but at least the community’s request for a chance to protest against censorship is being listened to. The scary downside is that more ‘strikes’ will get your account disabled, as Instagram recently announced. Plus, the answers above refer strictly to posts that have been taken down – not to the shadowban. More on that below.
Instagram’s Nudity Regulation Policy Doesn’t Distinguish Between Art and Sex
As they already told me in March, Instagram are aware that there is creativity or art in nudity. However, this time they straight-up told me they don’t allow nudity on the platform – and it’s to avoid making their global community uncomfortable.
Nudity for Instagram includes:
- Photos, videos, and some digitally-created content that show sexual intercourse, genitals, and close-ups of fully-nude buttocks
- Some photos of female nipples (they still haven’t clarified which, even though I’ve asked)
- However, photos of post-mastectomy scarring, women actively breastfeeding or breasts where the nipples are censored by blurring are allowed.
Although Instagram haven’t answered our question directly, by saying they do not allow nudity on their platform they mean they do not distinguish between art and sex.
Without going into the ridiculousness of needing to remove close-ups of fully-nude buttocks, it is clear that many pole dancers wear polewear, bikinis or lingerie covering the parts mentioned by Instagram… and still get shadowbanned.
Plus, although Instagram said they do allow photographs of paintings, sculptures, and other art that depicts nude figures, Exotic Cancer constantly finds her art removed from the platform, so take even this statement with a pinch of salt.
If You Have A Business Page, Your Instagram Posts Might Get Less Engagement
Instagram claim not to have made any changes to their feed-ranking algorithm since March 2016. That’s when they introduced feed ranking, because they found that too many people were missing posts from their friends and families, and they didn’t want accounts that post frequently to dominate everyone’s feeds.
The feed ranking view ensures people now see more than 90% of their friends’ posts – a number that was less than half back when the feed was chronological. This way, the average post from a normal account now reaches 50% more of its followers than it did with the chronological model.
As a result of this, Instagram told me that content from public figures, businesses or publishers is not ranked higher than content from “normal” users.
Translated, this means that if your pole dance account is a business account (e.g. for a school, a personal performer page, a venue, or even if, like me, you are a blogger), it will be seen less than personal accounts, which count as “friends and family”.
What I’ve found in my PhD research is that this feed-ranking change – which started on Facebook – has had a far bigger effect than pole dancers not getting likes (which, tbh, still sucks). Surfacing posts by “smaller” accounts has brought less-than-verified Facebook pages to the forefront of people’s feeds, circulating the kind of extreme content that brought us Brexit and Trump and turning it into mainstream belief. So it wasn’t just Cambridge Analytica: social media’s own infrastructure changed news consumption and, as a result, people’s opinions. If you’re curious about this, I really recommend one of my PhD sources, a book by Lynne Schofield Clark and Regina M. Marchi.
How Instagram’s Explore Page Works
Instagram told me they are working towards making content on their platform both safe and appropriate. This, for them, means restricting what goes on the Explore and hashtag pages – essentially, the vital pages we need to grow our accounts.
Because of this policy, not all posts or accounts are eligible to show up on the Explore and Hashtag pages. Instagram determine eligibility according to a variety of signals – for example, if an account has recently violated Community Guidelines, it might not be shown. In short, a shadowban inside the shadowban.
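To make that concrete, here is a minimal, purely hypothetical sketch of what signal-based eligibility could look like. The signal names and the logic are my own invention for illustration – Instagram has not published how its recommendation filters actually work.

```python
# Hypothetical illustration only: Instagram has not published its real signals,
# so the signal names and logic below are invented.
def eligible_for_recommendation(account: dict) -> bool:
    """Toy model: an account is excluded from Explore/hashtag pages if any
    negative signal fires, while staying visible to its existing followers."""
    negative_signals = [
        account.get("recent_guideline_violation", False),
        account.get("flagged_as_borderline", False),
    ]
    return not any(negative_signals)

print(eligible_for_recommendation({"recent_guideline_violation": True}))  # False
print(eligible_for_recommendation({}))                                    # True
```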
So what can we do to find the hashtags and pole dancers we want to see or care about on Instagram? A whole load of nothing, judging by Instagram’s responses. We have to actively remember who we like, check their profiles constantly and show them our love. We need to make lists of people and moves we like the old-fashioned way. I never thought it’d come to this, but Instagram’s responses are not giving me much hope.
How Instagram’s Shadowban Of Hashtags Works
Last week, Instagram told me that if they see a rising amount of violating content (such as nudity or spam) on a hashtag, they will shadowban it. They never used the S word with me, but said they will temporarily remove the “Most Recent” section and only display “Top Posts” for that hashtag – which is what we call a shadowban.
Instagram also told me they no longer show hashtags related to a shadowbanned hashtag. So what most likely happened last week is that Instagram censored one “#pdsomethingsomething” hashtag, meaning that all the other #pd hashtags we use to look up moves were caught up in the censorship as a result.
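Based on what Instagram described, a restriction on one hashtag appears to spill over to the hashtags it considers related. The sketch below is a speculative model of that cascade – the hashtag names and the relatedness map are made up, and this is in no way Instagram’s actual code.

```python
# Speculative model of the cascade Instagram described: restricting one hashtag
# also hides its related hashtags. All names and the relatedness map are invented.
RELATED_TAGS = {
    "#pdexamplemove": {"#pdanothermove", "#pdyetanothermove"},
}
RESTRICTED = {"#pdexamplemove"}  # flagged for a rising amount of violating content

def loses_recent_section(tag: str) -> bool:
    """A tag's 'Most Recent' section disappears if the tag is restricted itself
    or is listed as related to a restricted tag."""
    if tag in RESTRICTED:
        return True
    return any(tag in related
               for restricted_tag, related in RELATED_TAGS.items()
               if restricted_tag in RESTRICTED)

print(loses_recent_section("#pdanothermove"))  # True: caught by the cascade
print(loses_recent_section("#unrelatedtag"))   # False
```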
#throwback to Facebook’s Blueprint for Content Governance and Enforcement – A Prelude To The Instagram Shadowban
Remember when you used to post pole videos to Facebook? Remember when you stopped bothering? I do. I stopped bothering because sometimes the algorithm thought I was naked and blocked me from interacting with my account (news flash: I wasn’t naked). Or maybe I used a song I liked for my freestyles, and BOOM! Copyright police. Video muted or removed. So I moved to Instagram instead.
This started happening about a year ago, and now I remember why. In our very active Instagram chat featuring the gang of shadowban-fighting pole dancers I mentioned above, @tropicalvertical dug up Mark Zuckerberg’s 2018 “Blueprint for Content Governance and Enforcement”, which contains all the seeds and signs of the censorship that is now happening on Instagram – mainly with regard to nudity and what he defines as “borderline” content.
Apparently the algorithm has a knack for identifying butts and is a bit slower on hate speech, which is more contextual. This explains why, more often than not, the groups I research for my PhD that share harassing content are still allowed to thrive. The Instagram algorithm is still learning to detect hate speech, but has almost fully learnt how to take down nudity – which means that, in answer to our question, it’s not that the ‘Gram views nudity as more problematic than harassment, but simply that it hasn’t yet figured out how to teach its AI nuance:
“Some categories of harmful content are easier for AI to identify, and in others it takes more time to train our systems. For example, visual problems, like identifying nudity, are often easier than nuanced linguistic challenges, like hate speech. Our systems already proactively identify 96% of the nudity we take down, up from just close to zero a few years ago. We are also making progress on hate speech, now with 52% identified proactively. This work will require further advances in technology as well as hiring more language experts to get to the levels we need.”
Mark Zuckerberg – A Blueprint for Content Governance and Enforcement, 2018
Now, as I’m sure you are all aware, Facebook also owns Instagram and WhatsApp – so, inevitably, changes to Facebook trickle down to the ‘Gram. Zuckerberg uses research to state that “One of the biggest issues social networks face is that, when left unchecked, people will engage disproportionately with more sensationalist and provocative content,” which can be true and surely played a part in the 2016 US Election and the Brexit referendum. Yet he uses this research to essentially back up Facebook and Instagram’s use of algorithms to remove the majority of content they call “borderline” and to cover their backs. In short: tech companies got scared after Cambridge Analytica, and they don’t want to have to testify in Congress no more.
Zuckerberg goes on to explain how Facebook’s algorithm and content governance regulate “borderline” content, a prelude to the Instagram algorithm’s “vaguely inappropriate” content detection and shadowban:
“Our research suggests that no matter where we draw the lines for what is allowed, as a piece of content gets close to that line, people will engage with it more on average — even when they tell us afterwards they don’t like the content. This is a basic incentive problem that we can address by penalizing borderline content so it gets less distribution and engagement.”
Mark Zuckerberg – A Blueprint for Content Governance and Enforcement, 2018
So Facebook went and changed their idea of what we should and shouldn’t post on the platform. Zuckerberg says: “By making the distribution curve look like the graph below where distribution declines as content gets more sensational, people are disincentivized from creating provocative content that is as close to the line as possible.”
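Zuckerberg’s graph translates into a simple idea: engagement naturally rises as content approaches the policy line, so the platform applies a penalty that makes distribution fall instead. Below is a toy model of that curve, with a made-up scoring scale and penalty function – Facebook has never published the actual formula.

```python
# Toy model of the "borderline content" curve described in the Blueprint.
# borderline_score: 0.0 = clearly within the rules, 1.0 = right on the policy line.
def natural_engagement(borderline_score: float) -> float:
    """Left unchecked, engagement rises as content approaches the line."""
    return 1.0 + borderline_score

def penalised_distribution(borderline_score: float) -> float:
    """After the change, distribution declines as content approaches the line."""
    penalty = borderline_score  # the closer to the line, the harsher the penalty
    return natural_engagement(borderline_score) * (1.0 - penalty)

for score in (0.0, 0.5, 0.9):
    print(score, round(penalised_distribution(score), 2))
# 0.0 -> 1.0, 0.5 -> 0.75, 0.9 -> 0.19: reach collapses as content nears the line
```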
In short, no engagement on your “provocative” content = chilling effect on your wish to even post it. And this is where pole dancers, sex-positive accounts, sex workers and sex educators come in:
“Interestingly, our research has found that this natural pattern of borderline content getting more engagement applies not only to news but to almost every category of content. For example, photos close to the line of nudity, like with revealing clothing or sexually suggestive positions, got more engagement on average before we changed the distribution curve to discourage this. The same goes for posts that don’t come within our definition of hate speech but are still offensive.”
Mark Zuckerberg – A Blueprint for Content Governance and Enforcement, 2018 (the italics are mine).
Facebook and Instagram’s algorithms are now trained to detect our bodies and not show them, because they are deemed inappropriate. Zuckerberg clearly states it: “We train AI systems to detect borderline content so we can distribute that content less.” So that’s it, folks. Our bodies are borderline.
In my eyes, this blueprint is a scary example of why you shouldn’t let business people with robots run giant companies that can influence elections or referendums and impact civil liberties without much legal control or oversight.
This is bigger than the pole dance community, bigger than our wish to express ourselves. This is about how a company run by one of the most powerful cis white males in the world has trained an algorithm to decide that women’s bodies in particular are borderline; that nudity is essentially all bad and should be taken down with no chance to appeal. Scary times.
If I think about how much the people I engage with on Instagram changed my life and my world view, it makes me sad to see all this happen. As an abusive-relationship survivor who was able to accept herself again through pole dancing, seeing body-positive pole dancers, dancers affected by mental health issues, or people with problems similar to mine thrive on the platform gave me hope. I think my following – and that of many pole dancers – doesn’t come just from tits and ass, but from the fact that people see growth and a journey: they saw us struggle and thrive, and they saw different sides of us just as we saw different sides of them.
It’s sad to see that, because of this censorship – once again, likely due to FOSTA/SESTA although Instagram hasn’t admitted this – we will only be able to see the clean, American Dream, Mattel-packaged version of people.
Questions Instagram Has Not Yet Answered
As you might have noticed from the list above, not all questions coming from the pole community were answered by Instagram. I have sent them another email in response to the answers we just got, asking for further clarity and more answers to specific questions. In the meantime, aside from more details on how posts are reported and for how long accounts can stay disabled or shadowbanned, we are still waiting to find out about the following.
- Why are ‘risqué’ celebrities’ accounts not disabled or shadowbanned? (It’s not hard to guess why, but we’d like to hear it from Instagram);
- Do you recommend posting hashtags in comments or is that viewed as spam by the algorithm?
- Users who report issues to Instagram – e.g. accounts being shadowbanned, disabled etc – lament that they are not getting any responses from the team. How long does the average answer take?
- How are moderators trained to flag inappropriate content, i.e. whose ethos do they follow other than the Community Guidelines?
Importantly, we got no answer to this question: Does Instagram unfollow ‘inappropriate’ accounts for users? This is an issue that is affecting many people, as Jordan Kensley writes in her stories.
We are still waiting for responses to these questions, and this blog post will be updated once we know more.
More Resources On How The Shadowban and The Instagram Algorithm Work
- In this post, we decided not to cover music as that is a separate issue from content moderation in terms of nudity and breach of community guidelines. However, if you want to read what Instagram told me about copyright, you can read my first interview with them here.
- If you want to read my analysis of what the shadowban and recent changes in the Instagram algorithm mean for social media, read my post here.
Update – 1 August 2019, 8.00 pm
I started working on this post late last night and published it this morning. In the meantime, I had gone back to Instagram for more answers. Although they refused to comment on some of the questions we posed, they wrote back about the claim that they unfollow accounts on users’ behalf.
Once again, this is on background and I can’t say I’m surprised about the answer: Instagram deny unfollowing accounts for other users. What they do admit is that they are working to remove inauthentic likes, follows and comments from accounts that use third-party apps (e.g. like or follow/unfollow bots) and have removed millions of fake, inauthentic or automated accounts from Instagram as a result. More info on this here.
Of course, this doesn’t explain the experience Jordan Kensley and other pole dancers are having (Instagram unfollowing real accounts they interact with every day). But this is all Instagram were willing to say. The other questions remain unanswered.