NEWCASTLE, U.K. — Digital studies and sexuality researcher Dr. Carolina Are is asking sex workers, adult performers and others who have experienced discrimination to participate in a study investigating Instagram and TikTok’s approach to malicious flagging or reporting of “grey area” content, including nudity.
Are is seeking participants over 18 years of age who have received negative comments and simultaneously had their accounts and/or content removed.
Are told XBIZ that, in the absence of communication or transparency about content governance from social media platforms, her two-year study “aims to infer Instagram and TikTok’s approach to moderating ‘grey area’ content by focusing on user experience, in the hope to gain information about platforms’ processes to make the case for fairer moderation.”
Are described “grey area” as referring to “content that social media community guidelines and moderators have so far struggled to moderate,” such as “journalism, education, nudity, activism.”
Are plans to circulate an anonymous survey and then conduct follow-up interviews with selected participants as case studies. Those wishing to share their experience with social media discrimination can fill out the survey here.
“It only takes a few minutes, and people can get into as much detail as they feel they need to,” Are explained.
A Corporate Culture of Denial About Discrimination and 'Shadowbans'
Social media platforms, Are said, “often deny that malicious reporting has an influence on accounts and content deletion, but my own personal experience and a variety of users’ experiences seem to prove otherwise. For example, every time my TikTok account was deleted — a whopping four times in 2021, with three times happening during the same week — it was after I received an avalanche of negative comments about my pole dancing posts not being appropriate for a social network ‘for children.’ While I was the one getting misogynistic insults and rape threats, it was my content that was being removed.
“We do know that once accounts are removed, it’s very difficult to speak to a human within platform teams to get them reinstated, leaving users out of a network and a tool to make a living for months, sometimes years on end.”
While Are does not think social media platforms are intentionally plotting against sex workers and sex-positive users, she points out that “given these users’ already precarious existence on social networks, and the fact that their content has been disproportionately targeted by platform governance, malicious flagging can become — and, according to some users, is already becoming — a crippling online abuse technique.”
“It’s incredibly important towards users’ online lives and livelihoods that we find out more about it,” she concluded.
Are is an Innovation Fellow at Northumbria University's Centre for Digital Citizens, researching the intersection between online abuse and censorship. Her work on social media moderation, platform governance and algorithmic bias has been published in Feminist Media Studies, Porn Studies, First Monday and Journalism, and featured by the BBC, The Atlantic, MIT Technology Review, Business Insider, Vice, Wired and Mashable.
Are is also a blogger and content creator herself, as well as a writer, pole dance instructor and recipient of the Sexual Freedom Awards title, “Activist of 2019.”
Are noted that although her current study focuses on both Instagram and TikTok, participants who have experienced censorship on only one of the two platforms can still take part.
“I’m looking for an intersectional picture here,” she added, specifically inviting BIPOC, LGBTQIA+, plus-size and other marginalized users to take part.
For more information on Dr. Carolina Are, visit BloggerOnPole.com and follow her on Twitter.