MENLO PARK, Calif. — Instagram published a blog post today announcing changes to the app's moderation policy and appeals process.
The announcement detailed changes that representatives of the Facebook-owned company had first mentioned last month, during an unprecedented meeting at Facebook's monumental Bay Area headquarters between the Instagram public policy team and a delegation from the Adult Performers Actors Guild (APAG).
XBIZ attended the meeting and published an exclusive report the same day, chronicling the conversation between the sex workers’ advocacy group and several top-level Facebook/Instagram execs responsible for deciding which content is allowed on their platforms.
Today’s blog post described the new policy in the typically vague, imperious language common among the leadership of Facebook and other social media giants. With a combination of august arrogance, Sesame-Street-like language that is meant to appear friendly, harmless and inclusive, and a tone of finality and knowing-best, the statement uncannily mimics the public speech patterns of company figurehead Mark Zuckerberg.
“Today, we are announcing a change to our account disable [sic] policy,” reads the anonymous blog post, which an Instagram spokesperson confirmed as “official” and “quotable.” “Together with Facebook, we develop policies to ensure Instagram is a supportive place for everyone. These changes will help us quickly detect and remove accounts that repeatedly violate our policies.”
“Under our existing policy,” the statement continues, “we disable accounts that have a certain percentage of violating content. We are now rolling out a new policy where, in addition to removing accounts with a certain percentage of violating content, we will also remove accounts with a certain number of violations within a window of time.”
Besides this extremely vague reframing of Instagram's still-secret formula for content censorship, the company also introduced “a new notification process to help people understand if their account is at risk of being disabled.”
“This notification will also offer the opportunity to appeal content deleted,” Instagram stated.
These appeals “will be available for content deleted for violations of our nudity and pornography, bullying and harassment, hate speech, drug sales, and counter-terrorism policies, but we’ll be expanding appeals in the coming months. If content is found to be removed in error, we will restore the post and remove the violation from the account’s record.”
"We’ve always given people the option to appeal disabled accounts through our Help Center, and in the next few months, we’ll bring this experience directly within Instagram," the short blog post concluded.
Sex Workers' Concerns Remain
XBIZ contacted an Instagram spokesperson with relevant questions about the vexing new blog post.
The company says that the goal is to make Instagram “a supportive place for everyone,” so we asked the rep whether legal sex workers and producers of adult content are included in “everyone.”
“This includes everyone who uses Instagram, irrespective of their profession,” the rep answered.
What is the rationale for equating “nudity and pornography” with “bullying and harassment, hate speech, drug sales” and “terrorism”? Is Instagram implying that nudity (specified as female nipples and male and female genitalia and buttocks) and pornography (an often derogatory, stigmatizing term for adult content) are equivalent to all those other types of harmful content? (“Drug sales” is also problematic: how does Facebook define “drugs”?)
“No,” the rep clarified. “We have a set of rules that govern what you can or can’t post on Instagram — that’s not to say we equate the severity of one rule with another. Simply, this is a list of things we don’t allow. We do not allow the sales of any drug on Instagram, including illicit and pharmaceutical drugs.”
The blog post states that Instagram is “now rolling out a new policy where, in addition to removing accounts with a certain percentage of violating content, we will also remove accounts with a certain number of violations within a window of time.” Does Instagram intend to keep the “certain percentage,” “certain number” and “window of time” figures a secret, or will it announce what these numbers actually are?
“We don’t share these numbers as it allows bad actors to game our preventative measures,” said the rep.
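Parsed literally, the policy describes two independent triggers: an account can be disabled either when the share of its content that violates the rules crosses one threshold, or when the number of violations inside a rolling time window crosses another. Since Instagram will not disclose the actual figures, the brief Python sketch below uses invented placeholder values purely to illustrate how such a dual-trigger rule might operate; none of the numbers, names or logic should be read as Instagram's real implementation.

from datetime import datetime, timedelta, timezone

# All thresholds below are hypothetical placeholders; Instagram keeps the real figures secret.
MAX_VIOLATION_RATIO = 0.10          # assumed: more than 10% of posts violating
MAX_VIOLATIONS_IN_WINDOW = 5        # assumed: 5 or more violations...
WINDOW = timedelta(days=30)         # ...within an assumed 30-day window

def account_would_be_disabled(total_posts, violation_timestamps, now=None):
    """Return True if either hypothetical trigger fires for an account.

    total_posts: total number of posts on the account
    violation_timestamps: timezone-aware datetimes of posts removed for policy violations
    """
    now = now or datetime.now(timezone.utc)

    # Trigger 1: a certain percentage of the account's content violates policy.
    ratio = len(violation_timestamps) / total_posts if total_posts else 0.0
    if ratio > MAX_VIOLATION_RATIO:
        return True

    # Trigger 2: a certain number of violations within a window of time.
    recent = [t for t in violation_timestamps if now - t <= WINDOW]
    return len(recent) >= MAX_VIOLATIONS_IN_WINDOW

Under that reading, either condition alone is enough, which means an account with only a handful of posts could still be removed after a short burst of takedowns, regardless of what fraction of its overall content they represent.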
What does “we’ll bring [the appeals] experience directly within Instagram” even mean?
“Currently people need to go to our Help Centre (a website) to appeal accounts that are removed from Instagram,” clarified the rep. “With this upcoming update, people can appeal directly in the app. In other words, this process will be a lot simpler for our community.”
Finally, we asked whether Instagram is committed to keeping the platform “a safe and supportive place” for adults and sex workers.
“See above,” was the Instagram spokesperson’s succinct answer.
For more background on Instagram and the ongoing War on Porn, click here for the XBIZ Explainer and here for an account of the historic meeting between APAG leadership and the Facebook/Instagram public policy team.