LONDON — Meta’s Oversight Board — a panel of experts selected by the company to deliberate on content decisions — released a decision this week recommending that the company clarify its arbitrary and vague definitions of nudity, sexual activity and sexual solicitation.
The London-based Oversight Board issued the recommendations in the course of overturning Meta’s decisions to remove two Instagram posts in 2021 and 2022 depicting “transgender and non-binary people with bare chests.”
Both images were posted by the same account, which is maintained by “a US-based couple who identify as transgender and non-binary,” according to the document’s case summary.
According to the Oversight Board’s description, “Both posts feature images of the couple bare-chested with the nipples covered. The image captions discuss transgender healthcare and say that one member of the couple will soon undergo top surgery (gender-affirming surgery to create a flatter chest), which the couple are fundraising to pay for.”
After the posts were flagged by Meta’s automated systems and reported by users, Meta’s Community Standards moderators “removed both posts for violating the Sexual Solicitation Community Standard, seemingly because they contain breasts and a link to a fundraising page.”
Oversight Board Critical of Meta’s Moderation Process
The Oversight Board’s decision contains several passages criticizing Meta’s handling of potentially sexual content uploaded by users.
Meta’s internal guidance to moderators on when to remove content under the Sexual Solicitation policy, the Board opined, “is far broader than the stated rationale for the policy, or the publicly available guidance. This creates confusion for users and moderators and, as Meta has recognized, leads to content being wrongly removed.”
Meta’s Adult Nudity and Sexual Activity Community Standard — prohibiting “images containing female nipples other than in specified circumstances, such as breastfeeding and gender confirmation surgery” — is based on “a binary view of gender and a distinction between male and female bodies,” the Board wrote. “Such an approach makes it unclear how the rules apply to intersex, non-binary and transgender people, and requires reviewers to make rapid and subjective assessments of sex and gender, which is not practical when moderating content at scale.”
Meta’s restrictions and exceptions to the rules on female nipples, the document continued, “are extensive and confusing, particularly as they apply to transgender and non-binary people. Exceptions to the policy range from protests, to scenes of childbirth, and medical and health contexts, including top surgery and breast cancer awareness. These exceptions are often convoluted and poorly defined. In some contexts, for example, moderators must assess the extent and nature of visible scarring to determine whether certain exceptions apply. The lack of clarity inherent in this policy creates uncertainty for users and reviewers, and makes it unworkable in practice.”
The Board also found that Meta’s policies on adult nudity “result in greater barriers to expression for women, trans, and gender non-binary people on its platforms. For example, they have a severe impact in contexts where women may traditionally go bare-chested, and people who identify as LGBTQI+ can be disproportionately affected, as these cases show.”
The Board recommended that Meta “should seek to develop and implement policies that address all these concerns. It should change its approach to managing nudity on its platforms by defining clear criteria to govern the Adult Nudity and Sexual Activity policy, which ensure all users are treated in a manner consistent with human rights standards. It should also examine whether the Adult Nudity and Sexual Activity policy protects against non-consensual image sharing, and whether other policies need to be strengthened in this regard.”
A Poorly Defined Policy Against ‘Sexual Solicitation’
The Oversight Board also recommended that Meta “clarify its public-facing Sexual Solicitation policy and narrow its internal enforcement guidance to better target such violations.”
Instagram’s Community Guidelines state that “offering sexual services” is not allowed. This provision, the Board noted, “then links to Facebook’s Community Standard on Sexual Solicitation.”
Facebook’s policy reads, “We draw the line, however, when content facilitates, encourages or coordinates sexual encounters or commercial sexual services between adults. We do this to avoid facilitating transactions that may involve trafficking, coercion and non-consensual sexual acts. We also restrict sexually explicit language that may lead to sexual solicitation because some audiences within our global community may be sensitive to this type of content, and it may impede the ability for people to connect with their friends and the broader community.”
The list of contact information that triggers removal as an implicit offer of sexual solicitation, the Board noted, includes social media profile links and “links to subscription-based websites (for example, OnlyFans.com or Patreon.com).”
The Board ultimately found that the Sexual Solicitation Community Standard “contains overbroad criteria in the internal guidelines provided to reviewers. This poorly tailored guidance contributes to over-enforcement by reviewers and confusion for users. Meta acknowledged this, as it explained to the Board that applying its internal guidance could ‘lead to over-enforcement’ in cases where the criteria for implicit sexual solicitation are met but it is clear that there was ‘no intention to solicit sex.’”
Moreover, the fact that reviewers “repeatedly reached different outcomes about this content” suggested to the Board “a lack of clarity for moderators on what content should be considered sexual solicitation.”