Numerous online platforms, including Kik, Twitter, Reddit and MindGeek, have been sued by civil claimants seeking to hold them responsible for sex trafficking activities by their users. These lawsuits have significant consequences for adult industry website operators, billing processors and performers. The legal standard to which online platforms are held in these instances may soon be decided by the U.S. Supreme Court. This article will explore the types of allegations made against platform operators and the changes we are seeing in the adult industry as a result.
Why would a large online platform ever be involved with sex trafficking? Naturally, any association with such nefarious activities is bad for business, terrible for public relations and inconsistent with basic human dignity. In reality, however, sex trafficking statutes are intentionally broad, seeking to punish anyone who benefits from trafficking or participates in a sex trafficking venture, regardless of their level of involvement. Online platforms have been named as defendants in civil sex trafficking claims based on their age verification procedures, content moderation policies, takedown responses and general business practices. Understanding how this can happen starts with a review of the underlying criminal laws.
As with many criminal offenses, the key question in determining whether someone is culpable as part of a criminal venture is whether the defendant had “knowledge” of the illegal activities. The federal prohibition on sex trafficking, 18 U.S.C. § 1591, prohibits anyone from “knowingly” recruiting, enticing, harboring, transporting, providing, obtaining, advertising, maintaining, patronizing or soliciting a person, knowing that force, fraud or coercion will be used to cause that person to engage in a commercial sex act.
Further, the statute punishes anyone who benefits financially, or by receiving anything of value, from participating in a venture that has engaged in the activity described above. The defendant must have acted knowingly or in reckless disregard of the fact that force, fraud or coercion was used to procure the commercial sex act. Bottom line: anybody remotely connected with sex trafficking can be brought into a sex trafficking case.
This is particularly true given the existence of accomplice liability laws that prohibit conspiracy, or aiding and abetting any substantive offense. Aiding and abetting requires substantially assisting another in violating the law, while conspiracy simply requires an agreement to violate the law — plus an overt act in furtherance — without the need to prove any completed crime.
Sex trafficking cases have been brought against businesses that are typically far removed from any such illegal activities, such as hotels or truck stops. Until FOSTA/SESTA (“FOSTA”) was passed in April 2018, online platforms were protected from civil sex trafficking claims by Section 230 immunity. However, FOSTA changed all that. Now, online platforms can be liable in civil courts “if the conduct underlying the claim constitutes a violation of section 1591 [the criminal statute].” This awkward wording, allowing for civil liability only if the claim constitutes a criminal violation, has led to conflicting interpretations in the courts as to when and how online platforms can be successfully sued for sex trafficking violations.
In civil courts, defendants are often held liable for their negligent actions even if they did not intentionally commit harm. Failure to clean up a spill in a retail store, which results in a slip and fall by a customer, is an example of such liability. The store owner did not intend for anyone to be hurt, but can be held financially responsible nonetheless. When it comes to sex trafficking violations, civil claims have been asserted when a defendant “should have known” of the illegal activity, even in the absence of actual knowledge.
This illustrates the legal principle of “constructive knowledge.” Defendants are deemed to know the things they would have known upon reasonable inquiry. They can also be found liable if they turn a blind eye to unlawful activity. However, when it comes to online platforms that disseminate speech, there are legitimate First Amendment issues implicated by holding a platform operator responsible for things it arguably should have known, but did not, in fact, know. Imposing such a broad standard of liability can produce a “chilling effect” on speech, prompting platforms to moderate or censor broad swaths of user content in an effort to avoid liability for activity that a civil claimant might later argue the platform should have known about.
Potentially in recognition of this concern, FOSTA requires that civil claimants demonstrate a criminal violation of sex trafficking laws before liability will attach. However, the courts have been divided on whether the actual knowledge standard or the constructive knowledge standard prevails when suing a platform for sex trafficking violations.
Plaintiffs have filed lawsuits against numerous online platforms based on allegations that they failed to promptly take down content in response to abuse notices, failed to impose sufficient age verification procedures on users, failed to aggressively monitor content for illegal activities or ignored allegations of unlawful conduct by their users. Notably absent from many of these cases is any allegation of actual criminal conduct by the platform operator itself. It would be highly unlikely for a large platform operator to knowingly participate in a sex trafficking venture through its own actions. Therefore, civil claimants have sought to rely on the concept of “constructive knowledge” in an attempt to attach liability for the unlawful conduct of third-party users.
Any effort to determine whether a platform “should have known” that its users were involved in sex trafficking opens a Pandora’s box of numerous potential indicators of such knowledge. Did the platform miss one email, out of many thousands of routine support messages, demanding takedown of alleged recordings of forced commercial sex acts? Did the platform fail to implement robust age verification procedures, using the latest technology, before allowing a creator to upload content? A wide variety of operational decisions could be brought into question when evaluating a constructive knowledge standard.
Attempting to satisfy this standard can become a never-ending undertaking of enhanced legal compliance efforts, content moderation and user verification. Naming service providers such as hosts and billing companies as defendants in these cases exacerbates the burden, since platforms must also comply with the rules and guidelines imposed by those providers in order to stay in business. The impact is ultimately borne by adult content creators, who face increasing barriers to monetizing their content.
However, a recent decision from the Ninth Circuit Court of Appeals brings some logic to the legal analysis, along with a ray of hope for both platforms and content creators. On October 24, 2022, the Ninth Circuit decided Does v. Reddit, which addressed, head on, the level of knowledge required to assert a valid sex trafficking claim against an online platform. In Reddit, the platform was sued, on a class-action basis, for allegedly allowing explicit images of minors to be posted on various subreddits and failing to prevent the reposting of the materials after removal. Further, the plaintiffs in that case alleged that Reddit permitted the labeling of accounts with terms that suggested underage content, such as: /r/BestofYoungNSFW, r/teensdirtie, /r/TeenBeauties and /r/YoungGirlsGoneWildd. Finally, they claimed that Reddit received advertising revenues from these channels while failing to track offending IP addresses, and that it “delayed” implementation of potentially available content moderation tools such as PhotoDNA.
The Ninth Circuit analyzed the civil sex trafficking statute, 18 U.S.C. § 1595, which imposes liability on anyone who knowingly benefits from participating in a venture that the person “knew or should have known” was engaged in sex trafficking, in contrast with the criminal statute, which requires actual knowledge of such activities. Remember, FOSTA removes Section 230 immunity, but only if the claim constitutes a violation of the criminal prohibition. Reddit argued that a website may only be liable for its own criminal conduct. The plaintiffs, by contrast, argued that a website may be liable whenever someone’s conduct, most likely a user’s, violated the criminal statute and the claim against the website somehow “derives” from that violation. Ultimately, the court ruled that the platform must directly violate the criminal statute, or have actual knowledge of the illegal activity, for the Section 230 immunity exception to apply.
This ruling clarified a complex web of lower court decisions that struggled with the required level of knowledge to hold an online platform liable for civil sex trafficking claims. One of these decisions even found that Visa could be held liable for providing payment processing services to Pornhub.com, based on conspiracy claims. However, the plaintiffs in Reddit have asked the U.S. Supreme Court to review the Ninth Circuit’s decision. For now, the requirement that civil claimants demonstrate actual participation in, or knowledge of, sex trafficking by online platforms is the law of the land in the Ninth Circuit, which encompasses all of the West Coast of the United States. The decision also constitutes persuasive authority for the remaining courts and has brought some stability to the legal climate facing online platforms and, by extension, content creators who rely on such platforms.
Should SCOTUS grant review of the decision, the legal standard applicable to civil sex trafficking claims against platforms could again be in doubt. Given the other issues facing the adult entertainment industry, such as age verification laws, content filter requirements, payment processor rules and social media restrictions, the pending case may not have garnered much attention. However, if the Supreme Court justices accept the case and alter the Reddit decision’s requirement of actual knowledge of sex trafficking before civil liability can be imposed, widespread uncertainty could result, along with more restrictive barriers to sexual expression.
This single dispute involving Reddit could dictate the rules that must be followed by adult content creators for generations to come. Free speech advocates are following the case with intense interest, and remain hopeful for the correct result.
Lawrence Walters heads up Walters Law Group, which represents clients worldwide in all facets of the adult entertainment industry. For more information, visit Walters Law Group online at FirstAmendment.com and @walterslawgroup on social media.