Cam and fan site operators have a responsibility to know their content providers and performers. Accusations that underage or unverified individuals have appeared on screen can be highly detrimental to a business, leading to legal action, termination of merchant processing relationships and permanent bans from the card networks.
The card brands and acquiring banks are subjecting user-generated content to intense scrutiny and asking platform operators for written policies and procedures on how they address complaints about nonconsensual publication and prohibited content. Are you ready? This article will explore the importance of verifying and monitoring your uploaders' content to protect your reputation, safeguard your merchant processing relationships, and ensure a safe and enjoyable experience for everyone on your platform.
Identifying Prohibited Content
Reviewing all content before it’s published is a crucial step in safeguarding your business, and most site operators know what content is prohibited. However, it’s always a good idea to stay up to date on current regulations and ensure none of the following prohibited scenarios is occurring:
- Performers who appear intoxicated or otherwise impaired.
- Pets wandering through the scene or performance.
- Visible beer or liquor bottles; likewise, any cigarettes on camera should look like "real" cigarettes.
The last thing anyone wants is to become the focus of an investigation. Self-policing is key to establishing and maintaining healthy relationships with your merchant processor, acquiring bank and the card brands.
Using AI to Help Review Content
Technology has always played a significant role in the adult industry. Recently, platforms have started using artificial intelligence (AI) to scan and monitor video, text and photographs. The technology searches for potentially problematic content and flags it for review by a member of the platform's moderation team. AI is a great tool for monitoring content because it can scan hundreds of hours of footage in a fraction of the time it would take a person to review the same material. But AI is only one tool, not a complete solution.
While AI can scan footage to confirm that only verified performers appear in a scene, or check a scene for prohibited content, it will sometimes generate false positives or miss something entirely. Human review is still the best preventative measure a platform operator can take.
Maintaining Logs of Reported Content and Performers
The best practice is ultimately a preventative approach, not a remedial one. In addition to age and identity verification, maintain logs of reported content and performers, and of takedown requests. The log should include, at minimum, the nature of the complaint, the action that was taken and the time from complaint to resolution, which should be seven days or fewer.
The complaints may pertain to specific pieces of content or individual performers. When it’s the latter, a log of terminated or suspended performers should also be maintained, to prevent banned performers from signing up again. New performer applications should be checked against the banned performer list during the verification process.
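As a rough illustration of the two record-keeping practices above, the complaint log and the signup-time check against a banned performer list might be sketched as follows. This is a minimal sketch, not any processor's required format; all field names and ID values are hypothetical.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Complaint-to-resolution target; seven days or fewer, per the guidance above
RESOLUTION_SLA = timedelta(days=7)

@dataclass
class ComplaintRecord:
    """One entry in the reported-content log (hypothetical fields)."""
    content_id: str
    complaint_nature: str   # e.g. "nonconsensual publication"
    action_taken: str       # e.g. "content removed, performer suspended"
    received_at: datetime
    resolved_at: datetime

    def within_sla(self) -> bool:
        # Was the complaint resolved within seven days of receipt?
        return self.resolved_at - self.received_at <= RESOLUTION_SLA

# Banned/terminated performer list, keyed by a verified-ID reference
# (the keying scheme here is an assumption for illustration)
banned_ids = {"ID-1042", "ID-2230"}

def can_approve_signup(applicant_id: str) -> bool:
    """New performer applications are checked against the banned list."""
    return applicant_id not in banned_ids
```

In practice the banned list would live in a database rather than memory, but the check itself belongs in the verification workflow so a terminated performer cannot simply re-register.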
Another great tool is a report button on every piece of content. The report form typically asks the user why the content is being reported and flags it for review by a member of the moderation team. These reports should be addressed as quickly as possible.
Platform operators should track:
- Creator account signups and approvals.
- Content postings and removals.
- Account deactivations due to TOS violations.
- CSAM reporting.
- Law enforcement/legal requests and assistance.
- Copyright and privacy issues.
Records of reports should be maintained for at least 12 months, but in perpetuity is best. Most merchant processors require the log of takedown requests to be submitted monthly, so if that process is already in place, you're in pretty good shape.
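The monthly submission to a processor can be as simple as filtering the retained log by date. A minimal sketch, with an assumed record shape rather than any processor's actual format:

```python
from datetime import datetime

# Each log entry: (timestamp, description) — a stand-in for a fuller record
takedown_log = [
    (datetime(2024, 5, 3), "video 881 removed: copyright complaint"),
    (datetime(2024, 6, 12), "photo 112 removed: nonconsensual publication"),
    (datetime(2024, 6, 20), "account 57 deactivated: TOS violation"),
]

def entries_for_month(log, year: int, month: int):
    """Select the log entries to include in a given month's submission."""
    return [entry for entry in log
            if entry[0].year == year and entry[0].month == month]

june_report = entries_for_month(takedown_log, 2024, 6)
```

Because the full log is retained (ideally in perpetuity), each month's report is just a view over it; nothing is deleted after submission.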
The Bottom Line
If a site operator has questions about best practices or about what types of content may or may not appear on their platform, the best resource available is their payment processor. Partnering with a reputable and knowledgeable payment services provider goes a long way toward establishing and maintaining compliance in an ever-changing digital landscape. The good folks who specialize in risk and compliance are always happy to help clients who call to ask about what's allowed and what's not.
Jonathan Corona has two decades of experience in the electronic payments processing industry. As chief operating officer of MobiusPay, Corona is primarily responsible for day-to-day operations as well as reviewing and advising merchants on a multitude of compliance standards mandated by the card associations, including, but not limited to, maintaining a working knowledge of BRAM guidelines and chargeback compliance rules defined in both Visa and Mastercard operating regulations.