Investigation of Child Exploitation on OnlyFans: Key Actions and Concerns

Giulia Pasqualin
Updated 10:34 AM, January 02, 2025

Investigation Reveals Potential Child Exploitation on OnlyFans

Matt W.J. Richardson, head of intelligence at The Canadian Open Source Intelligence Centre, reported 26 OnlyFans accounts suspected of featuring child sexual abuse material (CSAM). The accounts appeared to be interlinked and possibly run by multiple content creators, and their content involving apparently underage girls pointed to a coordinated exploitative effort.

Immediate Response by Authorities and OnlyFans

Upon receiving the tip-off on December 16, the National Center for Missing & Exploited Children (NCMEC), a prominent U.S.-based organization, acted swiftly: less than 24 hours after the report, OnlyFans removed all of the implicated accounts. While NCMEC did not comment on the specifics of the case, it confirmed its routine process of evaluating such tips and coordinating with hosting companies to remove exploitative content.

OnlyFans' Stance on Child Safety and Content Moderation

In response to the incident, OnlyFans reaffirmed its strict policies against CSAM, highlighting its rigorous verification processes intended to ensure that all content creators are adults. The platform stresses its proactive approach to content moderation and its ongoing collaboration with NCMEC to combat inappropriate material. OnlyFans says it aims to maintain a safe environment for legitimate adult content creation through meticulous review procedures.

Connections and Characteristics of Flagged Content

Richardson, who trains law enforcement in open-source intelligence, noted that some of the flagged accounts were interconnected through promotional activity, suggesting control by a single entity or group. He added that the content often featured individuals whose physical attributes appeared unusually youthful for adults, raising doubts about whether the performers were actually of legal age.

Concerns Over "Legacy CSAM" and Platform Responsibilities

Adding complexity to the issue, Richardson raised the possibility of "legacy CSAM," where explicit images taken of individuals before they turned 18 are posted after they reach legal age, perpetuating abuse. Legal experts have confirmed that such actions still constitute a violation of laws against child sexual exploitation.

Critical Observations on Youthful Imagery in Content

Eric Silverman, a researcher with Culture Reframed, expressed concern that OnlyFans profits from content portraying women with markers culturally associated with minors, such as pigtails or braces. Such portrayals not only challenge content moderation standards but also raise ethical and legal questions about the platform's safeguarding measures.

Model and Effectiveness of OnlyFans' Safety Measures

OnlyFans maintains that its pay-to-view business model provides a layer of scrutiny absent from more open networks that allow greater anonymity. However, incidents like this one underscore the sophisticated challenges digital platforms face and their critical responsibility to police content effectively in order to combat online exploitation.

Giulia Pasqualin is a digital privacy expert based in El Paso, Texas, specializing in privacy and data security for adult content creators at RhyteIt. Pasqualin advises creators on strategies to safeguard their online presence, protect their personal information, and navigate privacy challenges unique to the adult content industry. Her expertise helps creators build and maintain secure, professional brands on platforms like OnlyFans and Fansly.