TikTok moderators say they were shown child sexual abuse videos during training
A Forbes report raises questions about how TikTok’s moderation team handles child sexual abuse material — alleging it granted broad, insecure access to illegal photos and videos.
Employees of a third-party moderation outfit called Teleperformance, which works with TikTok among other companies, claim it asked them to review a disturbing spreadsheet dubbed DRR, or Daily Required Reading, on TikTok moderation standards. The spreadsheet allegedly contained content that violated TikTok’s guidelines, including “hundreds of images” of children who were nude or being abused. The employees say hundreds of people at TikTok and Teleperformance could access the content from both inside and outside the office, opening the door to a broader leak.
Teleperformance denied to Forbes that it showed employees sexually exploitative content, and TikTok said its training materials have “strict access controls and do not include visual examples of CSAM,” although it didn’t confirm that all third-party vendors met that standard. “Content of this nature is abhorrent and has no place on or off our platform, and we aim to minimize moderators’ exposure in line with industry best practices. TikTok’s training materials have strict access controls and do not include visual examples of CSAM, and our specialized child safety team investigates and makes reports to NCMEC,” TikTok spokesperson Jamie Favazza told The Verge in a statement.
The employees tell a different story, and as Forbes lays out, it’s a legally dicey one. Content moderators are routinely forced to deal with CSAM that’s posted on many social media platforms. But child abuse imagery is unlawful in the US and must be handled carefully. Companies are supposed to report the content to the National Center for Missing and Exploited Children (NCMEC), then preserve it for 90 days but minimize the number of people who see it.
The allegations here go far beyond that limit. They indicate that Teleperformance showed employees graphic photos and videos as examples of what to tag on TikTok, while playing fast and loose with access to that content. One employee says she contacted the FBI to ask whether the practice constituted criminally spreading CSAM, though it’s not clear whether the agency opened an investigation.
The full Forbes report is well worth a read, outlining a situation where moderators were unable to keep up with TikTok’s explosive growth and were told to watch crimes against children for reasons they felt didn’t add up. Even by the complicated standards of debates about child safety online, it’s a strange and, if accurate, horrifying situation.
Update August 6th, 9:30AM ET: Added statement from TikTok.