Instagram and Facebook knowingly platform parents who sexually exploit children for profit, say reports

Investigations into “child influencer” accounts on Facebook and Instagram have found that Meta is knowingly allowing parents who sexually exploit their children for financial gain on the platform — and in some cases, using Meta’s paid subscription tools to do so. 

According to separate reports published by The New York Times and The Wall Street Journal on Thursday, Facebook and Instagram have become a potentially lucrative endeavor for parents who run social media accounts for children — mostly girls — who are too young to meet the platforms’ minimum age requirement of 13. Several of the “parent-managed minor accounts” investigated sold materials to their large audiences of adult men, including photos of their children in revealing attire, exclusive chat sessions, and their children’s used leotards and cheer outfits.

Meta staffers found that some parents were knowingly catering content of their children to pedophiles

According to The Wall Street Journal, while these parent-run accounts don’t feature illegal content or nudity, staff at Meta discovered that some parents were knowingly producing material of their children that pedophiles would find sexually gratifying. This included parents having sexually charged conversations about their own children and making them interact with sexual messages sent by subscribers. Meta staff were also allegedly aware that the company’s algorithms promoted subscriptions for accounts featuring child models to suspected pedophiles and that some parents offered additional content of their children on other platforms.

Meta did not immediately respond to a request for comment from The Verge.

Because of the way Meta’s social media algorithms work, even accounts that aren’t intentionally insidious — like those for child models, athletes, and performers — stand to benefit from gaining large audiences of adult men. The Times reports that 5,000 accounts it examined had 32 million connections to male followers. Accounts with high follower counts have their visibility boosted by Instagram, which can lead to discounts and financial incentives from brands. Some companies pay child influencers $3,000 for a single post, and six-figure incomes can be made through monthly subscriptions, according to the Times.

Recommendations made by Meta staff to tackle the issue — such as requiring accounts that sold child-focused subscriptions to register themselves for monitoring or banning subscriptions to such accounts entirely — were apparently not pursued by the company. Instead, Meta focused on building an automated system for preventing likely pedophiles from subscribing to parent-run accounts, though this proved to be unreliable and easily evaded by creating a new account, says the Journal.

Meta’s own moderation tools prevented parents who had blocked too many accounts in a day from blocking suspected predators

While it was building this system, Meta expanded its subscription program and “gifts” tipping feature, arguing that such programs are well monitored. The Journal also found that this gifting tool has been misused and that efforts made by some parents to manage who was interacting with their children were thwarted by Meta’s own moderation tools — with accounts that blocked too many followers in a day having their ability to block accounts restricted.

For comparison, TikTok told the Journal that it bans the sale of underage modeling content on TikTok marketplace and through its creator monetization services.

The Times also highlighted the company’s inadequate moderation attempts, noting that Meta responded to just one of the 50 reports the publication made regarding questionable content featuring children over a period of eight months. One internal study conducted by Meta in 2020 and revealed in court documents found that 500,000 child Instagram accounts had “inappropriate” interactions every day.

Meta already has a poor reputation regarding child protection on its platforms, with Instagram and Facebook accused of creating a “marketplace for predators in search of children” in a lawsuit filed by the New Mexico attorney general in December. The Journal has notably published several reports over the last year showing how Instagram and Facebook were used to promote sexually explicit or suggestive materials of children to pedophiles.