Meta vows to take action after report found Instagram’s algorithm promoted pedophilia content

Meta has set up an internal task force after reporters and researchers discovered its systems helped “connect and promote a vast network of accounts” devoted to underage-sex content, The Wall Street Journal has reported. Unlike forums and file transfer services, Instagram not only hosts such activities but promotes them via its algorithms. The company acknowledged enforcement problems and has taken actions including restricting its systems from recommending searches associated with sex abuse. 

“Child exploitation is a horrific crime,” Meta told the WSJ in a statement. “We’re continuously investigating ways to actively defend against this behavior.”

Along with the task force, Meta told reporters that it is working on blocking child sexual abuse material (CSAM) networks and taking steps to change its systems. In the last two years, it has taken down 27 pedophile networks and is working on removing more. It has blocked thousands of related hashtags (some with millions of posts) and has taken action to prevent its systems from recommending CSAM-related search terms. It's also trying to stop its systems from connecting potential abusers with each other.

However, the report should be a wakeup call for Meta, the company’s former security chief Alex Stamos told the WSJ. “That a team of three academics with limited access could find such a huge network should set off alarms at Meta,” he said, noting that the company has far better tools than outside investigators to map CSAM networks. “I hope the company reinvests in human investigators.”

Academics from Stanford’s Internet Observatory and UMass’s Rescue Lab were able to quickly find “large-scale communities promoting criminal sex abuse,” according to the report. After creating test users and viewing a single account, they were immediately hit with “suggested for you” recommendations of possible CSAM sellers and buyers, along with accounts linking to off-platform content sites. Following just a handful of those recommendations was enough to inundate the test accounts with sex-abuse content.

“Instagram is an onramp to places on the internet where there’s more explicit child sexual abuse,” said UMass Rescue Lab director Brian Levine. The Stanford group also found that CSAM content is “particularly severe” on the site. “The most important platform for these networks of buyers and sellers seems to be Instagram.”

Meta said it actively seeks to remove such users, having taken down 490,000 accounts that violated child safety policies in January alone. Its internal statistics show that child exploitation appears in fewer than one in 10,000 posts, it added.

However, until queried by reporters, Instagram allowed users to search for terms that its own systems knew might be associated with CSAM. A pop-up screen warned users that “These results may contain images of child sexual abuse,” which can cause “extreme harm” to children, but it then let them either “Get resources” or “See results anyway.” The latter option has since been disabled, but Meta didn’t respond when the WSJ asked why it was allowed in the first place.

Furthermore, attempts by users to report child-sex content were often ignored by Instagram’s algorithms. And Facebook’s own efforts to exclude hashtags and terms were sometimes overridden by its systems, which suggested users try variations on the name. In testing, researchers found that viewing even one underage seller account led the algorithm to recommend new ones. “Instagram’s suggestions were helping to rebuild the network that the platform’s own safety staff was in the middle of trying to dismantle,” the report noted.

A Meta spokesperson said it’s currently building systems to prevent such recommendations, but Levine said the time to act is now. “Pull the emergency brake. Are the economic benefits worth the harms to these children?” Engadget has reached out to Meta for comment.
