Meta said Wednesday that end-to-end encryption would now be the default setting for messages and calls on Messenger and Facebook, two of the most widely used apps on the internet.
The technology, which uses cryptography to add a layer of security to digital communications, has been widely promoted and lauded by security experts, who have said that it can prevent intrusions like unlawful surveillance and certain types of hacking.
But child safety groups and certain law enforcement organizations are calling for a reversal of Meta’s rollout, saying that default encryption will be a devastating blow to detecting and stopping child exploitation on Messenger and Facebook.
The dispute is the latest development in a debate that’s raged for years between privacy advocates pushing for more widespread use of end-to-end encryption and groups concerned about digital crime.
Meta had already offered opt-in encryption, but Wednesday’s announcement makes encryption automatic for all messages.
The National Center for Missing and Exploited Children called the announcement a “devastating blow to child protection.”
The nonprofit organization praised Meta’s previous work on the topic, noting that the company reported more than 20 million incidents of the transmission of child sexual abuse material on its platforms last year, more than any other tech company. But it warned that the expansion of encryption means “images of children being sexually exploited will continue to be distributed in the dark.”
The criticism comes as Meta faces intense scrutiny over child safety on its platform. On Wednesday, New Mexico Attorney General Raúl Torrez announced that his office was suing the company, calling Facebook and Instagram “breeding grounds” for predators targeting children. And in January, Meta CEO Mark Zuckerberg is scheduled to testify before a Senate committee about child exploitation issues on social media.
Meta is one of many companies that have adopted end-to-end encryption in their apps. WhatsApp, which is also owned by Meta, has featured end-to-end encryption since 2016, as has Signal; Telegram offers it as an opt-in feature for “secret chats.”
End-to-end encryption digitally encodes the content of messages in ways that make them almost impossible for anyone but the sender and recipient to decode, meaning the companies that operate the apps can relay messages without the ability to read them.
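The idea can be illustrated with a toy sketch. This is a simplified stand-in, not Meta’s actual protocol: real end-to-end encryption uses asymmetric key exchange (as in the Signal protocol), while here a one-time pad stands in for the cryptography. The point is only that the relaying server carries bytes it has no key to read.

```python
import secrets

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    """XOR the message with a random key of equal length (one-time pad)."""
    return bytes(k ^ p for k, p in zip(key, plaintext))

decrypt = encrypt  # XOR is its own inverse

# Sender and recipient share a key; the relay server never sees it.
message = b"meet at noon"
key = secrets.token_bytes(len(message))  # shared out-of-band in this toy model

ciphertext = encrypt(key, message)  # this is all the server ever carries
recovered = decrypt(key, ciphertext)  # only a key holder can do this step
assert recovered == message
```

Because the key exists only at the two endpoints, the operator in the middle can store and forward `ciphertext` but cannot recover `message` — which is exactly why both the privacy benefits and the content-moderation concerns described in this article follow from the same design.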
The Canadian Centre for Child Protection, an organization that works to improve child safety, said it believed Meta’s action would drastically cut the number of reports it says are crucial to pursuing child exploitation cases.
“Millions of reports by Meta that drive Canadian and international law enforcement responses against child sexual abuse and exploitation will soon cease to exist,” the group wrote. “This decision by Meta means it will lose its ability to effectively monitor these crimes unfolding across large swaths of their platforms, including Facebook and Instagram.”
Responding to the criticism, a Meta spokesperson said, “We don’t think people want us reading their private messages so have spent the last five years developing robust safety measures to prevent and combat abuse while maintaining online security. These include defaulting anyone who is under the age of 16 to more private settings when they join Facebook and limiting adults from sending private messages to teens if they aren’t friends. We continue to strengthen our enforcement systems to root out potentially predatory accounts.”
Gail Kent, director of messaging policy at Meta and a former British law enforcement official, said she does expect the number of reports from the company to decrease. But she said Meta has been developing machine learning technology to detect publicly posted signals that will help it identify potential predators earlier, leading to more comprehensive reports to law enforcement and reporting organizations.
“What we’re doing here is a net benefit to the safety of users including victims,” she said, emphasizing that increasing privacy can help protect marginalized groups who are vulnerable to censorship or surveillance.
“As a society, we should be stopping that harm from happening before it takes place,” Kent continued. “It’s a big change that we’re asking from law enforcement, I completely understand that.”
Some members of law enforcement have already spoken out about the changes.
James Babbage, director general for threats at the United Kingdom’s National Crime Agency, said in a statement: “As a result of Meta’s design choices, the company will no longer be able to see the offending occurring on their messaging platform, and law enforcement will no longer be able to obtain this evidence from them. This problem won’t go away; if anything it will likely get worse.”