Chat apps would be required to scan private messages for child abuse under new EU rules

The European Commission has proposed a contentious new regulation that would require chat apps like WhatsApp and Facebook Messenger to scan users' private messages for child sexual abuse material (CSAM) and "grooming" behavior. The proposal is similar to Apple's plans from last year, but critics say it is far more intrusive.

Following the leak of a draft of the regulation earlier this week, privacy experts slammed it. Cryptography professor Matthew Green tweeted: "This document is the most terrifying thing I've ever seen. It describes the most advanced mass surveillance apparatus ever deployed outside of China and the Soviet Union. I'm not exaggerating."

"This looks like a shameful general #surveillance law entirely unfitting for any free democracy," said Jan Penfrat of digital advocacy group European Digital Rights (EDRi). (A comparison of the PDFs reveals only minor differences between the leaked draft and the final proposal.)

The rule would impose a number of new obligations on "online service providers," a category that includes app stores, hosting companies, and any provider of an "interpersonal communications service."

Communication services like WhatsApp, Signal, and Facebook Messenger would face the most stringent requirements. If a company in this group receives a "detection order" from the EU, it must scan selected users' messages for known child sexual abuse material, as well as previously unseen CSAM and any messages that may constitute "grooming" or "solicitation of children." Detecting these last two categories of content would require machine vision tools and AI systems capable of analyzing the context of pictures and text messages.

Individual EU countries would issue these "detection orders," which the Commission says would be "targeted and specified" to limit privacy violations. However, the regulation is vague about how the orders would be scoped: whether they would be limited to specific individuals or groups, or could cover much broader categories of users.

According to privacy experts, the proposal could seriously weaken (or even break) end-to-end encryption. Although the proposal does not explicitly call for the end of encrypted services, experts say that requiring companies to install any software the EU deems necessary to detect CSAM in their systems would effectively render robust end-to-end encryption impossible. Because of the EU's influence on global digital policy, these measures could spread around the world, including to authoritarian countries.

"The only way to accomplish what the EU proposal seeks is for governments to read and scan user messages on a massive scale," Joe Mullin, senior policy analyst at the digital rights group Electronic Frontier Foundation, told CNBC. "If passed into law, the proposal would be a disaster for user privacy not just in the EU, but worldwide."

Beyond the encryption concerns, the Commission's decision to target previously unknown examples of CSAM as well as "grooming" behavior has also drawn criticism. According to the Commission, algorithmic scanners would be required to find this content while preserving the anonymity of targeted users. However, experts say such tools are prone to error and could open the door to governments spying on innocent people.