
Apple plans to scan iPhones for child abuse content


Aug 5 - Apple Inc is planning to install software on U.S. iPhones that will scan for child sexual abuse material (CSAM), the Financial Times reported on Thursday, citing people familiar with the matter.

Cybersecurity expert Matthew Daniel Green, an associate professor at the Johns Hopkins Information Security Institute in the US, tweeted about Apple's plan to launch a client-side system that detects child abuse images on iPhones. He said the tool, which is still under development, could eventually become a “key ingredient” in adding surveillance to encrypted messaging systems.
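The report does not describe how the system works internally, but client-side detection tools of this kind are commonly built on perceptual hashing: the device computes a compact fingerprint of each photo and compares it against a list of fingerprints of known illegal images, tolerating small pixel-level differences. The sketch below illustrates that general technique only; it assumes the Pillow imaging library, every name in it (average_hash, KNOWN_HASHES, the threshold) is hypothetical, and it does not reflect Apple's actual implementation, whose details were not public at the time of the report.

    # Illustrative sketch of client-side perceptual-hash matching.
    # NOT Apple's system: average hashing is a generic technique,
    # and all names here are hypothetical.
    from PIL import Image  # assumes the Pillow imaging library

    def average_hash(path: str, size: int = 8) -> int:
        """Downscale to a size x size grayscale image, then set one bit
        per pixel according to whether it is brighter than the mean."""
        img = Image.open(path).convert("L").resize((size, size))
        pixels = list(img.getdata())
        mean = sum(pixels) / len(pixels)
        bits = 0
        for p in pixels:
            bits = (bits << 1) | (1 if p > mean else 0)
        return bits

    def hamming_distance(a: int, b: int) -> int:
        # Number of differing bits between two fingerprints.
        return bin(a ^ b).count("1")

    # Hypothetical database of fingerprints of known illegal images,
    # shipped to the device as hashes only, never as images.
    KNOWN_HASHES: set[int] = set()

    def matches_known_image(path: str, threshold: int = 5) -> bool:
        # A small Hamming distance tolerates re-encoding and resizing,
        # which is also why such systems can produce false positives.
        h = average_hash(path)
        return any(hamming_distance(h, k) <= threshold
                   for k in KNOWN_HASHES)

The threshold captures the trade-off the surrounding paragraphs gesture at: looser matching survives recompression and cropping of a known image, but it also raises the risk of flagging innocent photos.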

“The way Apple is doing this launch, they're going to start with non-E2E [non-end-to-end] photos that people have already shared with the cloud. So it doesn't ‘hurt’ anyone's privacy. But you have to ask why anyone would develop a system like this if scanning E2E photos wasn't the goal,” Green said in a detailed thread on Twitter.

The new tool may raise user concerns: even with layers of protection against misuse, perceptual matching tolerates near-duplicate images by design, so it can turn up false positives on innocent photos. Governments could also abuse the system to search beyond illegal child content, for instance for media that could influence public attitudes on political matters.

Earlier this week, the company detailed its planned system, called "neuralMatch," to academics in the United States via a virtual meeting, the report said, adding that the plan could be publicized more widely as soon as this week.