New Delhi: Apple Inc. has developed a new tool that will scan iPhones in the US for images of child sexual abuse, media reports said. While child protection groups have welcomed the new technology, many security researchers have warned that it could be misused.


Apple also plans to scan encrypted messages on iPhones for sexually explicit content as an additional child safety measure, a move that has alarmed privacy advocates, AP reported.


Called 'neuralMatch', the tool detects known images of child sexual abuse by scanning photos before they are uploaded to iCloud. If it finds a match, the photo will be reviewed by a human. If child pornography is confirmed, the user's account will be disabled and the US National Center for Missing and Exploited Children (NCMEC) will be notified.
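Apple has not published neuralMatch's internals, but the reported design is a classic fingerprint-matching pipeline: each photo is hashed and the hash compared against a database of fingerprints of known abuse imagery. The Python sketch below is a minimal illustration of that idea, not Apple's code; the function names and the KNOWN_FINGERPRINTS database are hypothetical, and the cryptographic hash is a stand-in for the perceptual hashes real systems use to tolerate resizing and re-encoding.

```python
# Illustrative sketch only; Apple has not published neuralMatch's code.
# Real systems use perceptual hashes that survive re-encoding and resizing;
# SHA-256 here is merely a stand-in for such a fingerprint function.
import hashlib
from pathlib import Path

# Hypothetical database of fingerprints of known abuse images, distributed
# as opaque hashes so the scanner never needs to hold the images themselves.
KNOWN_FINGERPRINTS: set[str] = set()

def fingerprint(image_path: Path) -> str:
    """Compute a fingerprint of the image bytes (stand-in for a perceptual hash)."""
    return hashlib.sha256(image_path.read_bytes()).hexdigest()

def matches_known_image(image_path: Path) -> bool:
    """Check a photo against the fingerprint database before upload.

    In the reported design a match is not acted on automatically: it is
    flagged for human review, and only a confirmed match leads to the
    account being disabled and NCMEC being notified.
    """
    return fingerprint(image_path) in KNOWN_FINGERPRINTS
```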


There are, however, privacy concerns. Matthew Green, a top cryptography researcher at Johns Hopkins University, told AP that innocent people could be framed: someone could send them harmless-looking images engineered to trigger matches, fooling Apple's algorithm and alerting law enforcement. He said such systems can be tricked "pretty easily".


Tech companies including Microsoft, Facebook and Google have for years shared digital fingerprints of known child sexual abuse images. Apple has used those fingerprints to scan for child pornography among user files stored in its iCloud service, which is not as securely encrypted as on-device data.


But Apple, one of the major tech firms to embrace 'end-to-end' encryption, has been under pressure from authorities for access to user information to help solve crimes such as terrorism or child sexual exploitation.


Researchers, meanwhile, are concerned about government surveillance, especially of protesters or dissidents. The Center for Democracy and Technology, a Washington-based non-profit, has urged Apple to abandon the changes, which it claimed would destroy the company’s 'end-to-end encryption' guarantee.


Scanning messages for sexually explicit content on devices effectively breaks that security, AP quoted the CDT as saying.


Child rights activists and protection groups have, however, praised 'neuralMatch'.


In a statement, John Clark, the president and CEO of the National Center for Missing and Exploited Children, said: "Apple's expanded protection for children is a game-changer."


He said that since a large number of people use Apple products, these new safety measures have "lifesaving potential for children".


According to Julia Cordua, CEO of Thorn, a non-profit that uses technology to help protect children from sexual abuse, Apple's tech balances "the need for privacy with digital safety for children".