Apple is planning to scan U.S. iPhones for images of child abuse, drawing applause from child protection groups but raising concern among some security researchers that the system could be misused by governments looking to surveil their citizens.
Apple said its messaging app will use on-device machine learning to warn about sensitive content without making private communications readable by the company. The tool, which Apple calls “neuralMatch,” will detect known images of child sexual abuse without decrypting people's messages. If it finds a match, the image will be reviewed by a human, who can notify law enforcement if necessary.

But researchers say the tool could be put to other purposes, such as government surveillance of dissidents or protesters.
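Apple has not published neuralMatch's internals, but the reported flow follows a familiar pattern: fingerprint an image on the device, compare the fingerprint against a list of known images, and escalate only on a match. The Python sketch below is a rough illustration of that pattern, not Apple's code; every name is hypothetical, and SHA-256 stands in for the perceptual hash a real system would need so that resized or re-encoded copies still match.

```python
import hashlib

# Fingerprints of known abuse images (empty placeholder; in practice these
# come from a clearinghouse, not the device itself).
KNOWN_IMAGE_HASHES: set[str] = set()

def fingerprint(image_bytes: bytes) -> str:
    """Fingerprint an image. SHA-256 is a deliberate simplification of a
    perceptual hash: a cryptographic hash only matches byte-identical files."""
    return hashlib.sha256(image_bytes).hexdigest()

def queue_for_human_review(image_bytes: bytes) -> None:
    """Placeholder for server-side escalation to a human reviewer, who may
    then notify law enforcement."""
    print("flagged for human review")

def scan_on_device(image_bytes: bytes) -> None:
    """Compare the fingerprint against the known list on the device; only a
    match is escalated, so unmatched images are never flagged or uploaded."""
    if fingerprint(image_bytes) in KNOWN_IMAGE_HASHES:
        queue_for_human_review(image_bytes)
```

The privacy argument rests on that last step: only fingerprints are compared locally, and the critics' worry is simply that nothing in the mechanism itself restricts what kinds of images the hash list contains.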
Matthew Green, a security professor at Johns Hopkins University who earlier posted his concerns on Twitter, told The Financial Times that Apple's move will “break the dam — governments will demand it from everyone.”
Tech companies including Microsoft, Google and Facebook have for years been sharing “hash lists” of known images of child sexual abuse. Apple has also been scanning iCloud, which, unlike its messages, is not end-to-end encrypted, for such images.
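A “hash list” lets providers share fingerprints of confirmed images rather than the images themselves; Microsoft's PhotoDNA perceptual hash is the best-known system of this kind. The toy sketch below (all data hypothetical, SHA-256 again standing in for a perceptual hash) shows how merged lists let any participant flag a known upload:

```python
import hashlib

def fingerprint(data: bytes) -> str:
    # Stand-in for a perceptual hash such as PhotoDNA.
    return hashlib.sha256(data).hexdigest()

# Each provider contributes fingerprints of confirmed images (toy data).
provider_a = {fingerprint(b"confirmed-image-1")}
provider_b = {fingerprint(b"confirmed-image-2")}
shared_list = provider_a | provider_b  # the merged industry hash list

uploads = [b"holiday-photo", b"confirmed-image-2"]
flagged = [u for u in uploads if fingerprint(u) in shared_list]
print(len(flagged))  # 1 -- only the known image matches
```

What sets Apple's plan apart from this established server-side practice is where the comparison runs: on the user's own phone rather than on images already uploaded to a company's servers.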