by Barbara Ortutay and Frank Bajak | The Associated Press
Apple unveiled plans to scan US iPhones for images of child sexual abuse, drawing applause from child protection groups but raising concerns among some security researchers that the system could be misused by governments seeking to surveil their citizens.
Apple said its messaging app will use on-device machine learning to warn about sensitive content without making private communications readable by the company. The tool Apple calls “NeuralMatch” will detect known images of child sexual abuse without decrypting people’s messages. If it finds a match, the image will be reviewed by a human, who can notify law enforcement if necessary.
But researchers say the tool could be put to other purposes, such as government surveillance of dissidents or protesters.
Matthew Green, a top cryptography researcher at Johns Hopkins, warned that the system could be used to frame innocent people by sending them harmless but malicious images designed to register as matches for child pornography. That could fool Apple’s algorithm and alert law enforcement. “Researchers are able to do this very easily,” he said.
Tech companies including Microsoft, Google, Facebook and others have been sharing “hash lists” of known images of child sexual abuse for years. Apple has also been scanning user files stored in its iCloud service, which isn’t as securely encrypted as its messages, for such images.
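The hash-list matching described above can be sketched roughly as follows. This is an illustrative simplification: it uses an exact cryptographic hash, whereas real systems such as PhotoDNA and Apple’s NeuralMatch use perceptual hashes that still match when an image is resized or re-encoded. The hash list and function names here are hypothetical, not any company’s actual API.

```python
import hashlib

# Hypothetical shared list of digests of known prohibited images.
# Real systems store perceptual hashes (e.g. PhotoDNA), not SHA-256
# digests, so altered copies of the same image still match.
KNOWN_HASHES = {
    # sha256(b"foo"), standing in for a known image's digest
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def file_digest(data: bytes) -> str:
    """Return the SHA-256 digest of a file's raw bytes."""
    return hashlib.sha256(data).hexdigest()

def matches_known_image(data: bytes, known: set[str]) -> bool:
    """True if the file's digest is on the shared hash list,
    meaning it would be flagged for human review."""
    return file_digest(data) in known

# An exact byte-for-byte copy matches the list...
assert matches_known_image(b"foo", KNOWN_HASHES)
# ...but with an exact hash a single changed byte defeats the match,
# which is why production systems use perceptual hashing instead.
assert not matches_known_image(b"foo!", KNOWN_HASHES)
```

Because matching happens against hashes rather than the images themselves, providers can share the list without circulating the underlying material; the dispute described in this article is over who controls what goes on the list.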
Some say the technology could leave the company vulnerable to political pressure in authoritarian states such as China. “What happens when the Chinese government says, ‘Here’s a list of files we want you to scan’?” Green said. “Does Apple say no? I hope they say no, but their technology won’t say no.”
The company has been under pressure from governments and law enforcement to allow surveillance of encrypted data. Coming up with the new security measures required Apple to strike a delicate balance between cracking down on the exploitation of children and maintaining its high-profile commitment to protecting the privacy of its users.
Apple believes it has achieved that feat with technology it developed in consultation with several leading cryptographers, including Stanford University professor Dan Boneh, whose work has won the Turing Award, which is often called the technology industry’s version of the Nobel Prize.
The computer scientist who more than a decade ago invented PhotoDNA, a technology used by law enforcement to identify child pornography online, acknowledged the potential for abuse of Apple’s system but said it was far outweighed by the imperative of combating child sexual abuse.
“Is it possible? Of course. But is it something I’m worried about? No,” said Hany Farid, a researcher at the University of California at Berkeley, who argues that plenty of other programs designed to secure devices against various threats haven’t seen “this type of mission creep.” For example, WhatsApp offers users end-to-end encryption to protect their privacy, but also employs a system to detect malware and warn users not to click on harmful links.
Apple was one of the first major companies to adopt “end-to-end” encryption, in which messages are scrambled so that only their senders and recipients can read them. However, law enforcement has long pressed for access to that information to investigate crimes such as terrorism or child sexual abuse.
“Apple’s expanded protections for children are a game changer,” John Clark, president and CEO of the National Center for Missing and Exploited Children, said in a statement. “With so many people using Apple products, these new safeguards have life-saving potential for children who are being lured online and whose horrific images are being circulated in child sex abuse material.”
Thorn CEO Julia Cordua said Apple’s technology balances “the need for privacy with digital security for children.” Thorn, a nonprofit founded by Demi Moore and Ashton Kutcher, uses technology to help protect children from sexual abuse by identifying victims and working with tech platforms.
AP Technology writer Michael Liedtke contributed to this article.