Apple Will Scan iPhones for Illegal Child Abuse Images, Sparking Privacy Debate
Apple announced on August 6 that it is planning to scan all iPhones in the United States for child abuse imagery, raising alarm among security experts who said the plan could allow the firm to surveil tens of millions of personal devices for unrelated reasons.
This is a giant step down a slippery slope by a corporate Peeping Tom, an abuse of technology that further whittles away at the privacy rights of Apple customers.
In a blog post, the company confirmed earlier reports, saying the new scanning technology is part of a suite of child protection programs that would “evolve and expand.” It will be rolled out as part of iOS 15, which is scheduled for release later in 2021.
Many parents take pictures of their nude newborn babies. In the United States and many other countries, proud parents display nude pictures of their children in their homes. Which clairvoyant Apple employee will be able to tell whether a child was abused before or after the picture was taken?
Apple, which has often touted itself as a company that promises to safeguard users’ right to privacy, appeared to try to preempt privacy concerns by saying that the software will enhance those protections by avoiding the need to carry out widespread image scanning on its cloud servers.
“This innovative new technology allows Apple to provide valuable and actionable information to [the National Center for Missing and Exploited Children] and law enforcement regarding the proliferation of known CSAM,” said the company, referring to an acronym for child sexual abuse material. “And it does so while providing significant privacy benefits over existing techniques since Apple only learns about users’ photos if they have a collection of known CSAM in their iCloud Photos account. Even in these cases, Apple only learns about images that match known CSAM.”
In its blog post, the Cupertino-based tech giant said the system will use breakthrough cryptography and artificial intelligence to find abuse material stored in iCloud Photos. The images will be matched against a database of known illegal images, the firm said, adding that if a certain number of those images are uploaded to iCloud Photos, the company will review them.
Those images—if they’re deemed illegal—will be reported to the National Center for Missing and Exploited Children. The software won’t be applied to videos, Apple added.
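As described, the system turns on two technical points: photos are compared against fingerprints of already known illegal images rather than being analyzed for new content, and no review happens until a threshold number of matches accumulates for an account. The sketch below is only meant to illustrate that threshold-matching logic; it is a simplified, assumption-laden stand-in (ordinary file digests in place of Apple’s perceptual NeuralHash, an in-memory set in place of the encrypted on-device database, and a made-up threshold), not Apple’s actual protocol.

```python
# Minimal sketch of threshold-based matching against a database of known image
# hashes. Illustrative only: the hash function, database contents, and threshold
# value are hypothetical stand-ins, not Apple's NeuralHash or safety-voucher system.

import hashlib
from pathlib import Path

# Hypothetical set of known-bad image fingerprints (in the real system these
# would be perceptual hashes supplied by a clearinghouse, not SHA-256 digests).
KNOWN_BAD_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

MATCH_THRESHOLD = 30  # hypothetical match count required before any human review


def image_fingerprint(path: Path) -> str:
    """Return a hex digest for an image file (stand-in for a perceptual hash)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def count_matches(photo_paths: list[Path]) -> int:
    """Count how many photos in a library match the known-hash database."""
    return sum(1 for p in photo_paths if image_fingerprint(p) in KNOWN_BAD_HASHES)


def should_escalate(photo_paths: list[Path]) -> bool:
    """Escalate for review only once the match count reaches the threshold."""
    return count_matches(photo_paths) >= MATCH_THRESHOLD
```

The threshold is the piece Apple points to in its privacy claim: under this kind of scheme, an isolated match reveals nothing, and only an accumulated collection of matching images triggers any review.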
“Apple’s expanded protection for children is a game-changer,” John Clark, the president and CEO of the National Center for Missing and Exploited Children, said in a statement on Thursday about the initiative. “The reality is that privacy and child protection can coexist.”
But some security experts and researchers, who stressed they support efforts to combat child abuse, said the program could present significant privacy concerns.
Ross Anderson, professor of security engineering at the University of Cambridge, described Apple’s proposed system as “an absolutely appalling idea,” according to the Financial Times. “It is going to lead to distributed bulk surveillance of … our phones and laptops,” he remarked.
When news of the proposal broke on Wednesday evening, Johns Hopkins University professor and cryptographer Matthew Green echoed those concerns.
“This sort of tool can be a boon for finding child pornography in people’s phones,” Green wrote on Twitter. “But imagine what it could do in the hands of an authoritarian government?”
Green said that “if you believe Apple won’t allow these tools to be misused [crossed fingers emoji] there’s still a lot to be concerned about,” noting that such “systems rely on a database of ‘problematic media hashes’ that you, as a consumer, can’t review.”
The expert told The Associated Press that he’s concerned Apple could be pressured by other, more authoritarian governments to scan for other types of information.
Microsoft created PhotoDNA to assist companies in identifying child sexual abuse images on the internet, while Facebook and Google have implemented systems to flag and review possibly illegal content.
The Epoch Times has contacted Apple for comment.
3 comments:
Sounds innocent, but there is a much bigger agenda behind this: mass surveillance, totalitarianism, etc.
Some software they came up with in the China plant. You’ll see a lot more of this, folks. Let’s see the proof of justification for this software. Stay tuned!
And now a company gets to decide if a photo you take is considered child abuse. No fake fighting photos. Just for the 1-in-1,000,000,000 chance a photo will show an actual child being abused sexually. It’s not appropriate for everyone to be spied on, on the off chance a crime might be punished. Because, mind you, Apple will only detect it after the crime. They are not preventing anything. Only using lies to track your photo library and sell the data.