Apple Uses Surveillance To Detect Child Abuse
Apple has announced a new system for detecting child abuse images, to be rolled out on a country-by-country basis. Built into iPhones, the system will automatically scan devices to identify whether they contain media featuring child sexual abuse. The initiative is part of a range of child protection features to be launched later this year in the US through updates to iOS 15 and iPadOS, and it will compare the images on users' devices against a database of known abuse images.
Apple said it would use the new system to screen photos for such images before they are uploaded from iPhones in the United States to its iCloud storage. Child safety groups praised Apple as it joined Facebook Inc, Microsoft Corp and Alphabet Inc's Google in taking such measures.
Detection of enough child abuse image uploads to guard against false positives will trigger a human review and a report of the user to law enforcement, Apple said. The company says the system is designed to reduce false positives to one in one trillion.
Apple's decision to check photos on the iPhone itself has raised concerns that the company is probing into users' devices in ways that could be exploited by governments. Many other technology companies check photos only after they are uploaded to servers.
Apple's iPhones, iPads and Macs will also integrate the new system, which checks images uploaded to iCloud in the US against known child sexual abuse material. The feature uses a cryptographic process that takes place partly on the device and partly on Apple's servers to detect those images and report them to the National Center for Missing and Exploited Children (NCMEC), and ultimately to US law enforcement.
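In outline, the on-device step amounts to fingerprinting each photo and testing it against a database of known-image fingerprints shipped with the operating system. The sketch below is a deliberate simplification: Apple's actual system uses a perceptual hash (NeuralHash) and blinded database entries the device cannot read, while here SHA-256 and all names are illustrative placeholders.

```swift
import Foundation
import CryptoKit

// Simplified sketch of on-device matching before upload. Apple's real design
// uses a perceptual hash (NeuralHash) and a blinded database; SHA-256 and the
// names below are placeholders for illustration only.

/// Fingerprint of an image's raw bytes (stand-in for a perceptual hash).
func imageFingerprint(_ imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

/// Fingerprints derived from known abuse images; in practice these ship
/// inside the OS in a blinded form the device itself cannot inspect.
let knownFingerprints: Set<String> = []

/// Run on the device before a photo is uploaded to cloud storage.
func matchesKnownImage(_ imageData: Data) -> Bool {
    knownFingerprints.contains(imageFingerprint(imageData))
}
```

In the real protocol, the result of a match is not revealed directly to the device or the server; it is folded into the encrypted "safety voucher" described below.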
Apple said it plans to expand the service over time, based on the laws of each country where it operates.
The company said nuances in its system, such as "safety vouchers" passed from the iPhone to Apple's servers that do not contain useful data, will protect Apple from government pressure to identify material other than child abuse images. Apple will also implement a human review process that acts as a backstop against government abuse. The company will not pass reports from its photo checking system to law enforcement if the review finds no child abuse imagery.
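The threshold mechanism can be sketched abstractly. In the hypothetical illustration below, every name and the threshold value are assumptions made for clarity; in Apple's published design the vouchers use threshold secret sharing, so the server cannot read, or even usefully inspect, voucher contents until the threshold is crossed.

```swift
import Foundation

// Hypothetical illustration of the "safety voucher" threshold idea: each
// upload carries a voucher that is useless to the server on its own, and
// only when enough vouchers for one account correspond to known images can
// their contents be unlocked for human review. Names and the threshold value
// are illustrative assumptions, not Apple's published parameters.

struct SafetyVoucher {
    let encryptedPayload: Data   // opaque to the server below the threshold
    let matchedKnownImage: Bool  // in the real protocol even this is hidden
}

let humanReviewThreshold = 30  // illustrative value

/// Returns the vouchers to escalate to human review, or nil when the account
/// is below the threshold and the server learns nothing at all.
func vouchersForReview(_ accountVouchers: [SafetyVoucher]) -> [SafetyVoucher]? {
    let matched = accountVouchers.filter { $0.matchedKnownImage }
    guard matched.count >= humanReviewThreshold else { return nil }
    return matched  // a person reviews these before any report is filed
}
```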
Regulators are increasingly demanding that tech companies do more to take down illegal content. For the past few years, law enforcement and politicians have cited the scourge of child abuse material in arguing against strong encryption, much as they had previously cited the need to curb terrorism.
A few resulting laws, including in Britain, could be used to force tech companies to act against their users in secret. Facebook's WhatsApp, the world's largest fully encrypted messaging service, is also under pressure from governments that want to see what people are saying, and it fears that pressure will now increase. WhatsApp chief Will Cathcart tweeted his opposition to Apple's new architecture.
"We've had personal computers for decades, and there has never been a mandate to scan the private content of all desktops, laptops or phones globally for unlawful content," he wrote. "It's not how technology built in free countries works.... This is an Apple built and operated surveillance system that could very easily be used to scan private content for anything they or a government decides it wants to control. Countries where iPhones are sold will have different definitions on what is acceptable”, he said.
Apple argued that it was not really going into people's phones, because data sent from its devices must clear multiple hurdles. For example, banned material is flagged by watchdog groups, and the identifiers are bundled into Apple's operating systems worldwide, making them harder to manipulate.
Critics suspect more complex motives in Apple's approach. They say the great technical lengths Apple has gone to in order to check images on a user's device, despite that process's privacy protections, only really make sense if the images are to be encrypted before they leave a user's phone or computer, making server-side detection impossible.
In that case, Apple might easily extend the detection system to photos on users' devices that aren't ever uploaded to iCloud, a kind of on-device image scanning that would represent a new form of invasion into users' offline storage.