(Bloomberg) -- Apple Inc. is delaying a system that would have scanned customers’ photos for signs of child sex abuse after fierce criticism from privacy advocates, who feared it could set the stage for other forms of tracking.
The company had announced the feature in early August, along with other tools meant to protect children and root out illicit pornography, and quickly faced concerns that it would create a backdoor through the company’s highly prized privacy measures. Apple scrambled to contain the controversy in the following weeks, saying it would tap an independent auditor to oversee the system, but the outcry persisted.
“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of child sexual abuse material,” Apple said in a statement Friday. “Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
The backlash has added to growing scrutiny of Apple in recent months. Earlier this week, the company agreed to change its App Store policies to address criticism that it’s anti-competitive. And employees have become increasingly vocal about problems within the company, including what they say is a lack of pay equity. The U.S. National Labor Relations Board is currently looking into two complaints from workers that originated with concerns about workplace safety and a lack of pay transparency.
Apple had planned a trio of new tools designed to help fight child sex abuse material, or CSAM. They included updates to the Siri digital assistant for reporting child abuse and accessing resources related to CSAM, as well as a feature in Messages that would scan incoming and outgoing images on children’s devices for explicit content.
The third feature was the most controversial: one that would analyze a user’s library in iCloud Photos for explicit images of children. If a customer was found to have such pictures in their library, Apple would be alerted, conduct a human review to verify the contents, and then report the user to law enforcement.
Privacy advocates such as the Electronic Frontier Foundation warned that the technology could be used to track things other than child pornography, opening the door to “broader abuses.” They weren’t assuaged by Apple’s plan to bring in the auditor and fine-tune the system, saying the approach itself can’t help but undermine the encryption that protects users’ privacy.
In its attempts to defend the new CSAM feature, Apple coached staff on how to field questions about it. It also said the system would flag an account only when about 30 or more potentially illicit pictures were detected.
Apple also is far from alone in taking such steps. Facebook Inc. has long had algorithms to detect such images uploaded to its social networks, and Google’s YouTube analyzes videos on its service for explicit or abusive content involving children. Adobe Inc. has similar protections for its online services.
Apple’s CSAM feature would work by deriving a so-called hash, a kind of digital fingerprint, from each of the user’s images and comparing those hashes with ones derived from a database of known explicit material. Some users have worried that they could be implicated simply for storing images of, say, their baby in a bathtub. But because the system cross-references only known child pornography, a parent’s personal images of their children are highly unlikely to produce a match.
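For illustration, here is a minimal sketch in Python of the matching-and-threshold idea described above. Apple’s published design uses a perceptual hash called NeuralHash plus cryptographic protocols so that nothing is revealed below the threshold; this toy version substitutes SHA-256 for the perceptual hash, and every name and data value in it is hypothetical apart from the roughly 30-match threshold reported earlier.

```python
import hashlib

# Hypothetical sketch only: Apple's real system computes a perceptual
# "NeuralHash" on-device; SHA-256 is an exact-match stand-in here.

MATCH_THRESHOLD = 30  # the article reports a threshold of about 30 matches

def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash (digital fingerprint) of an image."""
    return hashlib.sha256(image_bytes).hexdigest()

def count_matches(user_images, known_hashes):
    """Count how many of the user's images hash into the known database."""
    return sum(1 for img in user_images if image_hash(img) in known_hashes)

def should_flag_account(user_images, known_hashes):
    """Flag an account only once the match count reaches the threshold."""
    return count_matches(user_images, known_hashes) >= MATCH_THRESHOLD

# Invented example data:
database = {image_hash(b"known-image-1"), image_hash(b"known-image-2")}
library = [b"family-photo-bytes", b"known-image-1"]
print(count_matches(library, database))        # 1
print(should_flag_account(library, database))  # False: below the threshold
```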
Apple also tried to tamp down concerns about governments spying on users or tracking photos that aren’t child pornography. It said its database would be made up of images sourced from multiple child-safety organizations -- not just the National Center for Missing & Exploited Children, as was initially announced. The company also plans to use data from groups operating in different jurisdictions and said the independent auditor will verify the contents of its database.
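As a further hedged sketch, the multi-source idea can be read as requiring that a hash appear in lists from at least two independent organizations before it enters the matching database, so that no single supplier can unilaterally add entries. The code below is purely illustrative; the two-source rule, function names and data are assumptions, not Apple’s published specification.

```python
from collections import Counter

def build_match_database(org_hash_lists, min_sources=2):
    """Keep only hashes vouched for by at least `min_sources` organizations."""
    counts = Counter(h for org in org_hash_lists for h in set(org))
    return {h for h, n in counts.items() if n >= min_sources}

# Invented example data standing in for hash lists from two organizations:
ncmec_hashes = {"a1", "b2", "c3"}
second_org_hashes = {"b2", "c3", "d4"}
print(build_match_database([ncmec_hashes, second_org_hashes]))  # hashes in both lists
```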
The scanning also applies only to photos that customers upload to their iCloud accounts. Apple has said it would refuse any requests from governments to use its technology as a means to spy on customers.
The feature had been slated to go into effect before the end of the year, potentially overshadowing a flurry of Apple product announcements that are expected in the coming weeks. The company is rolling out updated iPhones, iPads, AirPods and Macs, as well as a new larger Apple Watch, people familiar with the matter have said.