The iPhone X’s facial recognition capabilities are once again at the center of privacy concerns, with the American Civil Liberties Union and the Center for Democracy and Technology today raising questions about how “effectively” Apple can enforce certain privacy rules surrounding face scanning (via Reuters). Specifically, the privacy groups are worried about how certain pieces of facial data can be taken off the iPhone X by developers who seek to create entertainment features with the new smartphone’s facial tracking software.
Facial data that is used to unlock the iPhone X — or data related to “Face ID” — is securely stored on the device itself and not in iCloud. However, Apple will let developers take certain pieces of this facial data off the user’s iPhone “as long as they seek customer permission and not sell the data to third parties,” according to contract terms seen by Reuters. This means that developers who want to use the iPhone X’s front-facing camera can get a “rough map” of the user’s face, as well as a “stream of more than 50 kinds of facial expressions.”
The data that developers can gather — which can then be stored on the developer’s own servers — is said to help monitor how often users blink, smile, or even raise an eyebrow. Although this data can’t unlock the iPhone X, according to documents about Face ID sent to security researchers, the “relative ease” with which developers can gain access to parts of a user’s facial data and add it to their own servers has led to the new concerns raised by the ACLU and CDT today.
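For context, the “rough map” and the “stream of more than 50 kinds of facial expressions” correspond to what Apple’s ARKit face-tracking API exposes to third-party apps. The sketch below, which assumes a simplified app shell rather than a complete on-device project, shows the kind of data an app receives once the user grants camera permission — nothing in it can unlock the phone, but all of it could in principle be sent to a developer’s servers:

```swift
import ARKit

// Minimal sketch of the face data ARKit hands to a third-party app.
// A real app would run this inside a full iOS project on a TrueDepth-
// equipped device; the class and delegate wiring here are simplified.
class FaceDataReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking requires the TrueDepth camera (iPhone X and later).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // The "rough map": a low-resolution triangle mesh of the face.
            let vertexCount = faceAnchor.geometry.vertices.count

            // The 50+ expression coefficients ("blend shapes"), each a
            // value from 0.0 (neutral) to 1.0 (fully expressed) — this is
            // how an app can tell if the user blinks, smiles, or raises
            // an eyebrow.
            let blink = faceAnchor.blendShapes[.eyeBlinkLeft]?.floatValue ?? 0
            let smile = faceAnchor.blendShapes[.mouthSmileLeft]?.floatValue ?? 0
            let brow  = faceAnchor.blendShapes[.browInnerUp]?.floatValue ?? 0

            print("vertices: \(vertexCount), blink: \(blink), smile: \(smile), brow: \(brow)")
        }
    }
}
```

Notably, this per-frame expression data is entirely separate from the mathematical representation Face ID stores in the Secure Enclave for authentication, which is why it cannot be used to unlock the device.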
That remote storage raises questions about how effectively Apple can enforce its privacy rules, according to privacy groups such as the American Civil Liberties Union and the Center for Democracy and Technology. Apple maintains that its enforcement tools – which include pre-publication reviews, audits of apps and the threat of kicking developers off its lucrative App Store – are effective.
[…] But the relative ease with which developers can whisk away face data to remote servers leaves Apple sending conflicting messages: Face data is highly private when used for authentication, but it is sharable – with the user’s permission – when used to build app features.
According to Jay Stanley, a senior policy analyst at the ACLU, the privacy issues surrounding facial recognition in the context of unlocking a smartphone “have been overblown.” Stanley explained, “The real privacy issues have to do with access by third-party developers.” The experts concerned about Face ID in this context are also not worried about “government snooping,” but rather about marketers and advertisers tracking how a user’s facial expressions react to their ads.
Apple has strict policies against developers using face data for advertising and marketing, but the groups cited concern over the company’s “inability to control what app developers do with face data once it leaves the iPhone X.” Stanley said that “the hard part” for Apple will be finding and catching the apps that might be violating these policies, meaning that the big household names probably won’t be of concern to Apple, “but there’s still a lot of room for bottom feeders.”
Now that the iPhone X is in the hands of reviewers, many have said that Face ID works quite well in a variety of conditions. Some outlets have tried to fool Face ID with large pieces of clothing, sunglasses, and “twin tests,” the last of which have returned mixed results. In its ongoing efforts to reassure customers of Face ID’s security and privacy, Apple released an in-depth security white paper in September to highlight and explain some of these features of Face ID.