Facial Recognition Works on iPhone X. Sometimes.
The iPhone X might be the future of Apple’s smartphone design, but its lauded Face ID facial recognition system has an issue with people under 13: it finds them much harder to tell apart.
In a security guide published on Wednesday 27 September, Apple recommends that children under the age of 13 not use Face ID, because the probability of a false match is significantly higher for young children. The company said this was because “their distinct facial features may not have fully developed”.
While few young children are likely to be given a £999 iPhone, false matches are also more likely for twins and siblings. In all those situations, the company recommends concerned users disable Face ID and use a passcode instead.
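For third-party apps, the choice between biometrics and a passcode is handled by the system through Apple’s LocalAuthentication framework. The sketch below is purely illustrative of that fallback behaviour, not of Face ID itself; the function name and reason string are hypothetical, though the policy names are real.

```swift
import Foundation
import LocalAuthentication

// Minimal sketch: a third-party app asking the system to authenticate the user.
// .deviceOwnerAuthentication lets the system fall back to the device passcode
// when biometrics (Face ID or Touch ID) are disabled, unavailable, or fail.
func authenticateUser(completion: @escaping (Bool) -> Void) {
    let context = LAContext()
    var error: NSError?

    // Check whether any form of device-owner authentication is available.
    guard context.canEvaluatePolicy(.deviceOwnerAuthentication, error: &error) else {
        completion(false)
        return
    }

    context.evaluatePolicy(.deviceOwnerAuthentication,
                           localizedReason: "Unlock your account") { success, _ in
        DispatchQueue.main.async {
            completion(success)
        }
    }
}
```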
For most users (those over 13 without “evil twins”, as Apple’s head of iOS Craig Federighi describes them) the bigger concern is deliberate attacks. Touch ID, Apple’s fingerprint sensor, was famously bypassed just two days after it launched with the iPhone 5S, using a fake fingerprint placed over a real finger.
With Face ID, Apple has implemented a secondary system that exclusively looks out for attempts to fool the technology. Both the authentication and spoofing defence are based on machine learning, but while the former is trained to identify individuals from their faces, the latter is used to look for telltale signs of cheating.
“An additional neural network that’s trained to spot and resist spoofing defends against attempts to unlock your phone with photos or masks,” the company says. Even if a mask were made perfect enough to fool the identification network, the defensive system should still notice, just as a human would.
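Apple has not published how the two models are combined, so the following is only a conceptual sketch of the general pattern it describes: gating an identity match behind a separate anti-spoofing check. All names, scores and thresholds are hypothetical.

```swift
import Foundation

// Purely illustrative: Apple has not disclosed how Face ID combines its models.
// The idea is that a strong identity match is not enough on its own; a separate
// spoof-detection score must also clear its own bar.
struct FaceScanResult {
    let identityMatchScore: Double   // output of a (hypothetical) identification model
    let spoofLikelihood: Double      // output of a (hypothetical) anti-spoofing model
}

func shouldUnlock(_ scan: FaceScanResult,
                  matchThreshold: Double = 0.98,
                  spoofThreshold: Double = 0.05) -> Bool {
    // Reject first if the scan looks like a photo or mask, however well it
    // matches the enrolled face.
    guard scan.spoofLikelihood < spoofThreshold else { return false }
    // Only then accept on the strength of the identity match itself.
    return scan.identityMatchScore >= matchThreshold
}

// Example: a near-perfect mask might score highly on identity
// but still be flagged by the anti-spoofing check.
let maskAttempt = FaceScanResult(identityMatchScore: 0.99, spoofLikelihood: 0.7)
print(shouldUnlock(maskAttempt)) // false
```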
Apple is also confident that it won’t fall prey to the issues of algorithmic bias that have plagued many attempts to use neural networks at scale. High-profile examples of such failures include the photo-labelling system that tagged black people as gorillas, or the word-association model which holds that men are computer programmers and women are homemakers.
Whenever its initial training exposed a demographic shortcoming, Apple says, it “augmented the studies as needed to provide a high degree of accuracy for a diverse range of users”. Time, and millions of people around the world using the technology, will tell whether the effort worked, but the company sounds confident.
One area the system will struggle with, however, is facial coverings. Apple says that “Face ID is designed to work with hats, scarves, glasses, contact lenses and many sunglasses,” but ultimately two things dictate whether or not it has a chance of success.
The first is whether the coverings are transparent to infrared light, and the second is whether the system can see the eyes, nose and mouth. Some fabrics are more transparent to infrared than they may seem, but iPhone users who cover their faces may still be forced to rely on a passcode when out and about.
Separately, Apple has also confirmed that developers are not allowed to use the depth-sensing technology included in the iPhone X to create their own facial biometrics, a possibility which had concerned many privacy activists.
The raw depth-sensor data is not directly available to developers, but the camera API now allows them to receive a pixel-by-pixel measure of how far features in an image are from the lens, a capability intended to enable image manipulation such as Apple’s own portrait mode.
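A minimal sketch of how a developer would receive that per-pixel measure, assuming iOS 11’s AVFoundation depth APIs (AVCaptureDepthDataOutput and AVDepthData); the class name is illustrative and session setup, permissions and error handling are simplified.

```swift
import AVFoundation

// Minimal sketch of receiving per-pixel depth through the camera API (iOS 11+).
// Camera permissions and error handling are omitted for brevity.
final class DepthReceiver: NSObject, AVCaptureDepthDataOutputDelegate {
    let session = AVCaptureSession()
    let depthOutput = AVCaptureDepthDataOutput()

    func start() throws {
        // The front-facing TrueDepth camera on the iPhone X exposes depth data.
        guard let device = AVCaptureDevice.default(.builtInTrueDepthCamera,
                                                   for: .video,
                                                   position: .front) else { return }
        let input = try AVCaptureDeviceInput(device: device)

        session.beginConfiguration()
        if session.canAddInput(input) { session.addInput(input) }
        if session.canAddOutput(depthOutput) { session.addOutput(depthOutput) }
        session.commitConfiguration()

        depthOutput.setDelegate(self, callbackQueue: DispatchQueue(label: "depth"))
        session.startRunning()
    }

    // Called with an AVDepthData object: a per-pixel map of distance from the
    // lens, intended for effects such as portrait-style background blur.
    func depthDataOutput(_ output: AVCaptureDepthDataOutput,
                         didOutput depthData: AVDepthData,
                         timestamp: CMTime,
                         connection: AVCaptureConnection) {
        let depthMap: CVPixelBuffer = depthData.depthDataMap
        _ = depthMap // process or visualise the depth map here
    }
}
```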
That could theoretically be used to build a standalone authentication feature, albeit one that is less precise than Apple’s own, but the company has updated its App Store policies to prevent developers from attempting to do so.
“You may not attempt, facilitate, or encourage others to identify anonymous users or reconstruct user profiles based on data collected from depth and/or facial mapping tools,” the company’s developer guidelines now state.