$150 Mask Fools iPhone X Facial Recognition
Researchers at Vietnamese security firm Bkav say they have broken the iPhone X Face ID facial recognition security feature, just a week after launch, with a mask they built for $150.
The mask was custom-made using 2D and 3D printers, with a silicone nose made by hand. It also used “special processing on the cheeks and around the face, where there are large skin areas,” the firm said.
Face ID uses artificial intelligence to distinguish real faces from images, videos or masks, but it “learns” a face over time. Each capture hones the AI’s ability to distinguish the owner from an impostor. In a Q&A, Bkav said that it understands “how AI of Face ID works and how to bypass it,” but hasn’t given specifics, nor said whether the iPhone X was “imprinted” with the mask from the beginning.
It only acknowledged that when Face ID was set up on the fooled device, “it learns from human face, just like normal.”
Other researchers and Apple itself have tried—and failed—to fool Face ID using a mask. “[Apple engineering teams] have even gone and worked with professional mask makers and makeup artists in Hollywood to protect against these attempts to beat Face ID,” said Apple senior vice president Phil Schiller, at the iPhone X launch event. “These are actual masks used by the engineering team to train the neural network to protect against them in Face ID. It’s incredible!”
Bkav has declined to explain why its efforts succeeded where others’ did not—thus, it’s unclear how the firm did it, or how momentous the “hack” really is.
Paul Norris, senior systems engineer for EMEA at Tripwire, pointed out via email that Bkav also hasn’t said how much tweaking was needed before the mask worked; if the first attempts failed, the chances of a successful real-world attack drop sharply. That’s because Face ID requires a passcode to be set up on the phone, and that passcode is demanded as additional security validation when the device has just been turned on or restarted; after five unsuccessful Face ID attempts; or when the device hasn’t been unlocked for more than 48 hours.
“Apple will disable the Face ID after five attempts, and force the user to enter a passcode, which should be secure,” Norris said. “In order to compromise Face ID authentication, the attacker would have to have a detailed map of the face of the user, create a mask that would map the exact details of the victim’s face, unlock the phone within five attempts, and do all of this within 48 hours. This seems like an unlikely sequence of events.”
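The fallback rules Norris describes can be summarized as a simple policy check. The sketch below is purely illustrative: the function and constant names are hypothetical, and this is a plain restatement of the conditions reported above, not Apple’s actual implementation.

```python
from datetime import datetime, timedelta

# Illustrative constants based on the conditions described in the article.
MAX_FACE_ID_ATTEMPTS = 5
FACE_ID_TIMEOUT = timedelta(hours=48)

def passcode_required(just_restarted: bool,
                      failed_attempts: int,
                      last_unlock: datetime,
                      now: datetime) -> bool:
    """Return True when the device falls back to requiring the passcode."""
    if just_restarted:
        return True  # device was just turned on or restarted
    if failed_attempts >= MAX_FACE_ID_ATTEMPTS:
        return True  # five unsuccessful Face ID attempts
    if now - last_unlock > FACE_ID_TIMEOUT:
        return True  # not unlocked for more than 48 hours
    return False
```

Under these rules, a mask attacker gets at most five tries and at most a 48-hour window before the stolen phone demands a passcode anyway, which is what makes Norris call the full attack an “unlikely sequence of events.”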
And, of course, the phone itself would need to be physically stolen in the first place.
“It’s important to note that the attacks being talked about are individual bespoke attacks that must be built and executed against each victim separately,” said Terry Ray, CTO of Imperva, via email. “This is in addition to stealing the individual’s phone and getting access to it before the owner can remotely wipe the device. Is your data so valuable that someone would go to this effort? For the vast majority of us, the answer is definitely no. However, for those few who feel they may be at threat, such a Mission Impossible-style attack might be possible.”
Bkav did say that attacks would likely be executed on high-profile targets: “FBI, CIA, country leaders, leaders of major corporations, etc. are the ones that need to know about the issue, because their devices are worth illegal unlock attempts,” it said. “Exploitation is difficult for normal users, but simple for professional ones.”
Yet, “Apple’s facial recognition was never intended to be a security measure for strong authentication,” said Josh Mayfield, director of product marketing at FireMon. “The hype around the automated log-in from staring at one’s phone was meant to give the user ease, rather than hardened security to prevent unauthorized access. The trouble with facial recognition is that too many humans have defining characteristics that cannot be dissected by a machine—we look too similar. The reason CAPTCHA is so effective is that there are subtleties that only a human eye can assess and accurately confirm.”
He added, “Strong authentication cannot be faked, gamed, or manipulated. Apple’s facial recognition begins with the opening assumption that the user gazing at the screen is likely to be the correct user. From there, the recognition system only seeks to confirm its assumption…never to seek to prove its assumption wrong.”
In other words, high-profile users at risk for such targeted attacks would likely not be using Face ID in the first place.
“Each person must decide which is the highest priority for them, convenience or security, and weigh the importance of each against the technology they choose to secure their personal data,” Imperva’s Ray said. “If convenience is more important, Face ID may be your choice. On the inverse, if security is your priority, until more is tested against Face ID, I’d suggest using only a passcode, all the time.”