Several frameworks have been proposed to evaluate the security of biometric systems. Popular ones include Ratha's simpler framework and the enhanced framework of Bartlow and Cukic.
To employ these frameworks to evaluate iPhone X's biometric security, we'd need many data points that aren't available yet, so we won't speculate on the iPhone X facial biometrics implementation. However, we'd like to discuss the tangential topic of evaluating facial biometric security in general and draw a few parallels that may apply to iPhone X.
At Synopsys, we’ve evaluated several mobile applications’ facial biometric security implementations. Here we’ll briefly discuss a few attacks we explored during the assessments:
Liveness detection tests whether the subject is a live human rather than an inanimate object. To perform liveness detection, the applications we tested required the user to turn their head left and right. We removed this detection simply by hooking the mobile application and disabling the check at run time. Once the check was removed, a static photo was sufficient to bypass biometric authentication. Note that this technique can be extended to hook the method that returns the authentication status, bypassing the authentication system entirely. Such an attack might work only on biometric systems that don't derive a key for offline data protection from the biometrics.
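In spirit, the run-time hook worked like the following sketch. The class and method names here are hypothetical stand-ins for the app's real authentication flow (a real attack would use an instrumentation framework such as Frida against the running app), but the logic mirrors what we did: replace the liveness check so it always passes.

```python
class FacialAuthenticator:
    """Toy stand-in for a mobile app's facial biometric flow (hypothetical)."""

    def liveness_check(self, frames):
        # Real implementations verify head movement across frames;
        # this sketch just requires more than one distinct frame.
        return len(set(frames)) > 1

    def match_face(self, frames, enrolled_template):
        # Placeholder matcher: exact comparison against the enrolled template.
        return frames[0] == enrolled_template

    def authenticate(self, frames, enrolled_template):
        if not self.liveness_check(frames):
            return False
        return self.match_face(frames, enrolled_template)


auth = FacialAuthenticator()
template = "static_photo_of_victim"

# A single static photo fails liveness detection...
assert auth.authenticate(["static_photo_of_victim"], template) is False

# ...until the attacker hooks the check at run time and forces it to pass,
# much as we did by instrumenting the tested applications.
auth.liveness_check = lambda frames: True
assert auth.authenticate(["static_photo_of_victim"], template) is True
```

The same shadowing trick applied to `authenticate` itself illustrates the full-bypass variant: if nothing downstream cryptographically depends on the biometric match, overriding the status method is all it takes.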
Run-time hooking attacks might apply only to mobile applications implementing facial biometric systems without support from the underlying operating system—not to Face ID. Mounting a similar attack on iPhone X might require hardware attacks owing to the phone’s use of the TrueDepth camera system.
Brute force protection prevents attackers from making an excessive number of authentication attempts against the biometric system. The applications we tested stored the number of authentication attempts in a client-side location that we were able to reset to 0 with a script. In combination with the liveness detection bypass, this let us bypass biometric authentication completely using a photo of the user extracted from social media.
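The flaw looks roughly like this sketch. The file-based counter is hypothetical (the tested apps used their own storage), but the essential property is the same: the lockout state lives somewhere the attacker can write.

```python
import json
import tempfile
from pathlib import Path

# Hypothetical client-side attempt counter, stored where the user can write.
counter_file = Path(tempfile.gettempdir()) / "auth_attempts.json"

MAX_ATTEMPTS = 5

def read_attempts():
    if counter_file.exists():
        return json.loads(counter_file.read_text())["attempts"]
    return 0

def record_failed_attempt():
    counter_file.write_text(json.dumps({"attempts": read_attempts() + 1}))

def is_locked_out():
    return read_attempts() >= MAX_ATTEMPTS

# The attacker's "reset script": because the store is client-side and
# writable, zeroing it defeats the lockout entirely.
def reset_counter():
    counter_file.write_text(json.dumps({"attempts": 0}))


reset_counter()
for _ in range(MAX_ATTEMPTS):
    record_failed_attempt()
assert is_locked_out()

reset_counter()
assert not is_locked_out()  # unlimited guesses again
```

A lockout counter only resists this attack if it is enforced somewhere the attacker can't reach, such as a server or tamper-resistant hardware.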
Client-side controls such as these are typically bypassable. Hardware security controls raise the bar, but they aren't insurmountable. iPhones have historically shipped protections against brute force, and those protections have been bypassed. iPhone X's protections on this attack surface are unknown and yet to be evaluated.
The implementations we tested didn't derive encryption keys from the user's biometrics, which left the protected data accessible without biometric authentication. If you're looking to derive a biometric key, consider reading up on key-binding and key-generation biometric cryptosystems.
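To make the distinction concrete, here is a deliberately simplified sketch of deriving a data-protection key from a biometric template with a KDF. This assumes a perfectly stable template, which real biometrics don't provide; production systems use key-binding or key-generation biometric cryptosystems (e.g., fuzzy extractors) to tolerate noisy readings. The point is that when the key depends on the biometric material, hooking the app's yes/no checks no longer exposes the protected data.

```python
import hashlib

def derive_key(biometric_template: bytes, salt: bytes) -> bytes:
    # Simplified: hash a (hypothetically stable) template through PBKDF2.
    # Real biometric readings vary between captures, so real systems bind
    # or generate keys via error-tolerant constructions instead.
    return hashlib.pbkdf2_hmac("sha256", biometric_template, salt, 100_000)

salt = b"per-user-random-salt"
key = derive_key(b"enrolled-template", salt)

# Without the right template, the derived key differs, so the protected
# data stays encrypted even if the app's auth checks are hooked.
assert derive_key(b"enrolled-template", salt) == key
assert derive_key(b"attackers-template", salt) != key
```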
iPhone X uses Face ID for payments and for protecting offline data. In earlier iPhones, Touch ID was used to unlock the master key that protects offline data, and Face ID may play a similar role. Note that in the case of Touch ID, Apple doesn't derive a key from the user's biometrics; Touch ID's security depends on the isolation of the Secure Enclave from the rest of the operating system. This means a compromise of the OS doesn't affect the Secure Enclave without an additional exploit against the enclave. It wouldn't be too surprising if Face ID adopted a similar model for unlocking the master key.
A simpler attack is just what it sounds like: bypass authentication using a high-resolution photo displayed on an iPad and tilted left and right. This attack was hard to reproduce consistently, but we got it to work; with brute force protection disabled, we were able to log in within 5–15 attempts. It's unlikely to work against implementations with depth-sensing cameras, such as Windows Hello and Face ID.
The most reliable attack in some of our tests was a 3D model attack. An attacker can construct a 3D model from even a low-resolution photo (e.g., from social media) and turn it left and right. Demo versions of FaceGen Modeller (to generate the model) and FaceGen Artist (to move it left and right) were sufficient to bypass biometric authentication without tampering with the mobile application. Even if the biometric system uses life-sign protections that require the user to perform specific facial movements, it might be possible to bypass them with FaceRig, which animates models constructed in FaceGen with a complete expression palette for video chat.
Ideally, neither tilting nor 3D model attacks should work against iPhone X, owing to its depth-sensing camera.
A few attacks that we didn't get a chance to execute, for varying reasons, but that are worth considering include face-spoofing attacks and face masks constructed from 3D models. Apple claims it has protected iPhone X against face mask attacks by training its neural network engine to recognize such masks.
Windows Hello authentication using Intel's RealSense held its ground against twins; Apple, however, hasn't claimed the same for iPhone X and recommends that twins use a passcode (it could be a joke…?). iPhone X uses a depth-perceiving camera similar to that of Windows Hello devices, but its protections against evil twins and 3D-printed heads and masks are yet to be fully understood. I predict that as these systems evolve, compromising them may increasingly require tampering with the device (e.g., disabling liveness detection using hardware attacks) before they can be bypassed with a photo, model, or mask.