Software Integrity

In the name of data security, Apple is fighting back

It would appear that Apple’s security strategy for protecting user data is so effective that even the FBI can’t decrypt an iPhone in the midst of a terror attack investigation. The ruling came down yesterday from a federal magistrate, ordering Apple to help the FBI unlock the iPhone used by San Bernardino shooter Syed Farook. Investigators believe unlocking the phone will lead them to more clues as to why Farook and his wife killed 14 people and injured dozens at a holiday party in December of last year.

Apple’s not giving in so easily. But why?

In a statement released yesterday, Apple CEO Tim Cook made it clear that he has no intention of complying with the order. In the statement, Cook writes that Apple doesn’t have a solution readily available, and that the backdoor to the iPhone demanded by authorities is “too dangerous to create.”

To better understand the security implications of this precedent, we asked a few of our own mobile security experts to explain.

Let’s use TSA luggage locks as an example. A TSA-recognized lock lets passengers lock their bags while giving any TSA agent the ability to open the lock and search the luggage. No matter how unique your own key is, a master key opens the lock.

It is reasonable to assume at least one TSA master key is now in the hands of someone outside of the TSA. This means the master key can be replicated and used by anyone who possesses a copy, including those with nefarious intent.

If Apple were to build the OS version the FBI requests (essentially a master key), it would only be a matter of time before the techniques used find their way into the open and are used by those with nefarious intent.

– Jim Ivers, Senior Director of Marketing

Open the door for some. Open the door for all?

If Apple were to give the FBI backdoor access past its encryption, the concern is that it would essentially be offering hackers the same access to user data.

But is there an alternate solution?

The device in question is an iPhone 5C, which lacks the Secure Enclave found in newer iPhones. Getting access to the data on the device might be in the public’s best interest.

In this case, Apple could probably create a modified version of iOS that runs only on that particular device and allows law enforcement to brute force the PIN/password used to protect it. Even if that version of iOS got into the wrong hands, it should not be usable on any other device.
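
To put the brute force problem in perspective, here is a minimal sketch (in Swift, purely for illustration) of how quickly numeric passcodes fall once retry limits and escalating delays are out of the way. The roughly 80 ms per guess figure is an assumption drawn from Apple’s public description of its hardware-entangled key derivation, not a measurement of this device.

```swift
import Foundation

// Rough worst-case time to brute force a numeric passcode once iOS's
// retry limits and escalating delays are removed. The ~80 ms per-guess
// cost is an assumption based on Apple's published description of its
// hardware-entangled key derivation.
let perGuessSeconds = 0.08

for digits in [4, 6, 8] {
    let combinations = pow(10.0, Double(digits))
    let worstCaseHours = combinations * perGuessSeconds / 3600
    print(String(format: "%d-digit PIN: %.0f guesses, worst case %.1f hours",
                 digits, combinations, worstCaseHours))
}
```

At that rate a four-digit PIN falls in well under an hour, which is why removing the retry protections is all the FBI really needs Apple to do.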

The problem is the precedent that this sets.

With the newer generation of Apple devices, installing a modified version of iOS would not help because the brute force protections are implemented in the Secure Enclave itself. This is good because if an iOS device is stolen, the attacker cannot modify the operating system on the device to brute force the user’s PIN/password.
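
For illustration, here is a minimal sketch of the escalating-delay policy Apple has documented for failed passcode attempts. On Secure Enclave devices, the enclave enforces this schedule itself, which is why replacing the operating system does not bypass it. The values follow Apple’s published iOS security documentation of the era; the function name is hypothetical.

```swift
import Foundation

// Escalating delays after failed passcode attempts, following Apple's
// published iOS security guide of the era. On Secure Enclave devices the
// enclave enforces this schedule in hardware, so a modified OS cannot
// skip it. (Function name is hypothetical, for illustration only.)
func delayBeforeNextAttempt(afterFailures failures: Int) -> TimeInterval {
    switch failures {
    case 0..<5: return 0            // first few failures: no delay
    case 5:     return 60           // 1 minute
    case 6:     return 5 * 60       // 5 minutes
    case 7, 8:  return 15 * 60      // 15 minutes
    default:    return 60 * 60      // 1 hour per attempt thereafter
    }
}
```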

Will the US government require Apple to build a backdoor into all Apple devices that takes away this protection and makes all users’ devices less secure? When it comes across a complex password that it cannot brute force, will it require Apple to limit PIN/password strength? When it decrypts data on a device and finds a messaging application that encrypts stored data using a separate password, will it require that application’s author to build in a backdoor as well? If so, all of these limitations and backdoors will make all users’ data significantly less secure.

– Amit Sethi, Senior Principal Security Consultant

Apple has five business days to notify the court if the corporation believes the order is unreasonably burdensome to its operations.