One of the FBI’s Major Claims in the iPhone Case is Fraudulent
How the FBI describes the “auto-erase” feature
Let’s look at how the FBI describes the situation. The court order’s first and most urgently phrased request asks Apple to “bypass or disable the auto-erase function whether or not it has been enabled.”
A few days after the court order was issued, but before Apple had formally responded, the government filed a strongly worded motion to compel, which contained this description of the feature:
The FBI has been unable to make attempts to determine the passcode to access the SUBJECT DEVICE because Apple has written, or “coded,” its operating systems with a user-enabled “auto-erase function” that would, if enabled, result in the permanent destruction of the required encryption key material after 10 failed attempts at the [sic] entering the correct passcode (meaning that, after 10 failed attempts, the information on the device becomes permanently inaccessible)…
In sum, the government seeks the ability to make multiple attempts at determining the passcode without risk that the data subject to search under the warrant would be rendered permanently inaccessible after 10 wrong attempts.
To add urgency to its attempt to compel Apple to abuse its software signing keys, the FBI is painting a picture of “permanently inaccessible” data. But if its agents are doing their job, that’s just not the case.
How the “auto-erase” feature actually works
Here’s where the technical details come in. The iPhone protects its user’s data with a complex hierarchy of cryptographic keys. Some data is protected by multiple keys. Imagine a pile of letters and photos placed inside a locked box, with the box itself placed inside a locked filing cabinet. You’d have to have keys to the filing cabinet and the box to read any of the letters or see any of the photos. If either of these keys is destroyed, the letters and photos are lost forever.
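The locked-box-inside-a-cabinet analogy can be sketched as a toy model of key wrapping. To be clear, this XOR/SHA-256 construction is illustrative only, not real cryptography, and none of these names correspond to Apple's actual key hierarchy:

```python
import hashlib
import os

def xor_stream(key: bytes, data: bytes) -> bytes:
    """Toy stream cipher: XOR data against a SHA-256-derived keystream.
    Illustrative only; not a secure construction."""
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

# Two independent keys: a "box" key encrypts the letters, and a
# "cabinet" key encrypts (wraps) the box key itself.
box_key = os.urandom(32)
cabinet_key = os.urandom(32)

letters = b"dear diary, ..."
locked_box = xor_stream(box_key, letters)           # letters under the box key
wrapped_box_key = xor_stream(cabinet_key, box_key)  # box key under the cabinet key

# Reading the letters requires unwrapping with BOTH keys in turn.
assert xor_stream(xor_stream(cabinet_key, wrapped_box_key), locked_box) == letters

# If the cabinet key is destroyed, wrapped_box_key can never be unwrapped,
# so the letters are lost even though locked_box still sits on disk.
```

The point of the analogy survives the toy model: destroying any one key in the chain makes everything beneath it unreadable, no matter how intact the ciphertext is.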
When iOS decides to wipe out user data because the passcode guess limit has been reached (or for any other reason), it doesn’t actually erase all the data from its underlying storage; that would take several minutes. Instead, it just destroys one of the keys that protects the data, rendering that data permanently unreadable. The key that is erased in this case is called the “file system key”—and (unlike the hardwired “UID” key that we discussed in our previous blog post) it is not burned into the phone’s processor, but instead merely stored in what Apple calls “Effaceable Storage,” which is just a term for part of the flash memory of the phone designed to be easily erasable. Apple’s iOS Security Guide explains:
Since it’s stored on the device, this key is not used to maintain the confidentiality of data; instead, it’s designed to be quickly erased on demand (by the user, with the “Erase all content and settings” option, or by a user or administrator issuing a remote wipe command…). Erasing the key in this manner renders all files cryptographically inaccessible.
The file system key is like the key to the filing cabinet in our example: a small thing that is easy to destroy, which disables access to the rest of the information.
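This technique is often called “crypto-erase”: instead of scrubbing gigabytes of ciphertext, the system overwrites a tiny key. A minimal sketch, with class and field names that are illustrative rather than Apple’s actual internals:

```python
import os

class ToyFlash:
    """Toy model of crypto-erase; names are illustrative, not Apple's."""
    def __init__(self):
        self.effaceable_storage = os.urandom(32)      # the "file system key"
        self.user_data_ciphertext = os.urandom(4096)  # stands in for encrypted files

    def wipe(self):
        # Fast: overwrite 32 bytes of key material instead of erasing
        # every block of user data.
        self.effaceable_storage = b"\x00" * 32

flash = ToyFlash()
flash.wipe()
# The ciphertext is untouched on flash, but without the key it is unreadable.
assert flash.effaceable_storage == b"\x00" * 32
```

Note that the wipe touches only `effaceable_storage`; the encrypted user data is never modified, which is exactly why a backup of that small region matters so much in the next section.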
Why the FBI can easily work around “auto-erase”
So the file system key (which the FBI fears will be destroyed by the phone’s auto-erase security protection) is stored in the Effaceable Storage on the iPhone, in the “NAND” flash memory. All the FBI needs to do to avoid any irreversible auto-erase is simply to copy that flash memory (which includes the Effaceable Storage) before it tries 10 passcode attempts. It can then retry indefinitely, because it can restore the NAND flash memory from its backup copy.
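The backup-and-restore loop can be sketched against a toy phone model. The `ToyPhone` class and its interfaces are hypothetical stand-ins; a real effort would image and rewrite the physical NAND chips, not Python objects:

```python
import copy

class ToyPhone:
    """Toy model of the passcode counter and auto-erase; illustrative only."""
    def __init__(self, passcode: str):
        # "nand" holds the file-system key; auto-erase zeroes it.
        self.nand = {"file_system_key": b"secret-key", "failed_attempts": 0}
        self._passcode = passcode

    def try_passcode(self, guess: str) -> bool:
        if self.nand["file_system_key"] == b"":
            raise RuntimeError("data already crypto-erased")
        if guess == self._passcode:
            return True
        self.nand["failed_attempts"] += 1
        if self.nand["failed_attempts"] >= 10:
            self.nand["file_system_key"] = b""   # auto-erase triggers
        return False

def brute_force_with_restore(phone, candidates):
    backup = copy.deepcopy(phone.nand)          # image the flash ONCE
    for guess in candidates:
        if phone.nand["failed_attempts"] >= 9:  # next failure would erase
            phone.nand = copy.deepcopy(backup)  # restore the NAND image
        if phone.try_passcode(guess):
            return guess
    return None

# Every 4-digit passcode can now be tried without ever losing the key.
phone = ToyPhone("7391")
codes = (f"{i:04d}" for i in range(10000))
assert brute_force_with_restore(phone, codes) == "7391"
```

The restore step is what defeats the “permanently inaccessible” claim: the auto-erase counter and the file system key both live in the region that was backed up, so every restore puts the phone back in its pre-guessing state.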
What’s really going on here?
If this generally useful security feature is actually no threat to the FBI, why is it painting it in such a scary light that some commentators have even called it a “doomsday mechanism”? The FBI wants us to think that this case is about a single phone, used by a terrorist. But it’s a power grab: law enforcement has dozens of other cases where they would love to be able to compel software and hardware providers to build, provide, and vouch for deliberately weakened code. The FBI wants to weaken the ecosystem we all depend on for maintenance of our all-too-vulnerable devices. If they win, future software updates will present users with a troubling dilemma. When we’re asked to install a software update, we won’t know whether it was compelled by a government agency (foreign or domestic), or whether it truly represents the best engineering our chosen platform has to offer.
In short, they’re asking the public to grant them significant new powers that could put all of our communications infrastructure at risk, and to trust them not to misuse these powers. But they’re deliberately misleading the public (and the judiciary) to try to gain these powers. This is not how a trustworthy agency operates. We should not be fooled.