Apple has promised to fight a court ruling ordering it to develop a backdoor that would give the FBI access to the San Bernardino shooter's iPhone. But the case is about more than a technical undermining of basic security; there are serious legal issues at play here.
On Tuesday, a ruling signed by U.S. Magistrate Sheri Pym ordered Apple to assist the FBI in bypassing or disabling the auto-erase function in iOS, so the agency may access the phone recovered in the San Bernardino shooting case.
The iPhone was issued to Syed Rizwan Farook by the San Bernardino County Department of Public Health. Farook, along with his wife, killed 14 people last December during the San Bernardino terror attacks. What Farook did was evil, but that isn't what's being questioned by the courts. At issue is the iPhone that was recovered by law enforcement.
When the phone was issued for business use, the auto-erase feature was enabled, because the agency's security team considered it an important layer of protection.
What the court wants Apple to do is create a tool or process that would enable the FBI to bypass a security feature that protects the iPhone against the exact operation the FBI wants to attempt – a brute-force attack against the passcode.
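A rough sketch shows why that security feature matters. The numbers below are illustrative assumptions, not Apple's published figures: a four-digit numeric passcode, a hardware-bound delay per guess, and iOS's optional wipe after ten failed attempts.

```python
# Back-of-the-envelope illustration (assumed numbers, not Apple's firmware specs):
# why auto-erase defeats a passcode brute force even though the search space is tiny.

PASSCODE_SPACE = 10 ** 4      # 4-digit numeric passcode: 10,000 combinations
SECONDS_PER_GUESS = 0.08      # assumed ~80 ms per key-derivation attempt
AUTO_ERASE_LIMIT = 10         # device wipes its encryption key after 10 failures

def worst_case_hours(space: int, secs_per_guess: float) -> float:
    """Hours to try every passcode if software limits were removed."""
    return space * secs_per_guess / 3600

def brute_force_feasible(space: int, attempt_limit: int) -> bool:
    """With auto-erase on, an attacker gets only `attempt_limit` guesses."""
    return attempt_limit >= space

print(f"Without limits: ~{worst_case_hours(PASSCODE_SPACE, SECONDS_PER_GUESS):.2f} hours")
print(f"With auto-erase: feasible? {brute_force_feasible(PASSCODE_SPACE, AUTO_ERASE_LIMIT)}")
```

Under these assumptions, an unthrottled brute force would finish in well under a day, which is exactly why the FBI needs the throttling and auto-erase disabled first.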
The court order puts Apple in a sticky spot, because it means doing the very thing the company has been fighting against for quite some time now – creating a backdoor into its products for law enforcement use.
Suni Munshani, the CEO of Protegrity, said the public needs to avoid jumping on the fear of terrorism bandwagon and abandoning their rights to privacy. It is regrettable, he said, that the actions of a few terrorists "have brought us to think we must decide between personal privacy versus national security."
"That is a false choice because if we give the government a back door to override the very protections customers demand it won’t be long before criminals and less well-meaning governments use those back doors against all of us. Besides, if you make it more difficult for terrorists and other criminals to use one kind of technology, they will just move on to something else."
In an open letter to the public, Apple's Tim Cook called the ruling an "unprecedented step" that threatens the security of Apple's customers.
He said the company opposes the order and that it has implications far beyond the legal case at hand.
"Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.
"The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control."
While Apple might be able to assist the FBI in some way – the iPhone in question is a 5C, which lacks many of the additional security layers available on the latest iOS devices – doing so could cause problems down the line.
"[Apple] no doubt is concerned about setting precedent—that might be used against them in future cases. Legally, this case presents a case of first impression in the sense that we lack well-established guidelines in deciding how to balance digital security against national security in this specific context," said Dr. Wendy Patrick, Business Ethics Lecturer at San Diego State University.
"Accordingly, the judge (an ex-federal prosecutor herself) gave Apple five days to argue that her ruling would be unduly burdensome for them to comply with—which seems to be the direction they are heading."
But others have taken a different stance on the issue, going so far as to say Cook's statements were less than genuine.
"But Cook is being disingenuous. Apple is not being asked to hand over a backdoor or a master key. It's not being asked to decrypt Farook's iPhone. Rather, Apple is being asked to let FBI technicians decrypt the phone themselves. That's very different," wrote Paul Wagenseil, a senior editor at Tom's Guide.
Other experts have made similar arguments. Robert Graham, of Errata Security, blogged that Apple isn't being ordered to develop a backdoor: "the court order explicitly wants Apple to limit the special software for only this phone, so it wouldn't be something the FBI could use on other phones."
Lance James, Chief Scientist at Flashpoint, argued it's apples and oranges, saying that Tim Cook's letter was premature.
James' take, shared by others who hold this stance, is that the judge asked Apple to provide assistance that applies only to a single phone. Even if Apple modifies the firmware with a key, it isn't required to give that software to the FBI.
"All companies have a way to modify their own devices and software - it’s like car companies having spare keys for individual cars… they exist. They don’t have to provide a back door to the FBI - they can provide a sub key, individual key, or Apple can take the device and unlock it and give them the data they requested," James said.
"This argument is not the proper argument. If they want to say it’s about privacy, Apple shouldn’t have the backdoor either. It would have been better for Tim Cook and Apple to say: We simply just don’t have a way to do that…"
Again, the technical aspects of the situation are fairly clear: it is possible Apple could disable the security features as ordered. Exactly how it would do so is still being debated online and among security professionals.
Looking ahead though, there are other things to consider. If Apple were forced to do what the court ordered, would that mean the courts could order other device manufacturers or software developers to do the same in the future? Does compliance mean Apple will have to develop bypasses for future iOS devices?
"That's precisely why we at EFF are concerned and will be supporting Apple in this case. The FBI has brought this case to create a precedent that they can order a developer to insert a backdoor," said Nate Cardozo, EFF Staff Attorney, when asked for his thoughts on the case.
"If the FBI wins here, and Apple is forced to subvert its security infrastructure to sign what amounts to a malicious update, then nothing will prevent the FBI from obtaining orders requiring other developers to insert arbitrary code in their products," Cardozo added.
"Worse still, nothing will prevent foreign governments--including regimes such as China that have dubious human rights records--from demanding the same capabilities. In other words, the FBI is asking a court to endorse a legal theory that could be used to compel backdoors in essentially anything."
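The "forced to sign a malicious update" point rests on how code signing works: a device refuses any firmware image whose signature doesn't verify against the vendor's key, so only the vendor can produce an installable build. A minimal toy sketch of that trust model, using Python's standard-library `hmac` as a symmetric stand-in for Apple's real asymmetric signing (the key name and firmware strings are hypothetical):

```python
import hashlib
import hmac

# Toy stand-in for code signing. Real iOS updates use asymmetric signatures;
# this hypothetical shared key just illustrates the trust model: whoever holds
# the signing key, and only them, can produce updates the device will accept.
VENDOR_KEY = b"hypothetical-vendor-signing-key"

def sign(firmware: bytes, key: bytes) -> bytes:
    """Produce a signature over a firmware image."""
    return hmac.new(key, firmware, hashlib.sha256).digest()

def device_accepts(firmware: bytes, signature: bytes) -> bool:
    """The device installs an image only if its signature checks out."""
    expected = sign(firmware, VENDOR_KEY)
    return hmac.compare_digest(expected, signature)

official = b"official firmware image"
good_sig = sign(official, VENDOR_KEY)

print(device_accepts(official, good_sig))          # vendor-signed image: accepted
print(device_accepts(b"tampered image", good_sig)) # anything else: rejected
```

This is why the FBI cannot build the tool itself, and why compelling Apple's signature is the crux of the precedent the EFF describes.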