The FBI has demanded that all new homes built starting this year include a rear entrance with a special lock. The homeowner will have the key… but so will the FBI. The agency assures homeowners that the key will never be stolen or copied by nefarious criminals.
Wait - what? Is this true? Can they really get away with this in America? No, they can't. Relax, they didn't really ask and this isn't happening. It almost did, though, but the key wasn't to your home. It was, however, to the one item that probably contains as much personal information as a thief could find in your home: your smartphone. And the FBI wasn't asking the home builders, it was asking Apple.
Raise your hand if you remember the huge controversy following the terrorist attack in San Bernardino just a year ago, when the FBI demanded that Apple create a backdoor to its iPhone software to facilitate its investigation. You might vaguely recall that after the horrific attack, many conservatives and liberals found themselves in rare agreement about the need for Apple to create not only a key to the back door, but the back door itself. The FBI actually wanted Apple to build a parallel operating system that would weaken the security measures that come standard on iPhones so that investigators could, in essence, hack the device.
And now it's happening again. Following the frightening attack in a Minnesota shopping mall in which 10 people were stabbed (the attacker was killed by an off-duty law enforcement officer), the FBI became involved when ISIS claimed credit for the attack. Ditto for the attack at a Seattle-area mall where five people were killed. Sadly, other attacks have occurred and are likely to continue, putting the authorities in the position of needing as much intelligence as possible to keep such attacks from proliferating. The logical question of how to fight this menace with better intelligence has driven the discussion back to the same place: Should the authorities be given access, through a back door or otherwise, to someone's encrypted information?
It's leading to another collision between the FBI and Apple (and could ultimately bring Google into the picture if phones belonging to attackers turn out to run the Android operating system). You could view it as a collision between privacy and security, both of which are essential components of liberty, making it legally complex and subjectively arguable.
But the fact is that technology has overtaken the arguments. Regardless of the legal outcome of any actions against Apple it simply won't matter.
You see, Apple's devices can be set to allow 10 attempts at the correct lock code, after which the data on the phone is deleted. Permanently. In its drive to learn more about the terrorists who perpetrated the attack, the FBI demanded that Apple provide a workaround, since investigators were, logically, uncertain whether they could crack the code on the phone found in the terrorist's home within the available 10 attempts before the data disappeared.
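The lockout policy described above can be sketched in a few lines. This is a toy model for illustration only, not Apple's actual implementation; the class and method names are invented for the example.

```python
# Toy sketch of a 10-strikes-and-wipe passcode policy (an illustration
# of the idea described in the article, NOT Apple's real code).
MAX_ATTEMPTS = 10

class Device:
    def __init__(self, passcode: str):
        self._passcode = passcode
        self._failed = 0
        self.wiped = False

    def try_unlock(self, guess: str) -> bool:
        if self.wiped:
            return False            # data is gone; nothing to unlock
        if guess == self._passcode:
            self._failed = 0
            return True
        self._failed += 1
        if self._failed >= MAX_ATTEMPTS:
            self.wiped = True       # data deleted, permanently
        return False
```

The point of the design is that brute-forcing is hopeless: the wipe triggers after 10 misses, long before an attacker could work through all 10,000 possible four-digit codes.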
Privacy advocates lined up against national security advocates who wanted Apple to compromise their customers' privacy - even if it was just one customer - in order to expedite the investigation. Arguments ensued. Radio talk shows expounded. Online protests occurred. And then… crickets. Nothing. The hubbub diminished quickly after the FBI apparently dialed 1-800-HIRE-A-HACKER and paid someone $1,000,000 to hack the iPhone. Whether they ever got anything worthwhile is unknown. And the debate? Postponed until it became headlines again.
(Side note: Maybe the FBI should have done a little shopping first. Hackers can break into an iPhone with a $100 investment and a couple of days spent working through the 10,000 possible four-digit codes, 0000 through 9999. Someone proved that theory recently.)
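The side note's arithmetic is easy to check. The per-attempt timing below is an assumption chosen for illustration, not a measured figure from the demonstrated attack; only the code count is from the article.

```python
# Back-of-envelope check on the brute-force side note. The 40-second
# figure is an assumed cost per attempt (guess plus hardware reset).
total_codes = 10 ** 4                      # 0000 through 9999
seconds_per_attempt = 40                   # assumption for illustration
worst_case_days = total_codes * seconds_per_attempt / 86400
average_days = worst_case_days / 2         # expect to hit it halfway through
print(f"worst case {worst_case_days:.1f} days, "
      f"average {average_days:.1f} days")
```

Under these assumptions the average case comes out to roughly 2.3 days - "a couple of days," as claimed.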
The real issue here is: does this matter anymore? What the FBI said it was after most of all were text messages and records of phone calls. But as anyone who has ever used WhatsApp, Viber or Apple's own iMessage can attest, there are dozens of apps available for iOS and Android that encrypt messages, making them impossible to read without the decryption key. In fact, just during the past few weeks, the mother of all messaging apps, Facebook Messenger, with more than 900 million users, was modified to allow encrypted conversations.
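The claim that encrypted messages are unreadable without the decryption key can be shown with a toy scheme. Real apps rely on vetted protocols rather than anything this simple; the one-time-pad-style XOR below is only a minimal sketch of the principle.

```python
# Toy illustration: without the key, intercepted ciphertext is
# unreadable. This XOR-with-random-key sketch is for illustration
# only; production messaging apps use vetted cryptographic protocols.
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    return bytes(b ^ k for b, k in zip(data, key))

message = b"meet at the mall at noon"
key = secrets.token_bytes(len(message))    # shared secret key

ciphertext = xor_bytes(message, key)       # what an eavesdropper sees
recovered = xor_bytes(ciphertext, key)     # what the recipient sees
assert recovered == message
```

An eavesdropper who captures `ciphertext` in transit - or pulls it off a seized phone - learns nothing without `key`, which only the two endpoints hold. That is exactly why a back door into the handset buys so little.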
With tools like this available to anyone, what difference would it make even if the FBI won the right to force manufacturers to provide a back door? Is there even a point to getting in the back door anymore?
Political considerations aside, forcing Apple to provide a back door to its security measures is dangerous and futile. It's dangerous because if there's a way to get around the security, even one designed with the best intentions, it will either be leaked or figured out. Hackers really are that good. And it's futile because anyone who wants to keep their activities secret will use one of the aforementioned apps to communicate with their fellow conspirators, thus defeating the benefit of having a back door in the first place.
As someone who lost two friends on two planes and whose mother was evacuated from a building one block from the destroyed World Trade Center towers on 9/11, you'd think I'd want to do everything possible to stop or capture terrorists. And I do. But a back door to software or phone operating systems isn't the answer. Just as a master key to everyone's homes would eventually reach criminal hands, so would a software key - and that's not good for anyone's security.
In short, you can't improve security by making people's phones less secure.
This article is published as part of the IDG Contributor Network.