Last week, Apple announced that it is closing a serious security vulnerability in
the iPhone. It used to be that the phone's encryption only protected a small
amount of the data, and Apple had the ability to bypass security on the rest of
it.
From now on, all the phone's data is protected. It can no longer be
accessed by criminals, governments, or rogue employees. Access to it can no
longer be demanded by totalitarian governments. A user's iPhone data is now more secure.
To hear US law enforcement respond, you'd think Apple's move
heralded an unstoppable crime wave. See, the FBI had been using that
vulnerability to get into people's iPhones. In the words of cyberlaw professor Orin Kerr, "How
is the public interest served by a policy that only thwarts lawful search
warrants?"
Ah, but that's the thing: You can't build a backdoor that only the good guys can walk through.
Encryption protects against cybercriminals, industrial competitors, the Chinese
secret police and the FBI. You're either vulnerable to eavesdropping by any of
them, or you're secure from eavesdropping from all of them.
Backdoor access built for the good guys is routinely used by the bad guys.
In 2005, some unknown group surreptitiously used the
lawful-intercept capabilities built into the Greek cell phone system. The same thing happened in Italy in 2006.
In 2010, Chinese hackers subverted an intercept system
Google had put into Gmail to comply with US government surveillance requests.
Back doors in our cell phone system are currently being exploited by the FBI and unknown others.
This doesn't stop the FBI and Justice Department from pumping up the fear.
Attorney General Eric Holder threatened us with kidnappers and sexual predators.
The former head of the FBI's criminal investigative division went even further, conjuring up kidnappers who are also
sexual predators. And, of course, terrorists.
FBI Director James Comey claimed that Apple's move allows
people to "place themselves beyond the law" and also invoked that now overworked "child
kidnapper." John J. Escalante, chief of detectives for the Chicago police
department now holds the title of most hysterical: "Apple
will become the phone of choice for the pedophile."
It's all bluster. Of the 3,576 major offenses for which warrants were granted for communications
interception in 2013, exactly one involved kidnapping. And, more importantly,
there's no evidence that encryption hampers criminal investigations in any
serious way. In 2013, encryption foiled the police nine times, up from four in 2012 --
and the investigations proceeded in some other way.
This is why the FBI's scare stories tend to wither after public scrutiny. A
former FBI assistant director wrote about a kidnapped man who would never have
been found without the ability of the FBI to decrypt an iPhone, only to retract the point hours later because it wasn't
true.
We've seen this game before. During the crypto wars of the 1990s, FBI Director Louis Freeh
and others would repeatedly use the example of mobster John Gotti
to illustrate why the ability to tap telephones was so vital. But the Gotti
evidence was collected using a room bug, not a telephone tap. And those same
scary criminal tropes were trotted out then, too. Back then we called them the Four Horsemen of the Infocalypse: pedophiles,
kidnappers, drug dealers, and terrorists. Nothing has changed.
Strong encryption has been around for years. Both Apple's FileVault and
Microsoft's BitLocker encrypt the data on computer hard drives. PGP encrypts e-mail. Off-the-Record encrypts
chat sessions. HTTPS Everywhere encrypts your browsing. Android phones already come with encryption
built in. There are literally thousands of encryption products without back
doors for sale, and some have been around for decades. Even if the US bans the
stuff, foreign companies will corner the market because many of us have legitimate
needs for security.
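To illustrate just how readily available backdoor-free encryption is, here is a minimal sketch using Python's open-source cryptography package (chosen for illustration; it is not one of the products named above). The point is simply that the key is generated and held locally, so there is no escrow copy for a vendor or a government to hand over.

```python
# Minimal sketch: backdoor-free symmetric encryption with the open-source
# "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

# The key is generated locally and held only by the user; no vendor,
# carrier, or government receives an escrow copy.
key = Fernet.generate_key()
cipher = Fernet(key)

# Encrypt some data. Without the key, the result is just opaque bytes.
ciphertext = cipher.encrypt(b"contacts, photos, messages")

# Only the holder of the key can decrypt.
plaintext = cipher.decrypt(ciphertext)
assert plaintext == b"contacts, photos, messages"
```

Anyone can write or download something like this today, which is why a US ban on strong encryption would accomplish little beyond pushing the market overseas.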
Law enforcement has been complaining about "going dark" for
decades now. In the 1990s, they convinced Congress to pass a law requiring phone companies to
ensure that phone calls would remain tappable even as they became digital. They
tried and failed to ban strong encryption and mandate back doors for their use. The FBI tried and failed again to ban strong
encryption in 2010. Now, in the post-Snowden era, they're about to try again.
We need to fight this. Strong encryption protects us from a panoply of threats. It protects
us from hackers and criminals. It protects our businesses from competitors and
foreign spies. It protects people in totalitarian governments from arrest and
detention. This isn't just me talking: The FBI also recommends you encrypt your data for
security.
As for law enforcement? The recent decades have given them an unprecedented
ability to put us under surveillance and access our data. Our cell phones
provide them with a detailed history of our movements. Our call records, e-mail
history, buddy lists, and Facebook pages tell them who we associate with. The
hundreds of companies that track us on the Internet tell them what we're
thinking about. Ubiquitous cameras capture our faces everywhere. And most of us
back up our iPhone data on iCloud, which the FBI can still get a warrant for.
It truly is the golden age of surveillance.
After considering the issue, Orin Kerr rethought his position, looking at this in terms of a
technological-legal trade-off. I think he's right.
Given everything that has made it easier for governments and others to
intrude on our private lives, we need both technological security and legal restrictions to restore the traditional balance
between government access and our security/privacy. More companies should
follow Apple's lead and make encryption the easy-to-use default. And let's wait
for some actual evidence of harm before we acquiesce to police demands for
reduced security.
This essay previously appeared on CNN.com.
EDITED TO ADD (10/6): Three more essays are worth reading, as is this one on all the
other ways Apple and the government can still get at your iPhone data.
And a Washington Post editorial manages to say this:
How to resolve this? A police "back door" for all smartphones is
undesirable--a back door can and will be exploited by bad guys, too. However,
with all their wizardry, perhaps Apple and Google could invent a kind of secure
golden key they would retain and use only when a court has approved a search
warrant.
Because a "secure golden key" is completely different from a
"back door."
Source: schneier.com