Apple’s Fight: The Battle w/ Personal Privacy & National Security

The FBI wants Apple’s help to investigate a terrorist attack. Apple says providing this help is the real danger. We’ve reached a boiling point in the battle between tech companies and the government over encryption. And what happens will affect anyone who uses a smartphone, including you.

After the San Bernardino shootings, the FBI seized the iPhone used by shooter Syed Rizwan Farook. The FBI has a warrant to search the phone’s contents, and because it was Farook’s work phone, the FBI also has permission from the shooter’s employer, the San Bernardino County Department of Public Health, to search the device. Legally, the FBI can and should search this phone. That’s not up for debate. If the FBI gets a warrant to search a house and the people who own it say okay, there’s no ambiguity about whether it can search the house.

But if the FBI comes across a safe in that house, the warrant and permission do not mean it can force the company that manufactures the safe to create a special tool for opening its safes, especially a tool that would make other safes completely useless as secure storage. That’s the situation that Apple’s dealing with here.

Don’t sit there chuckling if you use an Android phone, by the way. If Apple is compelled to create this malware, it will affect anyone who uses technology to communicate, to bank, to shop, to do pretty much anything. The legal basis for requesting this assistance is the All Writs Act of 1789, an 18th-century law that is becoming a favorite of government agencies trying to get tech companies to turn over user data. The AWA is not really as obscure as Apple suggests, but it is a very broad statute that allows courts established by Congress to “issue all writs necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of law.”

The Department of Justice has even tried to use it before to force Apple to turn over suspects’ messages. I know 18th-century law sounds boring, but this is an 18th-century law that could fuck you big time.

The All Writs Act can only force a company to do something if it’s not an “undue burden.” It seems clear that making Apple create malware that fundamentally undermines its core security features is an enormous burden. And if it’s not deemed “undue” in this case, that sets a horrible precedent. After all, if compelling Apple to maim itself is allowed, compelling Google and Facebook and Microsoft to write security backdoors would also be allowed.

The battle between personal privacy and government information gathering has been building for years where Apple and security are concerned, and by narrowing it down to one specific iPhone used by a terrorist in the U.S., the government has pushed the debate to a new level. This may be Apple’s battle to lose, but it will be a very public one nonetheless.

Since Apple’s response to the FBI and the court order, the White House has stood by the Department of Justice, arguing that this isn’t about a backdoor for all devices but about a single device. Tim Cook’s argument already addressed that point.

Tim Cook’s open letter is on Apple’s homepage, and headlines about the government’s demands are all over the news. From my view, Apple customers seem to be overwhelmingly in favor of Tim Cook’s position, while presidential candidates are unsurprisingly siding with the FBI. Where do you stand? Here’s what we know so far.

Here is how Cook describes the government’s request:

The government would have us remove security features and add new capabilities to the operating system, allowing a passcode to be input electronically. This would make it easier to unlock an iPhone by “brute force,” trying thousands or millions of combinations with the speed of a modern computer.

Cook’s letter seems to acknowledge that it’s technically possible for Apple to comply, but that Apple has zero interest in doing something it considers dangerous for all customers. One security firm has also concluded that complying appears to be technically possible.

In practice, this would mean Apple creating a new version of iOS without limitations on how many times you can guess a passcode before the device locks up for a period of time. That limitation alone is what currently prevents the FBI from simply trying every possible passcode as quickly as possible.
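To put rough numbers on why those limits matter, here’s a back-of-the-envelope sketch (mine, not Apple’s or the FBI’s). The ~80 ms per guess is the approximate hardware key-derivation cost Apple has cited for iOS devices; treat that figure, and the whole calculation, as an assumption.

```swift
import Foundation

// Assumed per-guess cost: passcode key derivation is entangled with the
// device's hardware UID, and Apple has cited roughly 80 ms per attempt.
// Treat this number as an approximation, not a spec.
let secondsPerGuess = 0.08

/// Worst-case time, in seconds, to try every numeric passcode of `digits` length.
func worstCaseSeconds(digits: Int) -> Double {
    let combinations = pow(10.0, Double(digits))
    return combinations * secondsPerGuess
}

for digits in [4, 6] {
    let hours = worstCaseSeconds(digits: digits) / 3600
    print("\(digits)-digit passcode: up to \(hours) hours to exhaust")
}
// 4 digits: 10,000 combinations -> ~13 minutes.
// 6 digits: 1,000,000 combinations -> ~22 hours.
// The escalating delays and optional 10-attempt erase are what keep even
// these short times out of reach; removing them is what the order demands.
```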

iPhones with a passcode set are currently disabled after repeated failed guesses. Try too many incorrect passcodes on an iPhone and you’re temporarily only allowed to place emergency calls, at first for 1 minute. Try again after that and the lockout extends to 5 minutes, then 15 minutes, and so on.

Optionally, iPhones can be set to erase all data after just 10 failed attempts.
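To make the escalation concrete, here’s a minimal sketch of that lockout schedule in Swift. The 1/5/15-minute steps come straight from the description above; everything past the 15-minute tier is my assumption, since the full schedule isn’t spelled out here.

```swift
enum LockoutPolicy {
    /// The optional “Erase Data” setting: wipe the device after 10 failures.
    static let eraseAfterAttempts = 10

    /// Forced wait, in minutes, after a given number of consecutive failed
    /// passcode attempts. Tiers past 15 minutes are assumed, not documented here.
    static func delayMinutes(afterFailedAttempts attempts: Int) -> Int {
        switch attempts {
        case ..<5: return 0   // first few attempts: no forced wait
        case 5:    return 1   // emergency calls only for 1 minute
        case 6:    return 5
        case 7:    return 15
        default:   return 60  // assumption: roughly an hour per try from here
        }
    }
}

for attempts in 1...10 {
    if attempts >= LockoutPolicy.eraseAfterAttempts {
        print("Attempt \(attempts): data erased (if Erase Data is enabled)")
    } else {
        let wait = LockoutPolicy.delayMinutes(afterFailedAttempts: attempts)
        print("Attempt \(attempts): wait \(wait) minutes before the next try")
    }
}
```

Even under this assumed schedule, ten guesses cost more than two hours of forced waiting, and with Erase Data turned on the attack ends after ten tries. The throttling, not the passcode’s raw length, is what the court order asks Apple to remove.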

Because the FBI wants access to text messages, notes, photos, emails, and anything else saved on the iPhone in question, preserving the data is critical for the investigation. The FBI argues that the data protected behind the iPhone’s passcode could offer critical evidence as to how the attack in San Bernardino was planned, who else was involved, and whether any future attacks can be prevented.

The court order in this case applies only to this specific iPhone, but Tim Cook is right to argue that it would set a precedent to be used in future cases. The FBI had been publicly decrying iPhone encryption since long before the shooting in December.

One key takeaway for me: wow, who knew the passcodes on our iPhones were actually so secure?

Apple under Tim Cook’s leadership has been pitching privacy as a product for years now, starting largely in 2013 with the iPhone 5s and Touch ID. If iPhones were going to store fingerprints, Apple needed to promise customers they would be safe. The same goes for Health and HealthKit, Apple Pay, and many other new features and services. And the NSA/PRISM surveillance episode only strengthened Apple’s resolve to take its current position.

Now we find ourselves in the midst of an ongoing national debate over what’s more important: personal privacy or national security? The San Bernardino iPhone is being used to make the argument very specific, but make no mistake: it’s about a much larger divide that has been developing for years. Should Apple maintain its rock-solid position on encryption, or should it comply with the court order and the FBI’s request?

My suspicion is that most readers haven’t changed their position, but I am curious how many believe Apple should comply with the FBI in this specific instance.
