If you believe Hollywood, or just about any modern crime or spy fiction, hacking our cellphones and computers is trivially easy for the resident tech guy on the team. He or she just needs the right connector and a few keystrokes, and they’re in. All of the secrets stored on the device are out in the open and at their disposal. From a law enforcement perspective, “if only it were that easy!” From a privacy and security perspective, “thank goodness it’s not.” From a paranoiac perspective, “what if it is?” To Apple, the FBI, and many others, the question of just what can and cannot be done is highly relevant right now.

In December 2015, husband and wife Syed Farook and Tashfeen Malik went on a shooting spree in San Bernardino, California, killing fourteen people and seriously injuring another twenty-two. Both perpetrators were later killed in a shoot-out with police, but the attack is still being investigated as an act of terrorism on US soil.

The FBI currently has possession of the Apple iPhone that was owned by Farook’s employer and issued to Farook, and is trying to gain access to the data stored on the device. The effort has apparently been a comedy of errors: the FBI directed Farook’s employer to reset the password on the associated iCloud account so investigators could access its backups, but the most recent backup was more than a month old and did not contain what the FBI hoped for.

As the device’s onboard storage is encrypted, and apparently the FBI doesn’t have the magic box needed to decrypt it, they first asked Apple to assist with gaining access to the encrypted storage on the device. Apple declined, citing privacy concerns for its customers. The FBI then applied to a federal court to use a little-known and relatively ancient law, the All Writs Act of 1789 (no, that’s not a typo, it really is over 200 years old!), to compel Apple to assist with the investigation, at least in part by creating a way to input passcodes electronically so that the FBI can brute-force the PIN rapidly rather than paying someone to type in each possible value by hand, one at a time.
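To make the numbers concrete, here is a minimal sketch, in Python, of what that brute-force loop looks like once passcodes can be submitted electronically and the built-in retry protections are out of the way. The try_passcode function is purely hypothetical, a stand-in for whatever interface Apple would have to create; nothing like it exists in iOS today.

```python
# Hypothetical sketch: brute-forcing a four-digit passcode once electronic
# entry is possible and the retry delays / auto-erase limit are disabled.
# try_passcode() is a made-up stand-in for the interface the FBI wants
# Apple to build; it does not exist in iOS.

def try_passcode(pin: str) -> bool:
    """Pretend to submit a passcode to the device and report success."""
    SECRET_PIN = "4971"  # unknown in reality; hard-coded here only for the demo
    return pin == SECRET_PIN

def brute_force():
    # A four-digit PIN has exactly 10,000 possible values: 0000 through 9999.
    for candidate in range(10_000):
        pin = f"{candidate:04d}"
        if try_passcode(pin):
            return pin
    return None

if __name__ == "__main__":
    print(f"Passcode recovered: {brute_force()}")
```

Even at a modest dozen or so guesses per second, the entire keyspace falls in well under fifteen minutes, which is why the ability to submit guesses electronically, without the phone’s own protections slowing things down, is what the FBI is after.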

Tim Cook, the CEO of Apple, stated: “The United States government has demanded that Apple take an unprecedented step which threatens the security of our customers. We oppose this order, which has implications far beyond the legal case at hand. This moment calls for public discussion, and we want our customers and people around the country to understand what is at stake.” Apple currently has until February 26, 2016 to respond to the writ, and has hired outside counsel to appeal the order.

The All Writs Act of 1789 authorizes United States federal courts to “issue all writs necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of law,” and that’s where this is going to get ugly in a hurry. On the one hand, it seems reasonable to empower a court to issue orders in pursuit of justice and in support of investigations, but the phrase “agreeable to the usages and principles of law” is where this gets really complicated. Agreeable to whom? If to the court that issues the writ, that is in essence a blank check for the judge to do as he or she sees fit. If not, then who oversees such writs and ensures they do not become an abuse of power or an egregious violation of Constitutional protections?

On the one side, this heinous act of terrorism seems to justify any and all actions necessary either to confirm that the perpetrators acted alone or to identify anyone who may have aided them or is planning something similar. Why would Apple not fall over themselves to create a backdoor into their product to help with this investigation and to protect the American people?

On the other side, this seems to be an outrageous overstep by a government that, when faced with its inability to hack a device, sees fit to compel a business to break its own product on little more than a fishing expedition, something that, even if technically possible, would badly damage customers’ trust in those products.

The government thinks Apple can create a custom version of iOS, install it onto Farook’s iPhone, decrypt the data, and hand over what they want. They are even willing to let Apple do the work in its own labs and destroy the backdoored version of iOS as soon as the phone’s data has been extracted. That sounds noble and reasonable, but also somewhat naïve: once that genie is out of the bottle, there will be no putting it back in. It also assumes that what the government wants is technically possible and can be done within reasonable time and at reasonable cost. Creating a new iOS and installing it onto the device might overwrite or destroy the very data the government wants. And even if it does not, the data is still encrypted with a key, and installing a new iOS won’t by itself decrypt anything; for this to work at all, there must be access to the key used to encrypt the data. That raises an interesting question: how, exactly, is a new iOS going to give the government the data?

In any event, it has been confirmed that Farook turned off iCloud backups more than a month before the attacks. It’s not unreasonable to assume that he also turned off location services, so unless he was stupid enough to make his plans over email and SMS and take selfies with his co-conspirators, it’s doubtful that the iPhone in question holds anything that investigators don’t already have from the cellular provider. In a world where even tweens know what “burner phones” are, it’s hard to believe that the perpetrators got into the country, lived here for more than a year, acquired assault weapons and ammunition, and carried out a brutal attack, yet kept details on their phone that could give investigators information they don’t already have.

But here’s where things stand now. Apple has until February 26th to respond, but this is a precedent-setting event that will probably spend years in court and have far-reaching implications across both the smartphone and encryption industries. Whether the government can compel a company to backdoor the security of its products is, to me, secondary to whether encrypted data can be gotten at by installing a backdoor after the fact. Either the data is encrypted and its security rests on the key material being kept both secret and separate from the encrypted data, or the key and the data are stored together, and then your four-digit PIN and the time it takes to try all 10,000 combinations is all that is keeping your data safe.
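To make that either/or concrete, here is a minimal sketch, assuming (purely for illustration) that the data-protection key is derived from nothing but the four-digit PIN using PBKDF2. Under that assumption, anyone with a copy of the encrypted data can try every PIN offline, away from the device entirely; the salt, iteration count, and “real” PIN below are all invented for the demo.

```python
# Minimal sketch: if the key were derived from nothing but the four-digit PIN,
# every possible key could be computed offline, away from the device entirely.
# The salt, iteration count, and sample PIN are illustrative only.
import hashlib

SALT = b"illustrative-salt"  # in a real design this would come from the device
ITERATIONS = 10_000          # arbitrary for the demo

def key_from_pin(pin: str) -> bytes:
    """Derive a 256-bit key from the PIN alone (the weak design being illustrated)."""
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), SALT, ITERATIONS)

# Pretend this key was recovered alongside the encrypted data.
captured_key = key_from_pin("4971")

# Offline search: derive the key for all 10,000 PINs and compare.
for candidate in range(10_000):
    pin = f"{candidate:04d}"
    if key_from_pin(pin) == captured_key:
        print(f"PIN recovered offline: {pin}")
        break
```

A laptop runs those ten thousand derivations in well under a minute, which is why the security of a sensible design rests on key material that cannot be computed from the PIN alone, kept secret and separate from the data it protects.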

Keep an eye on this case. It could be just as important to civil liberties, privacy, and security as any in the past twenty years.

What do you think? Should the government be able to compel Apple to backdoor the phone? Do you think that there will be anything useful to be found in the phone that hasn’t already been discovered? Leave a comment and share your thoughts!