Poisoned Apple

Sometimes it seems as if the term “secure software” is an oxymoron. With millions of lines of code, it’s impossible to build a sophisticated computer program – much less one as complex as an operating system – without some vulnerabilities. For the most part, software vendors try diligently to patch those problems as quickly as possible after they come to light.

In the security field, we’ve grown accustomed to a mad scramble that follows a familiar format: first, news breaks that a security researcher has discovered a hole that attackers could exploit. Then the vendor announces that a patch is in the works (and perhaps issues some interim workarounds to mitigate the threat until it arrives). Finally, after testing to ensure that the cure isn’t worse than the disease, the patch is released, and users and IT admins are urged to apply it immediately.

Sometimes vendors seem to drag their heels on getting a particular problem patched. Microsoft was severely criticized a couple of months ago for having failed to patch a vulnerability in Internet Explorer 8 that had been discovered in October 2013, more than half a year earlier. But Apple is now facing even more serious accusations: that the company has intentionally built exploitable vulnerabilities into its iOS mobile operating system, to serve as a “back door” that would enable law enforcement agencies (presumably with the proper warrants) to surreptitiously glean certain user data from the devices.

This was brought to light by a security researcher named Jonathan Zdziarski in a paper titled “Identifying Back Doors, Attack Points, and Surveillance Mechanisms in iOS Devices.” Zdziarski is a forensic scientist and the author of iPhone Forensics and a number of other books about hacking and securing iOS and iOS apps, and he has the credentials to make such allegations.

Of course, we already knew that wireless phone carriers were routinely giving records of phone calls and location information to federal, state and local law enforcement agencies. Last year mainstream news stories reported that the big cellular companies had confirmed this, noting that they receive many “emergency” requests from police that don’t require a warrant or court order. However, the information that Apple can provide to law enforcement via their “back doors” is much more intrusive. It includes not only the users’ call history, but also SMS messages, photos, videos, audio recordings, and contacts.

The good news is that while Apple can access the data in the default iOS apps, the company doesn’t appear to have access to information that’s stored in third-party apps or data that has been encrypted. Unfortunately, unlike most Android and some Windows phones, iPhones and iPads don’t allow users to store data on microSD cards, which would make it easier for users to protect their data, since the cards can easily be encrypted and can also be removed from the device.

The bad news is that as with any means of ingress, whether it’s a physical entrance in a building or a coding “hole” in software, the builders/owners (Apple employees in this case) aren’t the only ones who can sneak through it. Third parties who have the correct tools can collect much more data, above and beyond what Apple would turn over to law enforcement – including files that the user had supposedly deleted.

Zdziarski leaves little doubt that these back doors are there, lurking deep in the innards of the millions of iOS devices being carried around every day by unsuspecting consumers and business users. But there’s a saying, the gist of which is that one should never attribute to malevolence what can be adequately explained by incompetence.

Is it possible, then, that these vulnerabilities are accidental, something that Apple wasn’t aware of? Zdziarski doesn’t think so. He says Apple has been not just maintaining but enhancing the code that makes the intrusion possible – although he also emphasizes that he isn’t accusing Apple of doing so for malicious purposes. It’s those third parties, governmental or otherwise, that he’s more worried about. In his blog post on July 23, he notes Apple’s response to the reporting of the back doors, which it couches in terms of “diagnostic capabilities.”

As a former police officer, I completely understand the need for law enforcement agencies to have effective investigative tools to help prevent crime. In a post-September 11 world, most people would agree that there are circumstances in which some invasion of privacy is justified to avert criminal activities that can have catastrophic consequences. As a very active participant in this digital age, I also understand the ease with which such tools can be abused.

Apple is by no means unique in cooperating with law enforcement, and realistically speaking, it’s likely that the vendors of many popular software programs have similar hidden capabilities. But the wide distribution of iOS devices, combined with many users’ belief that their Apple products are secure, makes this issue worth the media attention it’s getting.