Linked by Thom Holwerda on Sat 19th Jul 2014 19:06 UTC

Jonathan Zdziarski's paper about backdoors, attack points and surveillance mechanisms built into iOS is quite, quite interesting.

Recent revelations exposed the use (or abuse) of operating system features in the surveillance of targeted individuals by the National Security Agency (NSA), of whom some subjects appear to be American citizens. This paper identifies the most probable techniques that were used, based on the descriptions provided by the media, and today’s possible techniques that could be exploited in the future, based on what may be back doors, bypass switches, general weaknesses, or surveillance mechanisms intended for enterprise use in current release versions of iOS. More importantly, I will identify several services and mechanisms that can be abused by a government agency or malicious party to extract intelligence on a subject, including services that may in fact be back doors introduced by the manufacturer. A number of techniques will also be examined in order to harden the operating system against attempted espionage, including counter-forensics techniques.

This paper is actually half a year old - give or take - but it's gotten a lot of attention recently due to, well, the fact that he has uploaded a PowerPoint from a talk about these matters, which is obviously a little bit more accessible than a proper scientific journal article.

For instance, despite Apple's claims of not being able to read your encrypted iMessages, there's this:

In October 2013, Quarkslab exposed design flaws in Apple's iMessage protocol demonstrating that Apple does, despite its vehement denial, have the technical capability to intercept private iMessage traffic if they so desired, or were coerced to under a court order. The iMessage protocol is touted to use end-to-end encryption, however Quarkslab revealed in their research that the asymmetric keys generated to perform this encryption are exchanged through key directory servers centrally managed by Apple, which allow for substitute keys to be injected to allow eavesdropping to be performed. Similarly, the group revealed that certificate pinning, a very common and easy-to-implement certificate chain security mechanism, was not implemented in iMessage, potentially allowing malicious parties to perform MiTM attacks against iMessage in the same fashion.
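The missing certificate pinning Quarkslab describes is easy to illustrate. The sketch below shows the idea in minimal form - the client ships with a fingerprint of the expected server certificate and rejects anything else, even a certificate that would otherwise validate against a trusted CA. All names and byte strings here are illustrative stand-ins, not anything from the actual iMessage protocol:

```python
import hashlib

# Hypothetical pinned fingerprint: a real client would embed the SHA-256
# of the genuine server's DER-encoded certificate at build time.
PINNED_SHA256 = hashlib.sha256(b"legitimate-server-cert-der").hexdigest()

def pin_ok(server_cert_der: bytes) -> bool:
    """Accept a connection only if the presented certificate matches the pin."""
    return hashlib.sha256(server_cert_der).hexdigest() == PINNED_SHA256

# A MITM proxy presenting its own (otherwise CA-valid) certificate fails:
print(pin_ok(b"legitimate-server-cert-der"))  # True
print(pin_ok(b"mitm-proxy-cert-der"))         # False
```

Without a check like this, a party that controls the key directory - or that can interpose a valid-looking certificate - can transparently eavesdrop, which is exactly the weakness Quarkslab pointed out.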

There are also several services in iOS that could facilitate spying by organisations like the NSA, yet these features have no reason to be there. They are not referenced by any (known) Apple software, do not require developer mode (so they're not debugging tools or anything), and are available on every single iOS device.

One example of these services is a packet sniffer, which "dumps network traffic and HTTP request/response data traveling into and out of the device" and "can be targeted via WiFi for remote monitoring". It runs on every iOS device. Then there's a second service, which "completely bypasses Apple’s backup encryption for end-user security", "has evolved considerably, even in iOS 7, to expose much personal data", and is "very intentionally placed and intended to dump data from the device by request".

This second one, especially, only gave relatively limited access in iOS 2.x, but in iOS 7 has grown to give access to pretty much everything, down to "a complete metadata disk sparseimage of the iOS file system, sans actual content", meaning time stamps, file names, names of all installed applications and their documents, configured email accounts, and a lot more. As you can see, the exposed information goes quite deep.

Apple is a company that continuously claims it cares about security and your privacy, yet it actively makes it easy to get at all your personal data. There's a massive contradiction between Apple's marketing fluff on the one hand, and the reality of the access iOS provides to your personal data on the other - down to outright lies about Apple not being able to read your iMessages.

Those of us who aren't corporate cheerleaders are not surprised by this in the slightest - Apple, Microsoft, Google, they're all the same - but I still encounter people online every day who seem to believe the marketing nonsense Apple puts out. People, it doesn't get much clearer than this: Apple does not care about your privacy any more or less than its competitors.

Thread beginning with comment 592817

I think OP was referring to the companies that hold our private data (be it google/apple/ms/lavabit/etc) and are told they must release our records/files/contacts/etc to the NSA. We're vulnerable to this kind of spying regardless of device, and source code licenses don't really factor in.

I doubt the NSA is "oppressing" ordinary software developers who merely distribute software without possessing troves of user information. Any evidence of this would be informative though.

You are correct. I was mistaken, and I should have read the OP of this sub-thread more carefully.

However, the original article of this thread clearly points to vulnerabilities exploitable in iOS devices, which (last time I checked) have hardware and software that is completely closed and proprietary.


Alfman:


However, the original article of this thread clearly points to vulnerabilities exploitable in iOS devices, which (last time I checked) have hardware and software that is completely closed and proprietary.

Yes, and I find it appalling that it's so difficult to find 100% open source versions of these things. However, it's not clear to me how much open source improves the security situation.

I'm not aware of any research showing that the rate of accidental vulnerabilities is better or worse in FOSS code than in proprietary code; I assume it's similar. As far as intentional vulnerabilities go, those can be added to the shipped product yet be omitted from the public source. It's not trivial to prove our devices are actually running the public sources, and virtually impossible if, say, a baseband processor's firmware is not user accessible.

So being open source is important but not really sufficient, IMHO. You'd actually need to compile and flash the binary code yourself to be highly confident that the software hasn't been tampered with.
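The "compile and flash it yourself" check described above presupposes a reproducible build, i.e. one that produces byte-for-byte identical output from the same sources - which most builds do not without extra effort. Under that assumption, the verification step itself is just a hash comparison; a minimal sketch (the file names are hypothetical):

```python
import hashlib

def sha256_file(path: str) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical usage: compare the vendor's shipped image against your own
# build of the published sources.
# assert sha256_file("vendor-firmware.bin") == sha256_file("my-build/firmware.bin")
```

The hard part is not the comparison but everything upstream of it: deterministic compilers, pinned toolchains, and - as noted above - firmware blobs the user can't even read back off the device.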

A related, yet different problem is knowing that your device hasn't been compromised in transit by means of an "interdiction".

That should be scary stuff even for a seasoned IT professional. Realistically an average user does not have much of a chance at protecting their privacy if governments/vendors/service providers are determined (or coerced) to spy on them.

Edited 2014-07-21 10:32 UTC


leos:

However, the original article of this thread clearly points to vulnerabilities exploitable in iOS devices, which (last time I checked) have hardware and software that is completely closed and proprietary.

Actually they just talk about theoretical vulnerabilities. Also the guy says that iOS security is overall great. Of course Thom didn't mention that in order to make the article more controversial.
