Following the letter from Tim Cook, Apple has now published a set of questions and answers regarding the case of the FBI demanding, via a court order, that Apple create a backdoor into iOS for the FBI to use. Overall, I find the questions and answers a strong showing by Apple, but two parts really stood out to me.
First, the FBI is apparently a little bit incompetent.
One of the strongest suggestions we offered was that they pair the phone to a previously joined network, which would allow them to back up the phone and get the data they are now asking for. Unfortunately, we learned that while the attacker’s iPhone was in FBI custody the Apple ID password associated with the phone was changed. Changing this password meant the phone could no longer access iCloud services.
This is incredibly cringe-worthy. The agency now asking to weaken the security and harm the rights of all iOS users is the same agency that made beginner mistakes like this one. If you are a true cynic, which I am, you might think the FBI changed the password on purpose in order to force this case.
The second part that really stood out to me is also by far the weakest part: Apple seems to be contradicting itself on the question of whether it has unlocked iPhones for law enforcement in the past.
Has Apple unlocked iPhones for law enforcement in the past?
No.
We regularly receive law enforcement requests for information about our customers and their Apple devices. In fact, we have a dedicated team that responds to these requests 24/7. We also provide guidelines on our website for law enforcement agencies so they know exactly what we are able to access and what legal authority we need to see before we can help them.
For devices running the iPhone operating systems prior to iOS 8 and under a lawful court order, we have extracted data from an iPhone.
Emphasis mine.
So, did Apple unlock iPhones in the past, or not? This is a pretty glaring contradiction, and it makes me feel uneasy about Apple’s motives and past and present roles in this case. As with any corporation, of course, Apple is beholden to its shareholders, and if this stance starts to lead to political – and thus, financial – headwinds, shareholders will pipe up, forcing Apple to give in. This contradiction only strengthens this fear for me.
Where is the contradiction? Old iPhones had no encryption so Apple simply dumped memory on request. That’s a bit different than creating backdoors.
Agreed, that was my reaction to the summary too. There isn’t any contradiction. Extracting unencrypted data from a phone, which can be done easily, is not the same as unlocking an encrypted phone for law enforcement.
The question answered was: Has Apple unlocked iPhones for law enforcement in the past? First they say “no”, and then they say “yeah, we did”. Which is it?
And the answer was: no, we have not unlocked phones, but we have dumped the internal storage. What is so complicated to comprehend here? Opening the device and putting an external reader on the storage chip is pretty trivial, like hooking an HDD from one computer up to another to read the data on it.
The whole point here is about _encryption_, and on pre-iOS 8, there was no such thing, so just reading the storage device and dumping its data was enough (and possible for anyone with the appropriate hardware).
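To make the distinction concrete, here is a toy illustration (entirely my own, with a byte string standing in for a raw NAND image) of why an unencrypted dump needs no unlocking at all: “forensics” on it is just a search over the bytes.

```python
# Toy stand-in for dumping an unencrypted flash image: the data is simply
# there in the raw bytes, no passcode required. (Illustrative only.)
raw_dump = b"\x00\x00SMS from Bob: see you at 5\x00\x00photo.jpg\xff\xd8\xff"

print(b"SMS from Bob" in raw_dump)     # True -- no key needed, just a reader
print(raw_dump.find(b"\xff\xd8\xff"))  # JPEG magic bytes located directly
```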
Exactly. This TechCrunch article explains in more detail: http://techcrunch.com/2016/02/18/no-apple-has-not-unlocked-70-iphon…
The difficulty lies in the meaning of “locked”. If I put my dead-tree files in a fireproof safe and lock it, the most obvious way to get at that information is by … unlocking it. If those dead-tree files can be removed while keeping the lock on the safe, well, that’s a terrible design, isn’t it? So if someone tells you they removed some files from your locked safe, the most reasonable assumption is that they unlocked it.
So I totally understand Thom’s misunderstanding. iOS’s lock screen was basically a joke for a long time.
Plus, it’s an absurd statement by Apple. Who cares how you gave the feds the data? It was legal and possible, so why be so obtuse? Furthermore, it’s legal and possible now, so what gives?
Even more understandable in Thom’s case because English isn’t his native language.
It seems to me that Apple is trying to get house points for protecting consumer privacy AND complying with court orders (whenever possible).
You seem to confuse unlocking a phone with getting data from it. Unlocking a phone gives you complete access to it. Getting data from it means dumping the filesystem, which for a non-encrypted phone (probably pre-A7/iOS 8 devices) is a trivial NAND dump.
Once you get to A7 devices (Touch ID), you have a Secure Enclave (Apple-speak for a TPM-like device) that stores the master filesystem encryption keys, is accessible by fingerprint and/or passcode, and sits outside iOS (different firmware, different silicon die). It works like a TPM: after authentication, you ask it to decrypt a part of the filesystem (the header, which contains the encryption keys for the rest of the filesystem), and it decrypts that without ever handing you the actual master encryption key. From there you get the next set of encryption keys, which you actually use to decrypt the data on the flash storage.
If you fail any step you don’t get the decrypted data. You can overwrite it, but you can’t decrypt it.
Summary: the first answer states that they haven’t unlocked any phones (no passcode/fingerprint by-pass) to get to the data.
The second answer says that they dumped the filesystem on some devices (say, an iPhone 3GS on iOS 5), which was not encrypted.
There is no contradiction, there are different answers to different questions.
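For readers who want the mechanics, here is a minimal sketch of that kind of key hierarchy. Everything in it is an assumption for illustration (PBKDF2 standing in for the real entanglement function, XOR standing in for AES key wrapping, invented names); the real Secure Enclave design differs, but the shape is the same: the passcode never touches the data directly, it only unwraps a stored master key, and the derivation is entangled with a device-unique secret that never leaves the hardware.

```python
# Toy sketch of a two-level key hierarchy -- NOT Apple's implementation.
import hashlib
import secrets

DEVICE_UID = secrets.token_bytes(32)      # burned into the enclave, never exported
master_fs_key = secrets.token_bytes(32)   # encrypts the filesystem header keys

def entangled_key(passcode):
    # Slow KDF tying the passcode to this specific device's secret.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), DEVICE_UID, 100_000)

def wrap(key, kek):
    # Toy "wrap": XOR with a pad derived from the key-encryption key.
    pad = hashlib.sha256(kek).digest()
    return bytes(a ^ b for a, b in zip(key, pad))

unwrap = wrap  # XOR is its own inverse

# Enrolment: only the wrapped master key is stored on flash.
stored_blob = wrap(master_fs_key, entangled_key("1234"))

# The right passcode recovers the master key...
assert unwrap(stored_blob, entangled_key("1234")) == master_fs_key
# ...the wrong one yields garbage, and because DEVICE_UID never leaves the
# hardware, you cannot run this guessing loop offline on another machine.
assert unwrap(stored_blob, entangled_key("0000")) != master_fs_key
```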
This is one of the reasons I don’t want our government having any level of access to my data. If they really made beginner mistakes like this with no intent to force the issue, then I’d question whether they can even read any data they do get… and I mean read in the literal sense, that is, being able to comprehend the written word.
Sadly, I do find it believable to some degree that they really are this tech-incompetent. You only need to look at the people leading our country, and research the crazy stuff they’ve said regarding tech subjects, to know that they really do not get it. One particular presidential wannabe actually made it a point of pride that he didn’t use that internet thing. He was proud of it! I don’t know which would be worse: if he really was proud of it, or if he was so out of touch that he thought that’d be a good political angle.
I’m rambling. I guess the short version: I don’t believe they are that incompetent… but I could believe it should it turn out to be the case. It’s not as far-fetched as one would think.
I’m not an iPhone user, so perhaps someone can help me out: why don’t they just change the Apple ID password back to the original value so the phone can then perform the backup?
It is probably powered off. If it were still on, it would also be a lot easier to break into. Disk encryption only protects a device once it is powered down. They need Apple’s help to bypass the passcode check on power-up.
Okay, but I meant changing the password at the ‘cloud’ end rather than the device end. From Apple’s Q&A I got the impression the back up occurs in the background (without having to unlock the device with the PIN).
If the device is powered off, the problem remains: turning the device on. Even if the password had not been changed, that would not change anything. The change of password is not relevant to whether they can access the data on the device or not.
I am guessing the issue is that they theoretically could have kept the phone alive, not changed the password and then have had easier access to synced data.
This isn’t the impression I get from the Q&A, or Apple’s site: https://support.apple.com/en-gb/HT203516
According to this, the device will backup even if it’s locked (in fact *only* when it’s locked). It just needs power and access to a Wi-Fi network.
In this case, turning the device on, leaving it in range of a Wi-Fi network it’s familiar with (as described in the Q&A) and waiting a day would be enough to trigger the backup.
I think we’re confusing the meaning of being locked. The web site you referred to refers to the screen being locked, i.e. when you press the lock button. The lock we’re referring to is the security features of the Apple ID and the device itself.
Fair enough in that case. So you mean turning on the device from cold presents a lock that must be bypassed before the backup can take place? It’s just that this isn’t mentioned in the Q&A as a problem.
I believe the answer lies in “how” they changed his password in the first place.
Unless they had access to his “original” password, they probably used the password-reset function in iCloud.
They just click “I forgot my password”, which sends an email to the registered email account with instructions to reset the account password. The only thing they would have needed is access to his registered email account.
Getting back to your original question, why don’t they change the password back to the “original” password…
I believe the simple answer is “they don’t know what the original password was.”
Yes, that answers my original question; thank you. Nonetheless, even without the original password you might think Apple could patch their servers to allow the phone to authenticate anyway (i.e. for this particular phone, with this particular Apple ID, just set the server to allow it to authenticate using any password/token, say).
The phone would then (possibly) perform its backup, which the FBI could then collect from the server.
The FBI already has the backups up to 19 October, suggesting that once the data is on the server they can access it without needing the password.
“If” the iPhone in question is still set up to automatically back up to iCloud, and “if” it has already tried to do so since the FBI changed the iCloud password, then the server-side hack would more than likely no longer work.
I would assume that once the iPhone has tried to back up to the cloud and received a failed-credential response, it would “mark” the credential as “bad” and prompt for user intervention to enter the new password. Which the FBI would not be able to do, because they are locked out of the phone…
That seems to be a sensible question to try to find an answer to, then: once an iPhone backup fails to log in, does it then give up trying with those credentials? Unfortunately I don’t have an iPhone to test, but if so then I agree this would rule out the approach for the FBI.
Yes, but the backup is of the already encrypted data (it is stored encrypted on the filesystem). They can restore the backups but they also need to decrypt them.
I hope you don’t mind me questioning you on this, but is that from experience? I got the impression from what others had said that the iCloud backup is encrypted separately from the encryption used on the phone. For example, @Carewolf and @darknexus said the phone would only backup if it were in a non-encryption-locked state.
Apple’s support pages state files are backed-up selectively (e.g. photos & videos may be excluded), so it’s not just an entire copy of the encrypted volume.
Also, the FBI have said they already have access to backups up to 19 October but want any newer data, suggesting the backups are of use to them whatever state Apple’s provided them in.
I believe you misunderstood what I was saying, or perhaps I was unclear. I stated that we were getting mixed up by the term “locked”. From my experience, this is what happens:
1. The device is in an encrypted, locked state before identification. This does not apply to devices which do not have the hardware to support this, such as the iPhone 5C. This state of affairs is irrelevant to this case; however, in this state the iPhone will back up to iCloud provided it is connected to power and a known network. Otherwise, backups on these devices would never occur.
2. Locked, unencrypted. See above, as behavior is identical. This applies to devices such as the 5C and below, even with a passcode.
3. Remotely locked. Applies to all devices post iOS 7. In this state a phone has been reported lost or stolen and is security-locked. Only the owner of the Apple ID used to lock the device may unlock it. Backups do not occur and the device is wiped if the owner chooses to do so at the time of the security lock. This lock is server-side and cannot be bypassed even by wiping the operating system.
4. Not authenticated. This state is what has actually happened in this case. The Apple ID password has been invalidated. Backups do not occur and cannot until the iCloud password is reset and entered on to the device. Changing the password back to the old one, even should they be able to do it, is of no help as the keys and tokens used for authentication have been revoked.
All of this is complicated further by the fact that ten failed passcode attempts will wipe the device and invoke state 3. They also cannot simply connect it to an untrusted computer without the passcode. Perhaps they could take the NAND chips out and dump the data; however, I do not know if they have enough information about the controller to accomplish this. Given the furor over the case, I have to assume they are either unable to do this, or uncertain of the results and unwilling to risk possible damage to the data. We can forget about existing iCloud backups, since there aren’t any recent ones, or else we’d probably never have heard about this case. The problem here is that the idiots at the FBI went and reset the Apple ID password without researching what that would do. If they’d just left it alone, they could have brought it in range of a known network and… presto! However, that option is gone now. They got into this via their own incompetence.
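To make those four states concrete, here is a hypothetical sketch. The state names and the backup rule are my reading of the description above, not Apple documentation:

```python
# Hypothetical model of the four device states described above.
from enum import Enum, auto

class PhoneState(Enum):
    LOCKED_ENCRYPTED = auto()   # state 1: locked, data protection active
    LOCKED_PLAIN = auto()       # state 2: locked, no hardware encryption (e.g. 5C)
    REMOTE_LOCKED = auto()      # state 3: reported lost/stolen, server-side lock
    NOT_AUTHENTICATED = auto()  # state 4: Apple ID tokens revoked (this case)

def can_auto_backup(state, on_power, known_wifi):
    # Per the comment: states 1 and 2 back up when powered and on a known
    # network; states 3 and 4 never back up until re-authenticated.
    if state in (PhoneState.REMOTE_LOCKED, PhoneState.NOT_AUTHENTICATED):
        return False
    return on_power and known_wifi

# The FBI's password reset moved the phone into state 4:
print(can_auto_backup(PhoneState.NOT_AUTHENTICATED, True, True))  # False
```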
Thanks for the very detailed explanation. I certainly didn’t mean to misrepresent you; what I wrote earlier was my genuine interpretation of what was written before.
Interestingly, I actually think they had little choice but to change the password. There may have been no other way to prevent someone else who may have had the password from changing things and contaminating the data.
They couldn’t contaminate the data, and the phone might have had time to do one more backup–the important one they would have needed. As for someone else changing the password… what difference did it make? They’re locked out, and would have been no matter who changed the password. They would have discovered this had they done even the tiniest amount of research on Apple’s security mechanisms before jumping the gun, and could have made a more intelligent decision. Instead they got trigger-happy (in the figurative sense anyway) and didn’t ask questions before they shot themselves in the foot. Maybe next time they won’t be so stupid.
Isn’t there some data in the iCloud account that someone could have logged in remotely and changed?
Not really. They might have been able to get at iCloud Drive; however, not many people use that, and it is independent of the device backups. Apple does give options to delete device backups; however, the files don’t seem to be instantly deleted, as I’ve been able to easily undo a deletion an hour or so later. So they had a choice, and it would only have taken five minutes of reading Apple’s own information about the security features to know what would happen. I don’t have the slightest bit of sympathy for them. They thought they’d be able to jump the gun, and that because they’re the big, bad FBI they’d be bailed out of the hole they dug for themselves if anything went wrong. I hope they learn from this and do their research next time, and take a little humility along with it.
What the FBI have asked of Apple is both wrong and a bit crazy in my opinion, so I don’t have sympathy either. However, I can see how they might have got into this situation for valid reasons. The way they’ve responded to it subsequently is a different matter.
I’m not giving Apple much credit either, but if their commercial interests happen to align with customer privacy, then that’s a good thing.
Because the session cookie would be invalidated. The password, plus optional two-factor authentication, is used to establish a connection but is not sent after that. It’s a token exchange, kind of like OAuth. A password change invalidates the token on Apple’s servers, and a new token must then be exchanged and validated regardless of what the new password is.
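As a rough sketch of that flow (my own simplification, not Apple’s actual protocol; all names here are invented), the key point is that the server validates tokens, not passwords, and a password change clears the token set:

```python
# Toy token-exchange server: login yields a token, later requests send only
# the token, and a password change revokes every outstanding token.
import secrets

class AccountServer:
    def __init__(self, password):
        self.password = password
        self.valid_tokens = set()

    def login(self, password):
        if password != self.password:
            return None
        token = secrets.token_hex(16)
        self.valid_tokens.add(token)
        return token

    def request(self, token):
        # The password itself is never re-sent after login.
        return token in self.valid_tokens

    def change_password(self, new_password):
        self.password = new_password
        self.valid_tokens.clear()  # every device must authenticate again

server = AccountServer("original")
phone_token = server.login("original")
assert server.request(phone_token)

server.change_password("fbi-reset")     # what happened in this case
assert not server.request(phone_token)  # the phone is now locked out
# Changing the password back to "original" would not help: the old token
# set is already cleared, and re-authenticating requires typing the
# password on the (locked) phone.
```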
If the token is invalidated on Apple’s servers, it’s hard to see why they can’t just configure the server to accept whatever the device sends. I appreciate it depends on the exact protocol and client behaviour (which I don’t know).
It depends on whether they are even able to do that. As you say, we don’t know the exact details of their protocols nor servers. However I’d expect that doing this would open their servers up to attack. I know I’d not do it for that reason alone, never mind the FBI.
It’s probably public-key-cryptography related, and the ‘cookie’ is probably 1024 bits or more and is a _factor_ of a product of primes.
For example:
The phone has an OS partition, and a user partition.
The OS partition is unencrypted (or encrypted with an on-device key).
The user filesystem begins as unencrypted.
The device always encrypts the user filesystem with prime number A. (A is regenerated every time the user does a factory reset of the phone.)
If no passcode is set, a second number, B, is set to 1.
If a passcode is set, the device does a one-time computation of a randomized prime number B and sends passcode + B to the Apple server. (On systems with TouchID, B is also stored in the TouchID hardware, probably with a hardware self-expiration timer, protected by the fingerprint.)
The product A * B is stored on the device (maybe in the header of the filesystem).
If B is stored in the TouchID, it can be retrieved by either the fingerprint, or the correct passcode. Entering the wrong passcode too many times will erase B from the TouchID.
To unlock the phone, (if B is no longer in the TouchID subsystem), the phone sends the passcode to the Apple server. The Apple server responds with B, the phone computes A = (A * B) / B, and decrypts the phone with A.
The reason this works is that (for now, at least), computing the prime factors of large numbers is hard, so A * B is hard to factor into A or B.
A is never stored anywhere; it is only ever computed as (A * B) / B.
B is stored in the TouchID (fingerprint/passcode protected), or on the Apple server.
What the FBI did by changing the iCloud password was erase the Apple server’s copy of B, so B now exists only in the TouchID.
The FBI wants a way to extract B from the TouchID subsystem. If they get this, they can unlock any iPhone _without_accessing_Apple_servers_.
If the FBI leaks this information, criminals can unlock any iPhone – including those with ApplePay – without accessing Apple’s servers.
Which means, that criminals can steal an iPhone, unlock it, and use the ApplePay information to drain their victim’s bank accounts.
Which Apple has promised to banks “can’t possibly happen”.
This is not a ‘privacy’ or ‘customer relations’ issue for Apple.
This is a future of ApplePay issue.
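For what it’s worth, the arithmetic of that speculative scheme can be sketched in a few lines. To be clear: this is the commenter’s guess, not Apple’s documented design, and the toy values below would be hopelessly insecure in practice.

```python
# Toy arithmetic for the speculative prime-escrow scheme sketched above.
A = 2**127 - 1      # stand-in for device prime A (a known Mersenne prime)
B = 2**61 - 1       # stand-in for escrowed prime B (also prime)

N = A * B           # stored on the device; infeasible to factor if A and B
                    # were large *random* primes (these are not)

# Escrow: B is held by the server (or TouchID), gated by the passcode.
escrow = {"correct-passcode": B}

def unlock(passcode, stored_product):
    b = escrow.get(passcode)
    if b is None:
        return None                  # wrong passcode, or escrow erased
    return stored_product // b       # A = (A * B) / B

assert unlock("correct-passcode", N) == A      # decryption key recovered

# In this model, the FBI's password reset amounts to wiping the escrow:
escrow.clear()
assert unlock("correct-passcode", N) is None   # A is now unreachable via iCloud
```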
So… something to fix in the next release?
Apple could be doing this all along. Maybe they’ve already done it in the past via a FISA warrant and just can’t talk about it.
My point is, if you rely on software for security, and that software can be “upgraded” at any time by the manufacturer, it’s a problem. This is the definition of a back door. They could design their OS so that it has to be unlocked to “upgrade”, but they didn’t…
It may very well be that the FBI had very good reasons to reset the iCloud account’s password (e.g. to stop other terrorists from accessing that account).
Anyway, it is funny to see how this Apple vs FBI case is now about who is more incompetent, especially since in China it is widely claimed that Apple has such a backdoor (even the official media in China have stated this). Apple plays hardball with the FBI but not with China!
Ah. You’re one of those “terrorists are everywhere” types. Gotcha. And if they took the time to go screwing with passwords without taking the time to make sure it wouldn’t mess things up, then they deserve to have no access. They fucked up; now they have to face the consequences. Sucks to be brought back down to mortal level, doesn’t it?
Please, could you point to where I stated that “terrorists are everywhere”?
My point is that the backdoor exists already and China has it but the FBI does not have it!
This is not about who screwed it up! That is irrelevant!
If you were REALLY cynical (like me)… you could say this whole charade was put on to give people a bit of a false sense of security wrt cloud services.
There’s something I don’t get.
Apple says:
“One of the strongest suggestions we offered was that they pair the phone to a previously joined network, which would allow them to back up the phone and get the data they are now asking for.”
Does that mean Apple would have given the FBI access to iCloud data belonging to the San Bernardino shooter? Is there no problem with that?
In other words, is there no problem with Apple, Facebook, Google, et al. providing the FBI with access to private data they already have? But we do have a problem with Apple breaking into a phone it already could break into.
You have to keep that in mind. Apple CAN break into that phone. Apple cannot break into devices with a Secure Enclave (or so says some article I read from a security expert), but Apple CAN break into this particular phone. They can. So I wonder what would happen if it were in Apple’s absolute best interest to break into that phone…
see above for the post detailing why this is about the future of apple pay, and iOS entirely.
if the FBI is allowed to crack into that phone, through the encryption, that means anyone in the world could if the FBI has a leak. government agencies are known to leak, especially something that valuable.
once that secret is out the entire security apparatus of the iOS, touch ID, and apple pay is destroyed. any iphone can be stolen and the person’s bank account drained with that sort of power.
it would be a disaster. apple’s products would take a major hit, apple’s stock would take a major hit, and their differentiation from android would be gone. [notice how very few android people are complaining about the FBI or sticking up for apple, because most of them don’t have the expectation of security in the first place]. apple sells on that promise of no advertisers, no big brother, no boss/admin. manage it and lock it down for yourself, if you wish.
these people turned on all the security on their work-supplied iphone – and then committed horrible acts. that doesn’t justify greatly damaging the entire iOS ecosystem – it’s just not worth it for a single crime investigation. if those people used the internet for anything, there are traces of it; they don’t really need that phone.
excellent reporting, Thom. i can be critical, i know, but well done. please continue to follow this story.
the cynic in me says it’s a show too. i don’t know about china, but i doubt that the FBI really needs into that phone to crack the case.
If ordered by a duly empowered tribunal, in defense of a single individual: that is a fundamental of libertarianism. It goes beyond democracy.
It should be noted that the Bureau is acting on behalf of individuals no longer able to speak for themselves. That is the nature of the case, and it should not be hindered by the interests of the parties at stake.
I believe I’ve read elsewhere that Apple did give the FBI the most recent copy of the iCloud backup that they had, but it was 6 weeks old. Apple were hoping that, if the iPhone connected to a known WiFi network and was powered, it would perform an automatic iCloud backup (assuming that feature was still enabled, which no-one appears to know for sure). Apple would then be in a position to give the FBI an up-to-date iCloud backup (which would still provide less information than complete access to the phone).
“ALL DATA” on cloud based services whether it is FACEBOOK, GOOGLE, or ICLOUD is subject to seizure by law enforcement agencies with the proper warrant. It may be your “private” data, but if it is on servers they own, they “own” that data, and as such they are required by law to turn over said data upon a judges order.
The issue here is that Apple has been ordered to create a method to bypass security protocols put in place on devices they do not own: to create a backdoor to data they do not own. A TOTALLY different issue.
tomoki76jp,
Actually, the government does own the phone and data in question, since it was government-issued. For better or worse, Apple is saying ‘no’ despite authorization from the legitimate owner.
That makes sense…
I’m seriously a bit unsure as to what side to pick. On the one hand, I understand the issue of privacy and the importance of encryption. On the other hand, I can imagine many scenarios in which a backdoor would make sense. Let’s see how this unravels.
Since there’s a warrant Apple would turn over the data, if they had it. Undoubtedly the phone and data owner would voluntarily turn over the data if they had it. And the FBI has every legal right to the data, if they can get it.
The ‘problem’ is that Apple purposefully created a system in which no one can get the data, not even the legal owner.
I’ll be curious to learn how this all turns out.
The phone and all of the data on it are the property of San Bernardino County. It was not the shooter’s personal phone and it is not the shooter’s private data.
And that portion of the story is moot anyway because the FBI has a search warrant signed by a real live judge with jurisdiction.
The legal argument is whether or not the lower court can force Apple to write new software or create new products to the FBI’s specifications.
In cases of murder, with a dead suspect who cannot be questioned, their iPhone should be allowed to be permanently unlocked as evidence.
This is not an invasion of privacy. The main suspects/perpetrators/alleged killers are dead and cannot be questioned.
Read the issue more carefully. It’s not just about having access to this one phone. It’s about forcing Apple to make a back door into everyone’s phone and that most definitely is an invasion of privacy. If all the FBI were asking is data from this one phone, I don’t think any but the most paranoid would be concerned. It’s the method behind what they’re asking, and what it means for the rest of us, which has us shaking our heads.
darknexus,
A real “backdoor” would be something that apple was forced to install at the FBI’s request onto everyone’s iphone, which would be lurking there waiting for law authorities to use. But that’s not at all what’s going on and is not at all what the FBI requested. They got a court order asking apple for help with automation, that’s it. If we are all honest about it, this is something that an agency or hacking group with hardware hacking capabilities would be able to do without any help from apple. This is not creating a new risk to privacy that isn’t already there.
So why ask apple for help? If recent events are anything to go by, it seems obvious that the FBI are not hackers, unlike, say, the NSA and GCHQ. It’s also long been known that government agencies don’t like working together, due to politics. Nevertheless, it’s a good bet that the NSA will have an ANT catalog entry for this, if it doesn’t already.
1) Per previous article, apple is technically able to comply with the FBI’s request.
2) If automation does not pose a threat, then apple has no reason not to cooperate with the FBI’s request.
3) The fact that apple is making such a stink about cooperating is an implicit acknowledgement that they are afraid it could work.
I feel like apple is going into damage-control mode in case the FBI does break into the phone. It does a great job shifting attention away from the fact that apple’s crypto didn’t protect the phone as they claimed it would, and towards this idea that if the FBI breaks in it’s because they cheated.
But I say a breach is a breach. Say one of those digital safes we see in hotels could be reset with a good pound from a hammer dislodging the battery long enough to reboot the processor. “oh but you weren’t supposed to do that”. Well obviously not, but it’s exactly how hackers break into things, they *will* cheat.
The real problem here is not some FBI backdoor, it’s weak passwords. I’ve said it before, but this message needs to come from apple themselves: no matter how good the crypto is, it can’t compensate for weak authentication methods (ie simple passwords).
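A back-of-the-envelope calculation makes the point. The ~80 ms per guess figure below is an assumption, roughly the per-attempt cost iOS’s key derivation is said to be tuned to; everything else is arithmetic showing that keyspace, not crypto strength, is the limit:

```python
# Worst-case on-device brute-force time, assuming ~80 ms per passcode guess.
SECONDS_PER_GUESS = 0.08

def worst_case_days(keyspace):
    return keyspace * SECONDS_PER_GUESS / 86_400

print("4-digit PIN :", round(worst_case_days(10**4), 3), "days")  # ~0.009 days
print("6-digit PIN :", round(worst_case_days(10**6), 2), "days")  # ~0.93 days
print("8-char alnum:", round(worst_case_days(62**8)), "days")     # ~200 million
                                                                  # days: millennia
```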
They’re asking that Apple create a custom iOS that can be installed on any phone in their possession, not that Apple unlock the data on this one phone. Again, read more carefully. If they’d asked to simply have the data from this one phone, I doubt anyone would even be fighting them. We’d never even have heard about this case, for that matter.
darknexus,
Note specifically that none of these three requirements requires or implies that apple must include a mechanism for copying the SIF from RAM. It seems to me that apple can technically comply fully with this court order without giving the FBI tools to break into other phones.
If you still think I’m wrong, then please quote the exact piece you are referring to and the exact process by which the FBI would obtain these tools such that apple couldn’t retain control. Fair enough?
I’m a strong advocate of consumer privacy, and I fully agree with others who say this would establish a precedent – we need to have that debate. But I think there’s a lot of misinformation being spread and it’s not helping the cause.
Certainly Apple, or anyone else, could build a consumer OS that they themselves are unable to access, without asking for (equally unethical) help further down the hardware stack.
Why would they? Operating systems, if only by their sheer size, are imperfect. Such a vital investment is going to be hacked sooner or later.
In order to oversee, repair, and protect that infrastructure from malware at the very bottom of the stack, a door has to exist there.
Merely claiming no such door exists is a distraction from the main issue, which is the legislative limbo.
dionicio,
Ideally this process would be done in the open, where updates could be audited. The fact that one company holds the keys to hundreds of millions of devices troubles me. Even if their intentions were 100% innocent, which is frankly more credit than any of them deserve, I still think the single point of failure and control makes it a bad idea. This is technically solvable, but it seems unlikely that any of the big three (google, apple, microsoft) would be willing to relinquish control over our devices for the public good. Once again it boils down to greed.
Are you saying, Phloptical, that once I’m gone, I have no right to take with me my personal doubts, my not-so-flattering photos, my recorded karaoke from that party nobody would like to remember, my dampened screams to my most beloved friends, annotated in my personal notes?
Their Smart-Phone is their Personal Computer.
“He called for a healthy debate on the issue and said that we should strike a balance between privacy rights and legitimate security concerns. Gates also noted that the government has historically abused its powers, citing the case of former FBI director J. Edgar Hoover.”
http://www.fastcompany.com/3057046/fast-feed/bill-gates-sides-with-…
“The intent of this [digital] constitution is to help guide policy creation, broker compromise and serve as the foundation for decision making around cyber security issues.”
Digital is an environment so alien to current legislative perception of the world that this is a crying need.
Just as a footnote: the members list is still far, far away from that touted ‘middle’ ground.
http://www.fastcompany.com/3057065/amid-apples-feud-with-the-govern…
Hopefully somebody finds a referral to this note by Steven Melendez at some Slower Company.
“The result was years of expansion of RIPA powers, to the point where powers originally intended for the intelligence services were delegated to over four hundred public bodies. Even the head of MI5, Lady Manningham-Buller, who lobbied for the RIPA powers, was shocked by the eventual overreach:
‘I can remember being astonished to read that organizations such as the Milk Marketing Board, and whatever the equivalent is for eggs, would have access to some of the techniques’.
https://www.eff.org/deeplinks/2016/02/ipb-loopholes-within-loopholes
There’s no contradiction; extracting is different from unlocking.
Something the Feds have not commented on: mobiles and their ecosystem evolved [or were managed to evolve] so that they became addictive to use.
This works tremendously in favor of this industry, and against adequate oversight, as is the case for drugs.
Any proposed change that reduces confidence in this supply is going to be resisted by consumers.
……..
No other technology had become so personal, also.
What an agent issuer is able to know about their captured users is unprecedented. Not even the Stasi had this level of access to an individual’s privacy. [I’m talking about the agent issuers, you know. The ‘cloud OS’ managers.]
The State, up until now, requested a seat in that cloud when necessary, then when convenient, then when it wished.
Apple’s ethics [I hope] have reached a limit.
In doing so, Apple [and not the FBI] is framing the perfect storm for a long-overdue advance in privacy legislation.
The FBI, being a more civilian entity, could perceive a benefit in this, too.