Apple Archive

One bad Apple

Dr. Neal Krawetz, one of the leading experts in computer forensics research, digital photo analysis, and related topics, has penned a blog post in which he takes apart Apple’s recent announcement and the technology behind it. He has a lot of experience with the very problem Apple is trying to deal with: he is the creator of FotoForensics, and files CSAM reports with the National Center for Missing and Exploited Children (NCMEC) every day. In fact, he files more reports than Apple, and knows the ins and outs of all the technologies involved – including reverse-engineering Microsoft’s PhotoDNA, the perceptual hash algorithm NCMEC and Apple are using. The reason he had to reverse-engineer PhotoDNA is that NCMEC refused to countersign the NDAs they wanted Krawetz to sign, eventually not responding to his requests altogether. Krawetz is one of the more prolific reporters of CSAM material (number 40 out of 168 in total in 2020).

According to him, PhotoDNA is not as sophisticated as Apple’s and Microsoft’s documentation and claims make it out to be:

Perhaps there is a reason that they don’t want really technical people looking at PhotoDNA. Microsoft says that the “PhotoDNA hash is not reversible”. That’s not true. PhotoDNA hashes can be projected into a 26×26 grayscale image that is only a little blurry. 26×26 is larger than most desktop icons; it’s enough detail to recognize people and objects. Reversing a PhotoDNA hash is no more complicated than solving a 26×26 Sudoku puzzle; a task well-suited for computers.

The other major component of Apple’s system, an AI perceptual hash called a NeuralHash, is problematic too. The experts Apple cites have zero background in privacy or law, and while Apple’s whitepaper is “overly technical”, it “doesn’t give enough information for someone to confirm the implementation”. Furthermore, Krawetz “calls bullshit” on Apple’s claim that there is a 1 in 1 trillion error rate. After a detailed analysis of the numbers involved, he concludes:

What is the real error rate? We don’t know. Apple doesn’t seem to know. And since they don’t know, they appear to have just thrown out a really big number. As far as I can tell, Apple’s claim of “1 in 1 trillion” is a baseless estimate. In this regard, Apple has provided misleading support for their algorithm and misleading accuracy rates.

Krawetz also takes aim at the step where Apple manually reviews possible CP material by sending it from the device in question to Apple itself. After discussing this with his attorney, he concludes:

The laws related to CSAM are very explicit. 18 U.S. Code § 2252 states that knowingly transferring CSAM material is a felony. (The only exception, in 2258A, is when it is reported to NCMEC.) In this case, Apple has a very strong reason to believe they are transferring CSAM material, and they are sending it to Apple — not NCMEC. It does not matter that Apple will then check it and forward it to NCMEC. 18 U.S.C. § 2258A is specific: the data can only be sent to NCMEC. (With 2258A, it is illegal for a service provider to turn over CP photos to the police or the FBI; you can only send it to NCMEC. Then NCMEC will contact the police or FBI.) What Apple has detailed is the intentional distribution (to Apple), collection (at Apple), and access (viewing at Apple) of material that they strongly have reason to believe is CSAM. As it was explained to me by my attorney, that is a felony.
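To make the reversibility point concrete, here is a minimal sketch of how a downscale-and-threshold perceptual hash works, and how its bits project straight back into a crude thumbnail. This is an average-hash style toy, not PhotoDNA (whose internals are not public); the 26×26 grid size is borrowed from Krawetz’s description, the binary output is cruder than the grayscale projection he describes, and the file names are placeholders.

```python
# Toy average-hash style perceptual hash -- NOT PhotoDNA. It only
# illustrates the general idea: a compact hash of a downscaled image
# still encodes a crude, recognizable thumbnail.
from PIL import Image
import numpy as np

GRID = 26  # grid size taken from Krawetz's 26x26 description

def perceptual_hash(img: Image.Image) -> np.ndarray:
    """Downscale to GRIDxGRID grayscale, threshold against the mean."""
    small = img.convert("L").resize((GRID, GRID), Image.LANCZOS)
    pixels = np.asarray(small, dtype=np.float32)
    return (pixels > pixels.mean()).astype(np.uint8)  # GRID*GRID bits

def project_back(bits: np.ndarray, scale: int = 10) -> Image.Image:
    """'Reverse' the hash: each bit becomes a black or white cell,
    yielding a blocky but often recognizable outline of the original."""
    img = Image.fromarray(bits * 255)
    return img.resize((GRID * scale, GRID * scale), Image.NEAREST)

if __name__ == "__main__":
    photo = Image.open("example.jpg")          # placeholder test image
    bits = perceptual_hash(photo)
    project_back(bits).save("projected.png")   # blurry but legible
```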
This whole thing looks, feels, and smells like a terribly designed system that is not only prone to errors, but also easily exploitable by people and governments with bad intentions. It also seems to be highly illegal, making one wonder why Apple would put this out in the first place. Earlier in his article, Krawetz hints at why Apple is building this system:

Apple’s devices rename pictures in a way that is very distinct. (Filename ballistics spots it really well.) Based on the number of reports that I’ve submitted to NCMEC, where the image appears to have touched Apple’s devices or services, I think that Apple has a very large CP/CSAM problem.

I think this might be the real reason Apple is building this system.

Apple: critics of continuous iPhone photo scanning are “screeching voices of the minority”

Apple also addressed the hypothetical possibility of a particular region in the world deciding to corrupt a safety organization in an attempt to abuse the system, noting that the system’s first layer of protection is an undisclosed threshold before a user is flagged for having inappropriate imagery. Even if the threshold is exceeded, Apple said its manual review process would serve as an additional barrier and confirm the absence of known CSAM imagery; in that case, Apple would not report the flagged user to NCMEC or law enforcement agencies, and the system would still be working exactly as designed.

After yesterday’s news and today’s responses from experts, here’s a recap: Apple is going to scan all photos on every iPhone to see if any of them match against a dataset of photos – that Apple itself hasn’t verified – given to them by the authorities of countries in which this is rolled out, with final checks being done by (third-party) reviewers who are most likely traumatized, overworked, underpaid, and easily infiltrated. What could possibly go wrong?

Today, Apple sent out an internal memo to Apple employees about this new scanning system. In it, they added a statement by Marita Rodriguez, executive director of strategic partnerships at the National Center for Missing and Exploited Children, and one of the choice quotes:

I know it’s been a long day and that many of you probably haven’t slept in 24 hours. We know that the days to come will be filled with the screeching voices of the minority.

Apple signed off on that quote. They think those of us worried about invasive technologies like this – and the power backdoors like this would give to totalitarian regimes all over the world – are the “screeching voices of the minority”. No wonder this company enjoys working with the most brutal regimes in the world.
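Apple’s published technical summary describes this first layer as a form of threshold secret sharing: the safety vouchers uploaded alongside photos reportedly cannot be opened until an account produces enough of them. As a rough illustration of the underlying idea only – a generic Shamir-style t-of-n sketch, not Apple’s actual construction, with all parameters invented for the example:

```python
# Generic Shamir secret sharing sketch. The secret (think: a decryption
# key) is recoverable only once `threshold` shares exist; with fewer,
# nothing is learned. This is NOT Apple's code -- just the family of
# scheme their published design describes for the match threshold.
import random

PRIME = 2**127 - 1  # a Mersenne prime, large enough for a toy key

def make_shares(secret: int, threshold: int, count: int):
    """Split `secret` into `count` shares; any `threshold` recover it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x):  # evaluate the random polynomial at x
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, count + 1)]

def recover(shares):
    """Lagrange interpolation at x=0 over the prime field."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

key = random.randrange(PRIME)
shares = make_shares(key, threshold=10, count=30)
assert recover(shares[:10]) == key   # 10 shares: key recovered
assert recover(shares[:9]) != key    # 9 shares: (almost surely) garbage
```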

An open letter against Apple’s privacy-invasive content scanning technology

A large number of security and privacy experts, legal experts, and more, in an open letter to Apple: On August 5th, 2021, Apple Inc. announced new technological measures meant to apply across virtually all of its devices under the umbrella of “Expanded Protections for Children”. While child exploitation is a serious problem, and while efforts to combat it are almost unquestionably well-intentioned, Apple’s proposal introduces a backdoor that threatens to undermine fundamental privacy protections for all users of Apple products. The open letter contains tons of arguments, scenarios, and examples from experts about just how bad this technology is, and just how easily it can be abused.

The problem with perceptual hashes

Oliver Kuederle, who works with the same kind of image hashing technology that Apple’s new system will use to continuously scan the photos on your iOS device, explains that it is far, far from foolproof:

Perceptual hashes are messy. The simple fact that image data is reduced to a small number of bits leads to collisions and therefore false positives. When such algorithms are used to detect criminal activities, especially at Apple scale, many innocent people can potentially face serious problems. My company’s customers are slightly inconvenienced by the failures of perceptual hashes (we have a UI in place that lets them make manual corrections). But when it comes to CSAM detection and its failure potential, that’s a whole different ball game. Needless to say, I’m quite worried about this.

This is just one of the many, many problems with what Apple announced yesterday.
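To see how little it takes to produce a collision, here is a toy demonstration – a simple average hash over block means, not Apple’s NeuralHash, and all of it invented for illustration. Two images that look nothing alike (a smooth gradient and a block-scrambled version of it) reduce to exactly the same hash, because the hash only ever sees block averages.

```python
# Toy demonstration of perceptual hash collisions. This is a simple
# average hash, not NeuralHash, but the failure mode -- many images
# collapsing onto the same few bits -- is generic to the approach.
import numpy as np
from PIL import Image

def average_hash(img: Image.Image, grid: int = 8) -> int:
    """Downscale via block averaging (BOX filter), threshold on the mean."""
    small = np.asarray(img.convert("L").resize((grid, grid), Image.BOX),
                       dtype=np.float32)
    return int("".join("1" if p else "0" for p in (small > small.mean()).flat), 2)

def hamming(a: int, b: int) -> int:
    return bin(a ^ b).count("1")

# Image A: a smooth 256x256 diagonal gradient.
base = np.fromfunction(lambda y, x: (x + y) / 2, (256, 256)).astype(np.uint8)
img_a = Image.fromarray(base)

# Image B: shuffle the pixels *within* each 32x32 block. The result looks
# like blocky noise, but every block keeps its mean -- and the mean is all
# the 8x8 BOX downscale (and therefore the hash) ever sees.
rng = np.random.default_rng(0)
blocks = base.reshape(8, 32, 8, 32).swapaxes(1, 2).reshape(64, 32 * 32).copy()
for block in blocks:
    rng.shuffle(block)
img_b = Image.fromarray(
    blocks.reshape(8, 8, 32, 32).swapaxes(1, 2).reshape(256, 256))

print(hamming(average_hash(img_a), average_hash(img_b)))  # 0 -- a collision
```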

Apple’s plan to “Think Different” about encryption opens a backdoor to your private life

Apple has announced impending changes to its operating systems that include new “protections for children” features in iCloud and iMessage. If you’ve spent any time following the Crypto Wars, you know what this means: Apple is planning to build a backdoor into its data storage system and its messaging system. Child exploitation is a serious problem, and Apple isn’t the first tech company to bend its privacy-protective stance in an attempt to combat it. But that choice will come at a high price for overall user privacy. Apple can explain at length how its technical implementation will preserve privacy and security in its proposed backdoor, but at the end of the day, even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor.

Basically, Apple is going to scan your iCloud photo library, and compare perceptual hashes of your photos to those of known photos containing child pornography. It’s hard to argue against this, because it makes it seem as if you’re arguing against catching the sort of people that have such material. However, the issue with tools like this is not the ends – all of us are on the same side here – but the means.

It’s more than obvious that this scanning is a gross invasion of privacy, but at the same time, you could easily argue that this is a bit of privacy we’d be willing to give up in order to aid in catching the worst elements of our society. The real problems stem from the fact that tools like this are simply never going to be foolproof. Software is incredibly unreliable, and while a random application crashing won’t ruin your life, an algorithm wrongfully labeling you as a pedophile most definitely will. On top of unintended consequences, malicious intent could be a major problem here too – what if some asshole wants to ruin your life, and sends you compromised photos, or otherwise sneaks them onto your device? And with Apple’s long history of working very closely with the most horrid regimes in the world, imagine what governments could do with a tool like this.

On the ends that Apple is trying to get to here, we are all on the same side. The means to get there, however, need to be carefully considered.
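Stripped of the cryptography (Apple’s actual protocol wraps the comparison in a private set intersection, so raw matches below the threshold are not visible), the core of any such scanner is a loop like this hypothetical sketch. The distance tolerance and the blocklist contents here are invented – and they are exactly the knobs a bad actor would turn:

```python
# Hypothetical sketch of hash-based scanning, with all cryptographic
# machinery omitted: every photo's perceptual hash is compared against
# a database the device's owner can neither inspect nor audit.
from typing import Iterable

MATCH_DISTANCE = 6  # invented tolerance; real thresholds are not public

def hamming(a: int, b: int) -> int:
    return bin(a ^ b).count("1")

def scan_library(photo_hashes: Iterable[int], blocklist: set[int]) -> int:
    """Count photos whose hash lands near any blocklisted hash."""
    flagged = 0
    for h in photo_hashes:
        if any(hamming(h, bad) <= MATCH_DISTANCE for bad in blocklist):
            flagged += 1
    return flagged

# Example: the first photo falls within tolerance of a blocklist entry.
blocklist = {0b1011_0110, 0b0001_1101}
print(scan_library([0b1011_0111, 0b1111_1111_0000], blocklist))  # -> 1
```

The structural worry the letter raises lives in that `blocklist` argument: whoever controls its contents controls what gets flagged, and near-match tolerance means innocent photos can land within range.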

Amazon just got Fakespot booted off Apple’s iOS App Store

Fakespot, known for its web browser extensions that try to weed out fake product reviews, suddenly no longer has an iPhone or iPad app — because Amazon sent Apple a takedown request, both Amazon and Fakespot confirm, and Apple decided to remove the app. The giant retailer says it was concerned about how a new update to the Fakespot app was “wrapping” its website without permission, and how that could be theoretically exploited to steal Amazon customer data. But Fakespot founder Saoud Khalifah tells The Verge that Apple abruptly removed the app today without any explanation. Apple didn’t respond to multiple requests for comment. Two abusive monopolists walk into a bar.

How Universal Control on macOS Monterey works

The best moment of this year’s WWDC keynote was a straightforward demo of a macOS feature, Universal Control. The idea is simple enough: it allows you to use the keyboard and trackpad on a Mac to directly control an iPad, and even makes it simple to drag and drop content between those devices. What made the demo so impressive is how easy and seamless it all seemed. In a classic Apple move, there was no setup required at all. The segment happened so fast that it even seemed (incorrectly, as it turns out) like the Mac was able to physically locate the iPad in space so it knew where to put the mouse pointer.

I mean, none of this stuff is new or technologically impressive, but as usual, Apple manages to make it easy and intuitive, and to make it look and feel nice. I’d love to have something as straightforward and integrated as this on Linux.

Apple will not roll out new privacy features in China and several other countries

Yesterday, during the Apple event, the company, as always, kept talking about how it values privacy, and how privacy is a “fundamental human right”. A noble statement, of course, but it seems Apple does not consider people from China, Belarus, Colombia, Egypt, Kazakhstan, Saudi Arabia, South Africa, Turkmenistan, Uganda, and the Philippines to be “humans”, because fundamental, tentpole privacy features announced yesterday will not be available to the humans living in those countries.

Apple on Monday said a new “private relay” feature designed to obscure a user’s web browsing behavior from internet service providers and advertisers will not be available in China for regulatory reasons. The feature was one of a number of privacy protections Apple announced at its annual software developer conference on Monday, the latest in a years-long effort by the company to cut down on the tracking of its users by advertisers and other third parties.

Privacy is a “fundamental human right”, but apparently not as fundamental as Apple’s right to make even more money.

Apple unveils macOS 12, iOS 15, iPadOS 15

Apple previewed macOS 12, iOS 15, and iPadOS 15 yesterday. From MacRumors, one of the few remaining truly good Apple news websites:

Apple today announced macOS 12, which it’s calling macOS Monterey. The new version of macOS is gaining features like Universal Control, AirPlay to Mac, and Shortcuts for Mac. Apple said that macOS Monterey’s updates will help users get more done and work more fluidly across Apple devices.

And iOS 15:

Apple today previewed iOS 15, the company’s next major update for the iPhone, featuring new video calling capabilities, improvements to Messages, user statuses, a smart notification summary, and more.

And iPadOS 15:

Apple today unveiled iPadOS 15, its next-generation operating system for iPad that introduces a slew of new features like widgets on the Home Screen, an iPhone-style App Library, new multi-tasking features, and more. Here’s a rundown of what to expect.

There are no major tentpole features or drastic overhauls – instead, there are a lot of smaller features and new additions that really do add up to what seem like three pretty major operating system releases. There should be something for everybody in here, but I do wonder which maniac approved the new tab bar design in Safari, because that behaviour should be a crime against humanity.

Apple’s tightly controlled App Store is teeming with scams

Of the 1,000 highest-grossing apps on the App Store, nearly two percent are scams, according to an analysis by The Washington Post. And those apps have bilked consumers out of an estimated $48 million during the time they’ve been on the App Store, according to market research firm Appfigures. The scale of the problem has never before been reported. What’s more, Apple profits from these apps because it takes a cut of up to 30 percent of all revenue generated through the App Store. Even more common, according to The Post’s analysis, are “fleeceware” apps that use inauthentic customer reviews to move up in the App Store rankings and give apps a sense of legitimacy to convince customers to pay higher prices for a service usually offered elsewhere with higher legitimate customer reviews.

Apple likes to claim the App Store is needed to keep people safe, but that simply is a flat-out lie. The App Store is filled to the brim not only with obvious scams, but also a whole boatload of gambling applications designed specifically to trick children into spending money. In fact, these “games” make up a huge proportion of the App Store’s revenue. Apple earns top dollar from every scam or disturbing gambling app on the App Store, so there’s a huge conflict of interest here that in and of itself should be enough reason to take control over iOS away from Apple. iOS users should have the freedom to install and use an application store that does not prey on their children and does not promote scams.

Censorship, surveillance and profits: a hard bargain for Apple in China

Blockbuster report by The New York Times on Apple and Tim Cook gladly making endless concessions to please the Chinese government. Nothing in here is really new to most of us, but it’s startling to see it laid out in such detail, and sourced so well. For instance, when it comes to Chinese people, privacy is apparently no longer a “fundamental human right”:

Inside, Apple was preparing to store the personal data of its Chinese customers on computer servers run by a state-owned Chinese firm. Tim Cook, Apple’s chief executive, has said the data is safe. But at the data center in Guiyang, which Apple hoped would be completed by next month, and another in the Inner Mongolia region, Apple has largely ceded control to the Chinese government. Chinese state employees physically manage the computers. Apple abandoned the encryption technology it used elsewhere after China would not allow it. And the digital keys that unlock information on those computers are stored in the data centers they’re meant to secure.

This means zero privacy for Chinese Apple users, as Apple has pretty much ceded all control over this data to the Chinese government – so much so that Apple’s employees aren’t even in the building, and Apple no longer has the encryption keys either. On top of this, it turns out Apple is so scared of offending the Chinese government that the company proactively censors applications and other content in the Chinese version of the App Store, removing, censoring, and blocking content even before the Chinese government asks for it.

“Apple has become a cog in the censorship machine that presents a government-controlled version of the internet,” said Nicholas Bequelin, Asia director for Amnesty International, the human rights group. “If you look at the behavior of the Chinese government, you don’t see any resistance from Apple — no history of standing up for the principles that Apple claims to be so attached to.”

Apple even fired an App Store reviewer because the reviewer approved an application that, while not breaking a single rule, offended the Chinese government. That is how far Apple is willing to go to please its Chinese government friends. Apple isn’t merely beholden to China – it’s deeply, deeply afraid of China. How many more concessions is Tim Cook willing to make, and how many more Chinese rings is he willing to kiss?

iOS 14.5, macOS 11.3 released

iOS 14.5 is a major update with a long list of new features, including the ability to unlock an iPhone with an Apple Watch, 5G support for dual-SIM users, new emoji characters, an option to select a preferred music service to use with Siri, crowdsourced data collection for Apple Maps accidents, AirPlay 2 support for Fitness+, and much more. The update also introduces support for AirTags and Precision Finding on the iPhone 12 models, and it marks the official introduction of App Tracking Transparency. There is also a long list of bug fixes, with Apple addressing everything from AirPods switching issues to the green tint that some users saw on iPhone 12 models.

A big update for such a small version number, and a lot of good stuff in there. Apple also released macOS Big Sur 11.3, which is a smaller update than the iOS one, but still contains some nice additions, such as better touch integration for running iOS apps on the Mac and improved support for game controllers.

Apple announces new iMac with M1 chip and seven color options

Apple has announced a new, redesigned 24-inch iMac, featuring an M1 chip, a 4.5K display, and a range of color options, as well as an improved cooling system, front-facing camera, speaker system, microphones, power connector, and peripherals.

These look pretty good, but they come with the same limitations as all the other, essentially identical M1 Macs – 8 GB of RAM standard with a maximum of a mere 16 GB, a lacklustre graphics chip, no high refresh rate displays (in 2021!), barely any ports, zero expandability, and Linux/BSD support that will always remain problematic and years behind the curve. Good processor, but at what cost?

Discord will block NSFW servers on iOS

Entire servers can now be marked as NSFW if their community “is organized around NSFW themes or if the majority of the server’s content is 18+.” This label will be a requirement going forward, and Discord will proactively mark servers as NSFW if they fail to self-identify. Discord previously allowed individual channels to be marked as NSFW and age-gated. The NSFW marker does two things. First, it prevents anyone under the age of 18 from joining. But the bigger limitation is that it prevents NSFW servers from being accessed on iOS devices — a significant restriction that’s almost certainly meant to cater to Apple’s strict and often prudish rules around nudity in services distributed through the App Store. Tumblr infamously wiped porn from its entire platform in order to come into compliance with Apple’s rules.

There are two things happening here. First, there are the tighter restrictions by Discord, which I think are reasonable – you don’t want minors, or adults who simply aren’t interested in rowdier conversations, to accidentally walk into channels where people are discussing sex, nudity, or porn. Labeling these channels as such is, while not a panacea, an understandable move, also from a legal standpoint. I still think sex and nudity are far, far, far less damaging or worrisome than the insane amounts of brutal violence children get exposed to in movies, TV series, games, and the evening news, but I understand American culture sees these things differently.

Then there are Apple’s demands placed on Discord. This is an absolutely bizarre move by Apple on so many levels. First, the line between porn and mere nudity is often vague and nebulous, such as in paintings or other forms of art. This could be hugely impactful to art communities sharing the things they work on. Second, Discord is primarily a platform for close-knit groups of friends, and if everyone in your friend group is over 18, there are going to be discussions about sex, nudity, porn, and other things adults tend to talk about from time to time, just as there are in real life. Neither of these – art and casual conversation – is criminal, bad, or negative in any way. And third, and this is the big one, these restrictions Apple is placing on Discord do not apply to Apple’s own applications. iMessage serves much the same function as Discord does, yet there are no NSFW markers, 18+ warnings, or bans on such content on iMessage. Other platforms, such as Facebook, Twitter, and the damn web through Safari, provide access to far vaster collections of the most degenerate pornography mankind ever wrought, and yet Apple isn’t banning them either.

This just goes to show, once again, that your iPhone isn’t really yours. Apple decides how you get to use it, and you’re merely along for the $1000 ride. Android may have its problems, but at least I don’t have Tim Cook peeping over my shoulder to see if I’m looking at something he deems lewd.

Apple’s cooperation with authoritarian governments

Over the past few years, Apple seems increasingly willing to cooperate with authoritarian governments, uninterested in protecting its own users, and unwilling to actually stand up for human rights in broad terms, as often portrayed by its marketing department or direct statements from CEO Tim Cook. The company is quick to position itself as a prominent human rights advocate in the corporate world, especially regarding issues like user privacy and security. However, as Ole Begemann has aptly pointed out, this is increasingly disingenuous to the point of deliberately deceiving its customers and the general public. There are even (unconfirmed) reports that the lack of end-to-end encryption that Ole criticizes is actually due to willful coordination and cooperation with the FBI. And like most companies in the industry, Apple employs a highly problematic supply chain, which makes its human rights crusade seem even less authentic.

A good overview of Apple’s and Tim Cook’s incredibly close ties with genocidal, totalitarian regimes, and of how the company seems to have zero issues selling out its users as long as they’re not in the west. I guess for Apple and Tim Cook, western lives simply matter more.

Apple agrees to offer government-approved pre-installed apps for devices in Russia

According to the report, citing a source within the Ministry, Apple struck a deal with the government that will show users a prompt when first configuring a device in Russia to pre-install apps from a list of government-approved software. Users will have the ability to decline the installation of certain apps. The new legislation is an amendment to the existing “On Consumer Protection” law that will require the pre-installation of software on all devices sold in Russia, including smartphones, tablets, laptops, desktops, and smart TVs. The pre-installed software will include antivirus and cartographic apps, social media apps, and “Public Service” apps for payments and civil services.

Apple bending over backwards to please Putin’s totalitarian regime will open the (back)door to countless other governments – western or not – demanding the same thing. As always, it seems Apple only cares about privacy and user experience if it can pull the wool over the eyes of gullible westerners – but as soon as the choice comes down to money or values, Tim Cook jumps at the opportunity to dump his proclaimed values in a ditch by the side of the road.

Speaking of bending over backwards to please totalitarian regimes and dumping proclaimed values in a ditch by the side of the road: Tim Cook will attend the Chinese government’s China Development Forum, despite the ongoing Uighur genocide and crackdown on the democratic rights of the citizens of Hong Kong. Classy move, Tim, but then, anybody with even a modicum of pattern recognition skills is not surprised by your never-ending quest to please dictators.

Apple M1 microarchitecture research

This is an early attempt at microarchitecture documentation for the CPU in the Apple M1, inspired by and building on the amazing work of Andreas Abel, Andrei Frumusanu, @Veedrac, Travis Downs, Henry Wong and Agner Fog. This documentation is my best effort, but it is based on black-box reverse engineering, and there are definitely mistakes. No warranty of any kind (and not just as a legal technicality). To make it easier to verify the information and/or identify such errors, entries in the instruction tables link to the experiments and results (~35k tables of counter values).

Amazing work, but the fact that this kind of work is even needed illustrates just how anti-consumer these new Macs really are.

The Macintosh Application Environment

Thanks to Twitter, here’s an interesting footnote in computing history.

As A/UX development was winding down, Apple was working on another project called the Macintosh Application Environment. This was an emulator that allowed users to run Mac software under Sun’s Solaris or Hewlett-Packard’s HP-UX. A great deal of A/UX technology went into the design of this ill-fated product. This page is a pictorial tribute to the Macintosh Application Environment, running under Solaris 8 on an Ultra 10 workstation. If you want to try the MAE, you’ll need a Sun box running Solaris 9 or below – the software does not appear to work under Solaris 10.

This is absolutely fascinating, and I had no idea this existed.

Apple’s App Store is hosting multimillion-dollar scams, says this iOS developer

Mobile app developer Kosta Eleftheriou has a new calling that goes beyond software development: taking on what he sees as a rampant scam problem ruining the integrity of Apple’s App Store. Eleftheriou, who created the successful Apple Watch keyboard app FlickType, has for the last two weeks been publicly criticizing Apple for lax enforcement of its App Store rules that has allowed scam apps, as well as apps that clone popular software from other developers, to run rampant. These apps enjoy top billing in the iPhone marketplace, all thanks to glowing reviews and sterling five-star ratings that are largely fabricated, he says.

I’ve been saying it for ten years: the application store model is fundamentally broken, because the owner of the application store benefits from people gaming and cheating the system. In this case, Apple profits from every scam application or subscription sold, and since the App Store constitutes a huge part of Apple’s all-important services revenue, Apple has no incentive to really tackle issues like this.

Here’s what’s going to happen, based on my immutable pattern recognition skills: there will be more press outcry over this developer’s specific issue, until Apple eventually sends out a public apology statement and sort-of addresses this specific issue. American tech media – which are deeply embedded in Apple’s ecosystem and depend on being in Apple’s good graces – will praise Apple’s response, and claim the situation has been resolved. Their next batch of review units and press invites from Apple are on their way. And a few weeks or months later, another developer suffers from the same or similar issues. Rinse, repeat.

The problem is not individual App Store rules or App Store reviewers having a bad day – the paradigm itself is fundamentally broken, and until the tech industry and us as users come to terms with that, these repetitive stories will keep popping up, faux press outrage and all.