Apple Archive

The Apple A15 SoC performance review: faster and more efficient

Apple’s iPhone component design seems to be holding the SoC back from achieving even better results, especially in the newer Pro models. Even so, Apple remains far above the competition in terms of performance and efficiency. Overall, while the A15 isn’t the brute-force iteration we’ve become used to from Apple in recent years, it very much comes with substantial generational gains that make it a notably better SoC than the A14. In the end, it seems Apple’s SoC team has executed well after all. Apple’s SoC still rules the roost, and while there are performance gains in the A15, it’s in efficiency that the new SoC really shines.

Apple’s fortress of secrecy is crumbling from the inside

Apple’s remote work struggle is emblematic of a deeper shift taking place inside the company. Since 1976, the tech giant has operated in largely the same way: executives make decisions about how the company will function, and employees either fall in line or leave. What choice do they have? Apple is currently worth $2 trillion, making it the most valuable company in the world, as well as one of the most powerful. Over the past few months, however, that culture has started to erode. As workers across the tech industry advocate for more power, Apple’s top-down management seems more out of touch than ever before. Now, a growing number of employees are organizing internally for change and speaking out about working conditions on Twitter. Success tends to hide problems.

In public, Apple champions fighting climate change – behind closed doors, Apple lobbies against climate change legislation

In public, Apple claims it supports legislation to combat climate change. Lisa Jackson, Apple’s VP for Environment, Policy, and Social Initiatives, released a statement asserting that “the urgent threat of climate change is a key priority” for the company. Jackson called on Congress and the Biden administration to take “urgent action” to pass “climate policies that quickly decarbonize our electric grid.” Specifically, Jackson said Apple supports “the enactment of a Clean Energy Standard (CES) that would decarbonize the power sector by 2035.” However, now that such a standard is actually on the verge of being implemented, Apple, behind closed doors, is changing its tune. The goal of the Clean Energy Standard in the reconciliation package would be to reduce carbon emissions from the power sector by 80% by 2030 and 100% by 2035 – the precise policy Jackson said Apple supported in her statement. Given this stance, you might be surprised that Apple is part of a “massive lobbying blitz” to kill the reconciliation package and its Clean Energy Standard. Why, then, is Apple suddenly fighting the very standard it was championing? The campaign’s ads focus on the funding mechanism for the package, which includes raising the corporate tax rate by a few percentage points, from 21% to 26.5% – still far lower than the 35% corporate tax rate in place prior to the 2017 tax cuts. In 2020, Apple had $67 billion in profits and an effective tax rate of 14.4%. Ah, of course. Apple wants to be regarded as an environmentally responsible company, but only if it doesn’t cost it anything. Apple’s hypocrisy knows no bounds.
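Some back-of-the-envelope arithmetic puts the incentive in perspective. The profit and tax figures come straight from the paragraph above; the assumption that the statutory increase would pass through point-for-point to Apple’s effective rate is mine, purely for illustration:

```python
# Rough arithmetic on the figures above. Assumes, purely for
# illustration, that the proposed statutory increase would pass
# through point-for-point to Apple's effective rate.
profits = 67e9                      # Apple's 2020 profits
effective_rate = 0.144              # Apple's 2020 effective tax rate
statutory_increase = 0.265 - 0.21   # the proposed 5.5-point hike

print(f"taxes paid at a 14.4% effective rate: ${profits * effective_rate / 1e9:.1f}B")
print(f"extra per year under the assumed pass-through: ${profits * statutory_increase / 1e9:.1f}B")
```

A few billion dollars a year buys a lot of lobbying.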

Apple delays rollout of controversial child safety features to make improvements

Apple, in a statement to various news outlets: Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features. A good step, but the plan should be scrapped entirely. Let’s hope this is not just a case of Apple waiting for the storm to blow over, only to sneak the feature into a random point release later.

Apple just declared war on your privacy

Edward Snowden: Having read thousands upon thousands of remarks on this growing scandal, it has become clear to me that many understand it doesn’t matter, but few if any have been willing to actually say it. Speaking candidly, if that’s still allowed, that’s the way it always goes when someone of institutional significance launches a campaign to defend an indefensible intrusion into our private spaces. They make a mad dash to the supposed high ground, from which they speak in low, solemn tones about their moral mission before fervently invoking the dread spectre of the Four Horsemen of the Infopocalypse, warning that only a dubious amulet—or suspicious software update—can save us from the most threatening members of our species. Suddenly, everybody with a principled objection is forced to preface their concern with apologetic throat-clearing and the establishment of bona fides: I lost a friend when the towers came down, however… As a parent, I understand this is a real problem, but… An excellent and scathing takedown of Apple’s planned backdoors.

We built a system like Apple’s to flag child sexual abuse material – and concluded the tech was dangerous

Earlier this month, Apple unveiled a system that would scan iPhone and iPad photos for child sexual abuse material (CSAM). The announcement sparked a civil liberties firestorm, and Apple’s own employees have been expressing alarm. The company insists reservations about the system are rooted in “misunderstandings.” We disagree. We wrote the only peer-reviewed publication on how to build a system like Apple’s — and we concluded the technology was dangerous. We’re not concerned because we misunderstand how Apple’s system works. The problem is, we understand exactly how it works. There’s now so much evidence from credible, trustworthy people and organisations that Apple’s system is bad and dangerous, that I find it hard to believe there are still people cheering Apple on.

Researchers produce collision in Apple’s child-abuse hashing system

Researchers have produced a collision in iOS’s built-in hash function, raising new concerns about the integrity of Apple’s CSAM-scanning system. The flaw affects the hashing system, called NeuralHash, which allows Apple to check for exact matches of known child-abuse imagery without possessing any of the images or gleaning any information about non-matching pictures. On Tuesday, a GitHub user called Asuhariet Ygvar posted code for a reconstructed Python version of NeuralHash, which he claimed to have reverse-engineered from previous versions of iOS. The GitHub post also includes instructions on how to extract the NeuralHash model files from a current macOS or iOS build. Once the code was public, more significant attacks were quickly discovered. A user called Cory Cornelius produced a collision in the algorithm: two images that generate the same hash. If the findings hold up, it will be a significant failure in the cryptography underlying Apple’s new system. American tech media and bloggers have been shoving the valid concerns aside ever since Apple announced this new backdoor into iOS, and it’s barely been a week and we already see major tentpoles come crashing down. I try not to swear on OSNews, but there’s no other way to describe this than as a giant clusterfuck of epic proportions.
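To see why a collision is even possible, here is a minimal sketch using the open-source imagehash library as a stand-in for NeuralHash (an assumption for illustration: NeuralHash is a neural-network-based perceptual hash, so the construction differs, but the failure mode of many distinct images mapping to one hash is shared by the whole family):

```python
# pip install pillow imagehash numpy
# A toy collision: two byte-for-byte different images, one perceptual hash.
# imagehash's average hash stands in for NeuralHash here.
import numpy as np
from PIL import Image
import imagehash

# Base image: a simple horizontal gradient, standing in for any photo.
gradient = np.tile(np.linspace(0, 255, 256), (256, 1)).astype(np.uint8)
original = Image.fromarray(gradient, mode="L")

# Perturbed image: toggle the low bit of one corner. The raw bytes
# differ, but the hash's aggressive downscaling discards the change.
perturbed_pixels = gradient.copy()
perturbed_pixels[:64, :64] ^= 1
perturbed = Image.fromarray(perturbed_pixels, mode="L")

h1 = imagehash.average_hash(original)
h2 = imagehash.average_hash(perturbed)

print("images identical:", gradient.tobytes() == perturbed_pixels.tobytes())  # False
print("hashes identical:", h1 == h2)   # True: a collision by construction
print("Hamming distance:", h1 - h2)    # 0 differing bits
```

The Cornelius collision is the adversarial version of the same idea: instead of hiding a change the hash ignores, you search for an unrelated image the hash confuses with a target.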

Apple’s child protection features spark concern within its own ranks

Apple employees have flooded an Apple internal Slack channel with more than 800 messages on the plan announced a week ago, workers who asked not to be identified told Reuters. Many expressed worries that the feature could be exploited by repressive governments looking to find other material for censorship or arrests, according to workers who saw the days-long thread. Past security changes at Apple have also prompted concern among employees, but the volume and duration of the new debate is surprising, the workers said. Some posters worried that Apple is damaging its leading reputation for protecting privacy. It’s a complete 180 from Apple’s behaviour and statements (in Western markets) – of course employees are going to be worried. I’ve been warning for years that Apple’s position on privacy was nothing more than a marketing ploy, and now Apple’s employees, too, are getting a taste of the medicine the company has been selling in China and various other totalitarian regimes.

One bad Apple

Dr. Neal Krawetz, one of the leading experts in the area of computer forensics research, digital photo analysis, and related topics, has penned a blog post in which he takes apart Apple’s recent announcement and the technology behind it. He actually has a lot of experience with the very problem Apple is trying to deal with, since he is the creator of FotoForensics, and files CSAM reports to the National Center for Missing and Exploited Children (NCMEC) every day. In fact, he files more reports than Apple, and knows all the ins and outs of all the technologies involved – including reverse-engineering Microsoft’s PhotoDNA, the perceptual hash algorithm NCMEC and Apple are using. The reason he had to reverse-engineer PhotoDNA is that NCMEC refused to countersign the NDAs it wanted Krawetz to sign, eventually not responding to his requests altogether.

Krawetz is one of the more prolific reporters of CSAM material (number 40 out of 168 in total in 2020). According to him, PhotoDNA is not as sophisticated as Apple’s and Microsoft’s documentation and claims make it out to be. Perhaps there is a reason that they don’t want really technical people looking at PhotoDNA. Microsoft says that the “PhotoDNA hash is not reversible”. That’s not true. PhotoDNA hashes can be projected into a 26×26 grayscale image that is only a little blurry. 26×26 is larger than most desktop icons; it’s enough detail to recognize people and objects. Reversing a PhotoDNA hash is no more complicated than solving a 26×26 Sudoku puzzle; a task well-suited for computers.

The other major component of Apple’s system, an AI perceptual hash called a NeuralHash, is problematic too. The experts Apple cites have zero background in privacy or law, and while Apple’s whitepaper is “overly technical”, it “doesn’t give enough information for someone to confirm the implementation”. Furthermore, Krawetz “calls bullshit” on Apple’s claim that there is a 1 in 1 trillion error rate. After a detailed analysis of the numbers involved, he concludes: What is the real error rate? We don’t know. Apple doesn’t seem to know. And since they don’t know, they appear to have just thrown out a really big number. As far as I can tell, Apple’s claim of “1 in 1 trillion” is a baseless estimate. In this regard, Apple has provided misleading support for their algorithm and misleading accuracy rates.

Krawetz also takes aim at the step where Apple manually reviews possible CP material by sending it from the device in question to Apple itself. After discussing this with his attorney, he concludes: The laws related to CSAM are very explicit. 18 U.S. Code § 2252 states that knowingly transferring CSAM material is a felony. (The only exception, in 2258A, is when it is reported to NCMEC.) In this case, Apple has a very strong reason to believe they are transferring CSAM material, and they are sending it to Apple — not NCMEC. It does not matter that Apple will then check it and forward it to NCMEC. 18 U.S.C. § 2258A is specific: the data can only be sent to NCMEC. (With 2258A, it is illegal for a service provider to turn over CP photos to the police or the FBI; you can only send it to NCMEC. Then NCMEC will contact the police or FBI.) What Apple has detailed is the intentional distribution (to Apple), collection (at Apple), and access (viewing at Apple) of material that they strongly have reason to believe is CSAM. As it was explained to me by my attorney, that is a felony.
This whole thing looks, feels, and smells like a terribly designed system that is not only prone to errors, but also easily exploitable by people and governments with bad intentions. It also seems to be highly illegal, making one wonder why Apple would put this out in the first place. Krawetz hints at why Apple is building this system earlier in his article: Apple’s devices rename pictures in a way that is very distinct. (Filename ballistics spots it really well.) Based on the number of reports that I’ve submitted to NCMEC, where the image appears to have touched Apple’s devices or services, I think that Apple has a very large CP/CSAM problem. I think this might be the real reason Apple is building this system.
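To make concrete why Krawetz “calls bullshit”, here is a back-of-the-envelope sketch of how a per-image error rate compounds at iPhone scale. Every number in it is an assumption for illustration; as Krawetz notes, Apple never published a per-image false-positive rate, and the fleet-wide photo count is a guess:

```python
# Back-of-the-envelope: how a per-image false-positive rate compounds
# at iPhone scale. All numbers are illustrative assumptions; Apple has
# not published a per-image error rate.
total_photos = 1.5e12  # assumed photos scanned per year, fleet-wide

for per_image_fpr in (1e-6, 1e-9, 1e-12):
    expected_false_matches = total_photos * per_image_fpr
    print(f"per-image FPR {per_image_fpr:.0e}: "
          f"~{expected_false_matches:,.0f} false matches per year")
```

Even a one-in-a-billion per-image rate would produce on the order of a thousand false matches a year under these assumptions, which is why a headline figure like “1 in 1 trillion” is meaningless without the underlying per-image rate and threshold math to back it up.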

Apple: critics of continuous iPhone photo scanning are “screeching voices of the minority”

Apple also addressed the hypothetical possibility of a particular region in the world deciding to corrupt a safety organization in an attempt to abuse the system, noting that the system’s first layer of protection is an undisclosed threshold before a user is flagged for having inappropriate imagery. Even if the threshold is exceeded, Apple said its manual review process would serve as an additional barrier and confirm the absence of known CSAM imagery. Apple said it would ultimately not report the flagged user to NCMEC or law enforcement agencies and that the system would still be working exactly as designed. After yesterday’s news and today’s responses from experts, here’s a recap: Apple is going to scan all photos on every iPhone to see if any of them match against a dataset of photos – that Apple itself hasn’t verified – given to them by the authorities of countries in which this is rolled out, with final checks being done by (third party) reviewers who are most likely traumatized, overworked, underpaid, and easily infiltrated. What could possibly go wrong? Today, Apple sent out an internal memo to Apple employees about this new scanning system. In it, they added a statement by Marita Rodriguez, executive director of strategic partnerships at the National Center for Missing and Exploited Children, and one of the choice quotes: I know it’s been a long day and that many of you probably haven’t slept in 24 hours. We know that the days to come will be filled with the screeching voices of the minority. Apple signed off on that quote. They think those of us worried about invasive technologies like this and the power backdoors like this would give to totalitarian regimes all over the world are the “screeching voices of the minority”. No wonder this company enjoys working with the most brutal regimes in the world.
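As an aside on what that threshold actually is: Apple’s technical summary describes threshold secret sharing, in which the server cannot decrypt anything about an account until it holds enough match “vouchers”. Below is a toy Shamir-style sketch of that primitive, with made-up parameters of my own; Apple’s real protocol layers this onto private set intersection and is far more involved:

```python
# Toy sketch of threshold secret sharing (Shamir): the server can
# reconstruct a per-account secret only once it holds >= t shares,
# one share per matched photo. Toy parameters, not Apple's protocol.
import random

PRIME = 2**61 - 1  # field size; large enough for a demo

def make_shares(secret: int, t: int, n: int) -> list[tuple[int, int]]:
    """Split `secret` into n shares; any t of them reconstruct it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(t - 1)]
    def poly(x: int) -> int:
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, poly(x)) for x in range(1, n + 1)]

def reconstruct(shares: list[tuple[int, int]]) -> int:
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * -xj % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

secret, t = 123456789, 10          # toy account key, threshold of 10
shares = make_shares(secret, t, n=30)

print(reconstruct(shares[:9]) == secret)   # False: below threshold
print(reconstruct(shares[:10]) == secret)  # True: threshold reached
```

The cryptography only guarantees that nothing is revealed below the threshold; everything Apple describes happening after the threshold (manual review, reporting decisions) is policy, not math, and policy can change per country.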

An open letter against Apple’s privacy-invasive content scanning technology

A large number of security and privacy experts, legal experts, and more, in an open letter to Apple: On August 5th, 2021, Apple Inc. announced new technological measures meant to apply across virtually all of its devices under the umbrella of “Expanded Protections for Children”. While child exploitation is a serious problem, and while efforts to combat it are almost unquestionably well-intentioned, Apple’s proposal introduces a backdoor that threatens to undermine fundamental privacy protections for all users of Apple products. The open letter contains tons of arguments, scenarios, and examples from experts about just how bad this technology is, and just how easily it can be abused.

The problem with perceptual hashes

Oliver Kuederle, who works with the same kind of image hashing technology used in Apple’s new system that is going to continuously scan the photos on your iOS device, explains that it is far, far from foolproof: Perceptual hashes are messy. The simple fact that image data is reduced to a small number of bits leads to collisions and therefore false positives. When such algorithms are used to detect criminal activities, especially at Apple scale, many innocent people can potentially face serious problems. My company’s customers are slightly inconvenienced by the failures of perceptual hashes (we have a UI in place that lets them make manual corrections). But when it comes to CSAM detection and its failure potential, that’s a whole different ball game. Needless to say, I’m quite worried about this. This is just one of the many, many problems with what Apple announced yesterday.
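For a sense of how little information survives the hashing step, here is a from-scratch average hash, a minimal sketch of the general technique Kuederle describes rather than Apple’s NeuralHash. Squeezing an entire photo into 64 bits means collisions are guaranteed by the pigeonhole principle; the only question is how often they hit:

```python
# Minimal perceptual "average hash": reduce an image to 64 bits.
# A sketch of the general technique, not Apple's NeuralHash.
import numpy as np
from PIL import Image

def average_hash(img: Image.Image, hash_size: int = 8) -> int:
    """Downscale to hash_size x hash_size grayscale, then emit one bit
    per cell: 1 if the cell is brighter than the mean, else 0."""
    small = img.convert("L").resize((hash_size, hash_size))
    pixels = np.asarray(small, dtype=np.float64)
    return int("".join("1" if p > pixels.mean() else "0"
                       for p in pixels.flatten()), 2)

def hamming(a: int, b: int) -> int:
    """Differing bits between two hashes; matchers flag pairs below
    some distance threshold rather than requiring exact equality."""
    return bin(a ^ b).count("1")

# Two completely unrelated random "photos".
rng = np.random.default_rng(0)
img_a = Image.fromarray(rng.integers(0, 256, (128, 128), dtype=np.uint8))
img_b = Image.fromarray(rng.integers(0, 256, (128, 128), dtype=np.uint8))

print(f"{hamming(average_hash(img_a), average_hash(img_b))} of 64 bits differ")
```

Unrelated pairs differ in about half their bits on average, but the distribution has tails: scan billions of photos against a large enough database and some entirely innocent pairs will land under any practical match threshold. Those are Kuederle’s false positives.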

Apple’s plan to “Think Different” about encryption opens a backdoor to your private life

Apple has announced impending changes to its operating systems that include new “protections for children” features in iCloud and iMessage. If you’ve spent any time following the Crypto Wars, you know what this means: Apple is planning to build a backdoor into its data storage system and its messaging system. Child exploitation is a serious problem, and Apple isn’t the first tech company to bend its privacy-protective stance in an attempt to combat it. But that choice will come at a high price for overall user privacy. Apple can explain at length how its technical implementation will preserve privacy and security in its proposed backdoor, but at the end of the day, even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor.

Basically, Apple is going to scan your iCloud photo library, and compare cryptographic hashes of your photos to known photos containing child pornography. It’s hard to argue against this, because it makes it seem as if you’re arguing against catching the sort of people that have such material. However, the issue with tools like this is not the ends – all of us are on the same side here – but the means. It’s more than obvious that this scanning is a gross invasion of privacy, but at the same time, you could easily argue that this is a bit of privacy we’d be willing to give up in order to aid in catching the worst elements of our society. The real problems stem from the fact that tools like this are simply never going to be foolproof. Software is incredibly unreliable, and while a random application crashing won’t ruin your life, an algorithm wrongfully labeling you as a pedophile most definitely will. On top of unintended consequences, malicious intent could be a major problem here too – what if some asshole wants to ruin your life, and sends you compromising photos, or otherwise sneaks them onto your device? And with Apple’s long history of working very closely with the most horrid regimes in the world, imagine what governments could do with a tool like this. On the ends that Apple is trying to get to here, we are all on the same side. The means to get there, however, need to be carefully considered.

Amazon just got Fakespot booted off Apple’s iOS App Store

Fakespot, known for its web browser extensions that try to weed out fake product reviews, suddenly no longer has an iPhone or iPad app — because Amazon sent Apple a takedown request, both Amazon and Fakespot confirm, and Apple decided to remove the app. The giant retailer says it was concerned about how a new update to the Fakespot app was “wrapping” its website without permission, and how that could be theoretically exploited to steal Amazon customer data. But Fakespot founder Saoud Khalifah tells The Verge that Apple abruptly removed the app today without any explanation. Apple didn’t respond to multiple requests for comment. Two abusive monopolists walk into a bar.

How Universal Control on macOS Monterey works

The best moment of this year’s WWDC keynote was a straightforward demo of a macOS feature, Universal Control. The idea is simple enough: it allows you to use the keyboard and trackpad on a Mac to directly control an iPad, and even makes it simple to drag and drop content between those devices. What made the demo so impressive is how easy and seamless it all seemed. In a classic Apple move, there was no setup required at all. The segment happened so fast that it even seemed (incorrectly, as it turns out) like the Mac was able to physically locate the iPad in space so it knew where to put the mouse pointer. I mean, none of this stuff is new or technologically impressive, but as usual, Apple manages to make it easy, intuitive, and pleasant to look at and use. I’d love to have something as straightforward and integrated as this on Linux.

Apple will not roll out new privacy features in China and several other countries

Yesterday, during the Apple event, the company, as always, kept talking about how much it values privacy, and how privacy is a “fundamental human right”. A noble statement, of course, but it seems Apple does not consider people from China, Belarus, Colombia, Egypt, Kazakhstan, Saudi Arabia, South Africa, Turkmenistan, Uganda and the Philippines to be “humans”, because the fundamental, tentpole privacy features announced yesterday will not be available to the humans living in those countries. Apple on Monday said a new “private relay” feature designed to obscure a user’s web browsing behavior from internet service providers and advertisers will not be available in China for regulatory reasons. The feature was one of a number of privacy protections Apple announced at its annual software developer conference on Monday, the latest in a years-long effort by the company to cut down on the tracking of its users by advertisers and other third parties. Privacy is a “fundamental human right”, but apparently not as fundamental as Apple’s right to make even more money.

Apple unveils macOS 12, iOS 15, iPadOS 15

Apple previewed macOS 12, iOS 15 and iPadOS 15 yesterday. From MacRumors, one of the few remaining truly good Apple news websites: Apple today announced macOS 12, which it’s calling macOS Monterey. The new version of macOS is gaining features like Universal Control, AirPlay to Mac, and Shortcuts for Mac. Apple said that macOS Monterey’s updates will help users get more done and work more fluidly across Apple devices. And iOS 15: Apple today previewed iOS 15, the company’s next major update for the iPhone, featuring new video calling capabilities, improvements to Messages, user statuses, a smart notification summary, and more. And iPadOS 15: Apple today unveiled iPadOS 15, its next-generation operating system for iPad that introduces a slew of new features like widgets on the Home Screen, an iPhone-style App Library, new multi-tasking features, and more. Here’s a rundown of what to expect. There are no major tentpole features or drastic overhauls – instead, there’s a lot of smaller features and new additions that really do add up to what seem like three pretty major operating system releases. There should be something for everybody in here, but I do wonder which maniac approved the new tab bar design in Safari, because that behaviour should be a crime against humanity.

Apple’s tightly controlled App Store is teeming with scams

Of the highest 1,000 grossing apps on the App Store, nearly two percent are scams, according to an analysis by The Washington Post. And those apps have bilked consumers out of an estimated $48 million during the time they’ve been on the App Store, according to market research firm Appfigures. The scale of the problem has never before been reported. What’s more, Apple profits from these apps because it takes a cut of up to 30 percent of all revenue generated through the App Store. Even more common, according to The Post’s analysis, are “fleeceware” apps that use inauthentic customer reviews to move up in the App Store rankings and give apps a sense of legitimacy to convince customers to pay higher prices for a service usually offered elsewhere with higher legitimate customer reviews. Apple likes to claim the App Store is needed to keep people safe, but that simply is a flat-out lie. The App Store is filled to the brim not only with obvious scams, but also a whole boatload of gambling applications designed specifically to trick children into spending money. In fact, these “games” make up a huge proportion of the App Store’s revenue. Apple earns top dollar from every scam or disturbing gambling app on the App Store, so there’s a huge conflict of interest here that in and of itself should be enough reason to take control over iOS away from Apple. iOS users should have the freedom to install and use an application store that does not prey on their children or promote scams.

Censorship, surveillance and profits: a hard bargain for Apple in China

Blockbuster report by The New York Times on Apple and Tim Cook gladly making endless concessions to please the Chinese government. Nothing in here is really new to most of us, but it’s startling to see it laid out in such detail, and sourced so well. For instance, when it comes to Chinese people, privacy is apparently no longer a “fundamental human right”: Inside, Apple was preparing to store the personal data of its Chinese customers on computer servers run by a state-owned Chinese firm. Tim Cook, Apple’s chief executive, has said the data is safe. But at the data center in Guiyang, which Apple hoped would be completed by next month, and another in the Inner Mongolia region, Apple has largely ceded control to the Chinese government. Chinese state employees physically manage the computers. Apple abandoned the encryption technology it used elsewhere after China would not allow it. And the digital keys that unlock information on those computers are stored in the data centers they’re meant to secure.

This means zero privacy for Chinese Apple users, as Apple has pretty much ceded all control over this data to the Chinese government – so much so that Apple’s employees aren’t even in the building, and Apple no longer has the encryption keys either. And on top of this, it turns out Apple is so scared of offending the Chinese government that the company proactively censors applications and other content in the Chinese version of the App Store, removing, censoring, and blocking content even before the Chinese government asks for it. “Apple has become a cog in the censorship machine that presents a government-controlled version of the internet,” said Nicholas Bequelin, Asia director for Amnesty International, the human rights group. “If you look at the behavior of the Chinese government, you don’t see any resistance from Apple — no history of standing up for the principles that Apple claims to be so attached to.”

Apple even fired an App Store reviewer because the reviewer approved an application that, while not breaking a single rule, did offend the Chinese government. That is how far Apple is willing to go to please its Chinese government friends. Apple isn’t merely beholden to China – it’s deeply, deeply afraid of China. How many more concessions is Tim Cook willing to make, and how many more Chinese rings is he willing to kiss?