Apple Archive

Women force change at Indian iPhone plant, sick from bad food, crowded dorms

For women who assembled iPhones at a Foxconn plant in southern India, crowded dorms without flush toilets and food sometimes crawling with worms were problems to be endured for the paycheck. But when tainted food sickened over 250 of the workers, their anger boiled over, culminating in a rare protest that shut down a plant where 17,000 had been working. Just in case you thought Apple (and other companies, of course) wouldn’t exploit poor people of colour in countries other than China. Good on these women for standing up for their rights, which is at least something they can do that their counterparts in totalitarian China cannot.

Apple CEO Tim Cook ‘secretly’ signed $275 billion deal with China in 2016

In an extensive paywalled report based on interviews and purported internal Apple documents, The Information revealed that Tim Cook personally forged a five-year agreement with the Chinese government during a series of in-person visits to the country in 2016. The need to push for a closer alliance with the Chinese government reportedly came from a number of Apple executives who were concerned about bad publicity in China and the company’s poor relationship with Chinese officials, who believed that Apple was not contributing enough to the local economy. Alleged internal documents show that Cook “personally lobbied officials” in China over threats made against Apple Pay, iCloud, and the App Store. Cook set out to use a “memorandum of understanding” between Apple and a powerful Chinese government agency called the National Development and Reform Commission to formally agree to a number of concessions in return for regulatory exemptions. The 1,250-word agreement was written by Apple’s government affairs team in China and stewarded by Cook as he met with Chinese officials. It was already well-known that Tim Cook and Apple were closely cooperating with the Chinese regime, but it seems they went as far as begging and groveling to work with it in incredibly close ways. Mind you, the same regime Apple is so keen to cooperate with and please is currently committing genocide as part of an ethnic cleansing campaign. I’ve heard all the spineless corporatist excuses a million times. “Apple is just following Chinese law!” No. “Vote with your wallet!” No. “It’s not illegal so who cares if they aid a genocidal regime!” No. We throw minorities in jail for carrying a few grams of drugs, but we let corporations and executives who plot and scheme with genocidal regimes run free. Is that justice? We have devolved into a society where we just accept this – and that worries me just as much as all the other existential threats we’re facing.

Apple announces self service repair program, starting with iPhone 12 and 13

The Self Service Repair program will give customers who are comfortable with the idea of completing their own repairs access to Apple genuine parts, tools, and manuals, starting with the iPhone 12 and iPhone 13 lineups. The scheme will be introduced in phases, adding more repairs and supported devices over time. This is a major win for right-to-repair, and I’m very happy Apple caved to regulatory, shareholder, and public pressure. Momentum behind right-to-repair has been growing for years now, and it’s satisfying to see it bear fruit. Of course, we’ll have to wait and see if there’s any catch – insane NDAs, crazy high prices, little to no stock – but if not, this could be a model for other companies to follow.

Apple’s new M1 Pro and M1 Max processors take its in-house Arm-based chips to new heights

For the M1 Pro, Apple promises 70 percent better CPU performance and twice the graphics performance compared to the M1. While the basic architecture is still the same on the M1 Pro, Apple is upping the hardware here in a big way, with a 10-core CPU that offers eight performance cores and two efficiency cores, along with a 16-core GPU with 2,048 execution units. The new chip also supports more RAM, with configuration options up to 32GB (although, like the M1, memory is still integrated directly into the chip itself, instead of user-upgradable) with 200GB/s memory bandwidth. In total, the M1 Pro has 33.7 billion transistors, roughly twice the number of transistors that the M1 has. But Apple isn’t stopping there: it also announced the even more powerful M1 Max, which has the same 10-core CPU configuration, with eight performance cores and two efficiency cores. But the M1 Max doubles the memory bandwidth (to 400GB/s), RAM (up to 64GB of memory) and GPU (with 32 cores, 4,096 execution units and four times the GPU performance of the original M1). The M1 Max features 57 billion transistors, making it the largest chip that Apple has made yet. The new chip also means that you can connect up to four external displays to a single device. These are absolutely bonkers chips for a laptop, and Apple once again puts the entire industry on notice. There’s nothing Intel, AMD, or Qualcomm can offer that comes even close to these new M1 Pro and Max chips, and Apple even seems to have managed to get within spitting distance of a laptop RTX 3080. It’s hard to fathom just how impressive these are. The laptops they come in are the new 14″ and 16″ MacBook Pro, with a new design that, for some reason, includes a notch even though there’s no Face ID. Apple is easily the best choice for most people when it comes to laptops now, since anything else will be just as expensive, but far, far less performant, with far worse energy use.

Insiders in Apple’s healthcare organization say its leaders suppress concerns and mislead executives

It’s a symptom of what insiders say are deeper organizational problems that have left the health group without clear direction and struggling to mesh Apple’s hardware-oriented culture with the practices of the medical business. People at Apple Health said that they saw colleagues face retribution for disagreeing with superiors and that concerns have been expressed on more than one occasion about the way health data is used to develop products. The situation has gotten so serious that some employees have lodged complaints with Apple’s most senior executives, including Cook and Chief Operating Officer Jeff Williams, who oversees the health effort. Success tends to hide problems.

The Apple A15 SoC performance review: faster and more efficient

Apple’s iPhone component design seems to be limiting the SoC from achieving even better results, especially the newer Pro models, however even with that being said and done, Apple remains far above the competition in terms of performance and efficiency. Overall, while the A15 isn’t the brute force iteration we’ve become used to from Apple in recent years, it very much comes with substantial generational gains that allow it to be a notably better SoC than the A14. In the end, it seems like Apple’s SoC team has executed well after all. Apple’s SoC still rules the roost, and while there are performance gains in the A15, it’s in efficiency that the new SoC really shines.

Apple’s fortress of secrecy is crumbling from the inside

Apple’s remote work struggle is emblematic of a deeper shift taking place inside the company. Since 1976, the tech giant has operated in largely the same way: executives make decisions about how the company will function, and employees either fall in line or leave. What choice do they have? Apple is currently worth $2 trillion, making it the most valuable company in the world, as well as one of the most powerful. Over the past few months, however, that culture has started to erode. As workers across the tech industry advocate for more power, Apple’s top-down management seems more out of touch than ever before. Now, a growing number of employees are organizing internally for change and speaking out about working conditions on Twitter. Success tends to hide problems.

In public, Apple champions fighting climate change – behind closed doors, Apple lobbies against climate change legislation

In public, Apple claims it supports legislation to combat climate change. Jackson, now Apple’s VP for Environment, Policy, and Social Initiatives, released a statement asserting that “the urgent threat of climate change is a key priority” for the company. Jackson called on Congress and the Biden administration to take “urgent action” to pass “climate policies that quickly decarbonize our electric grid.” Specifically, Jackson said Apple supports “the enactment of a Clean Energy Standard (CES) that would decarbonize the power sector by 2035.” However, now that said standard is actually on the verge of being implemented, Apple, behind closed doors, is changing its tune. The goal of the Clean Energy Standard in the reconciliation package would be to reduce carbon emissions from the power sector by 80% by 2030 and 100% by 2035. It’s the precise policy that Jackson said Apple supported in her statement. Given this stance, you might be surprised that Apple is part of a “massive lobbying blitz” to kill the reconciliation package and its Clean Energy Standard. Why, then, is Apple now suddenly fighting the very standard it was championing? The ads focus on the funding mechanism for the package, which includes increasing the corporate tax rate by a few percentage points — from 21% to 26.5%. The rate would still be far lower than the 35% corporate tax rate in place prior to the 2017 tax cuts. In 2020, Apple had $67 billion in profits and an effective tax rate of 14.4%. Ah, of course. Apple wants to be regarded as an environmentally responsible company, but only if it doesn’t cost them anything. Apple’s hypocrisy knows no bounds.

Apple releases iOS 15, iPadOS 15, watchOS 8, HomePod 15, tvOS 15

Apple has released new versions of all of its platforms, with only the Mac lagging behind for now. There’s iOS 15 and iPadOS 15, watchOS 8, HomePod 15, and tvOS 15. As usual, Apple’s device support for new updates is excellent and stretches back quite far, so pretty much every one of you who is an iOS user will be able to enjoy these new releases. You know where to find them.

Apple delays rollout of controversial child safety features to make improvements

Apple, in a statement to various news outlets: Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features. A good step, but the whole thing should be scrapped entirely. Let’s hope this is not just a case of Apple waiting for the storm to blow over, only to then sneak it into a random point release.

Apple just declared war on your privacy

Edward Snowden: Having read thousands upon thousands of remarks on this growing scandal, it has become clear to me that many understand it doesn’t matter, but few if any have been willing to actually say it. Speaking candidly, if that’s still allowed, that’s the way it always goes when someone of institutional significance launches a campaign to defend an indefensible intrusion into our private spaces. They make a mad dash to the supposed high ground, from which they speak in low, solemn tones about their moral mission before fervently invoking the dread spectre of the Four Horsemen of the Infopocalypse, warning that only a dubious amulet—or suspicious software update—can save us from the most threatening members of our species. Suddenly, everybody with a principled objection is forced to preface their concern with apologetic throat-clearing and the establishment of bonafides: I lost a friend when the towers came down, however… As a parent, I understand this is a real problem, but… An excellent and scathing takedown of Apple’s planned backdoors.

We built a system like Apple’s to flag child sexual abuse material – and concluded the tech was dangerous

Earlier this month, Apple unveiled a system that would scan iPhone and iPad photos for child sexual abuse material (CSAM). The announcement sparked a civil liberties firestorm, and Apple’s own employees have been expressing alarm. The company insists reservations about the system are rooted in “misunderstandings.” We disagree. We wrote the only peer-reviewed publication on how to build a system like Apple’s — and we concluded the technology was dangerous. We’re not concerned because we misunderstand how Apple’s system works. The problem is, we understand exactly how it works. There’s now so much evidence from credible, trustworthy people and organisations that Apple’s system is bad and dangerous, that I find it hard to believe there are still people cheering Apple on.

Researchers produce collision in Apple’s child-abuse hashing system

Researchers have produced a collision in iOS’s built-in hash function, raising new concerns about the integrity of Apple’s CSAM-scanning system. The flaw affects the hashing system, called NeuralHash, which allows Apple to check for exact matches of known child-abuse imagery without possessing any of the images or gleaning any information about non-matching pictures. On Tuesday, a GitHub user called Asuhariet Ygvar posted code for a reconstructed Python version of NeuralHash, which he claimed to have reverse-engineered from previous versions of iOS. The GitHub post also includes instructions on how to extract the NeuralMatch files from a current macOS or iOS build. Once the code was public, more significant attacks were quickly discovered. A user called Cory Cornelius produced a collision in the algorithm: two images that generate the same hash. If the findings hold up, it will be a significant failure in the cryptography underlying Apple’s new system. American tech media and bloggers have been shoving the valid concerns aside ever since Apple announced this new backdoor into iOS, and it’s barely been a week and we already see major tentpoles come crashing down. I try not to swear on OSNews, but there’s no other way to describe this than as a giant clusterfuck of epic proportions.
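
For readers wondering what a “collision” means in practice: two different images that produce the same perceptual hash. NeuralHash itself is not a publicly supported API, so the sketch below uses the open-source imagehash and Pillow libraries purely as a stand-in to illustrate the kind of check involved; the file names are hypothetical, and this is not Apple’s actual algorithm.

```python
# Minimal sketch of checking two images for a perceptual-hash collision.
# Uses the open-source `imagehash` library (pip install ImageHash Pillow)
# as a stand-in for NeuralHash, which is not publicly available.

from PIL import Image
import imagehash


def hashes_collide(path_a: str, path_b: str, max_distance: int = 0) -> bool:
    """Return True if the perceptual hashes of two images are within
    `max_distance` bits of each other (0 means an exact collision)."""
    hash_a = imagehash.phash(Image.open(path_a))
    hash_b = imagehash.phash(Image.open(path_b))
    # Subtracting two ImageHash objects yields their Hamming distance.
    return (hash_a - hash_b) <= max_distance


if __name__ == "__main__":
    # Hypothetical file names, for illustration only.
    print(hashes_collide("dog.png", "adversarial_noise.png"))
```

The worrying part of the GitHub findings is exactly this scenario: an image that looks like noise, or like something entirely innocuous, yet hashes to the same value as a flagged picture.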

Apple’s child protection features spark concern within its own ranks

Apple employees have flooded an Apple internal Slack channel with more than 800 messages on the plan announced a week ago, workers who asked not to be identified told Reuters. Many expressed worries that the feature could be exploited by repressive governments looking to find other material for censorship or arrests, according to workers who saw the days-long thread. Past security changes at Apple have also prompted concern among employees, but the volume and duration of the new debate is surprising, the workers said. Some posters worried that Apple is damaging its leading reputation for protecting privacy. It’s a complete 180 from Apple’s behaviour and statements (in western markets) – of course employees are going to be worried. I’ve been warning for years that Apple’s position on privacy was nothing more than a marketing ploy, and now Apple’s own employees are getting a taste of the medicine their company has been selling in China and various other totalitarian regimes.

One bad Apple

Dr. Neal Krawetz, one of the leading experts in the area of computer forensics research, digital photo analysis, and related topics, has penned a blog post in which he takes apart Apple’s recent announcement and the technology behind it. He actually has a lot of experience with the very problem Apple is trying to deal with, since he is the creator of FotoForensics, and files CSAM reports to the National Center for Missing and Exploited Children (NCMEC) every day. In fact, he files more reports than Apple, and knows all the ins and outs of all the technologies involved – including reverse-engineering Microsoft’s PhotoDNA, the perceptual hash algorithm NCMEC and Apple are using. The reason he had to reverse-engineer PhotoDNA is that NCMEC refused to countersign the NDAs they wanted Krawetz to sign, eventually not responding to his requests altogether. Krawetz is one of the more prolific reporters of CSAM material (number 40 out of 168 in total in 2020).

According to him, PhotoDNA is not as sophisticated as Apple’s and Microsoft’s documentation and claims make it out to be. Perhaps there is a reason that they don’t want really technical people looking at PhotoDNA. Microsoft says that the “PhotoDNA hash is not reversible”. That’s not true. PhotoDNA hashes can be projected into a 26×26 grayscale image that is only a little blurry. 26×26 is larger than most desktop icons; it’s enough detail to recognize people and objects. Reversing a PhotoDNA hash is no more complicated than solving a 26×26 Sudoku puzzle; a task well-suited for computers.

The other major component of Apple’s system, an AI perceptual hash called a NeuralHash, is problematic too. The experts Apple cites have zero background in privacy or law, and while Apple’s whitepaper is “overly technical”, it “doesn’t give enough information for someone to confirm the implementation”. Furthermore, Krawetz “calls bullshit” on Apple’s claim that there is a 1 in 1 trillion error rate. After a detailed analysis of the numbers involved, he concludes: What is the real error rate? We don’t know. Apple doesn’t seem to know. And since they don’t know, they appear to have just thrown out a really big number. As far as I can tell, Apple’s claim of “1 in 1 trillion” is a baseless estimate. In this regard, Apple has provided misleading support for their algorithm and misleading accuracy rates.

Krawetz also takes aim at the step where Apple manually reviews possible CP material by sending it from the device in question to Apple itself. After discussing this with his attorney, he concludes: The laws related to CSAM are very explicit. 18 U.S. Code § 2252 states that knowingly transferring CSAM material is a felony. (The only exception, in 2258A, is when it is reported to NCMEC.) In this case, Apple has a very strong reason to believe they are transferring CSAM material, and they are sending it to Apple — not NCMEC. It does not matter that Apple will then check it and forward it to NCMEC. 18 U.S.C. § 2258A is specific: the data can only be sent to NCMEC. (With 2258A, it is illegal for a service provider to turn over CP photos to the police or the FBI; you can only send it to NCMEC. Then NCMEC will contact the police or FBI.) What Apple has detailed is the intentional distribution (to Apple), collection (at Apple), and access (viewing at Apple) of material that they strongly have reason to believe is CSAM. As it was explained to me by my attorney, that is a felony.

This whole thing looks, feels, and smells like a terribly designed system that is not only prone to errors, but also easily exploitable by people and governments with bad intentions. It also seems to be highly illegal, making one wonder why Apple would put this out in the first place. Earlier in the article, Krawetz hints at why Apple might be building this system: Apple’s devices rename pictures in a way that is very distinct. (Filename ballistics spots it really well.) Based on the number of reports that I’ve submitted to NCMEC, where the image appears to have touched Apple’s devices or services, I think that Apple has a very large CP/CSAM problem. I think this might be the real reason Apple is building this system.
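
To make Krawetz’s point about the unverifiable “1 in 1 trillion” claim more concrete, here is a back-of-the-envelope sketch of how a per-image false-positive rate plays out at scale. The false-positive rate, library size, and user count below are illustrative assumptions only, not figures Apple has published or confirmed.

```python
# Back-of-the-envelope false-positive arithmetic. All numbers are assumptions
# for illustration; as Krawetz argues, the real error rate is unknown.


def prob_at_least_one_false_match(per_image_fp_rate: float, photos_per_user: int) -> float:
    """Probability that at least one photo in a library is falsely flagged,
    assuming independent per-image errors."""
    return 1.0 - (1.0 - per_image_fp_rate) ** photos_per_user


if __name__ == "__main__":
    per_image_fp_rate = 1e-12    # assumed "1 in 1 trillion" per image
    photos_per_user = 10_000     # assumed library size
    users = 1_000_000_000        # assumed number of iCloud Photos users

    p_user = prob_at_least_one_false_match(per_image_fp_rate, photos_per_user)
    print(f"Per-user probability of a false match: {p_user:.2e}")
    print(f"Expected users with at least one false match: {p_user * users:.1f}")
```

Even under these charitable assumptions, a handful of users end up falsely flagged; if the real per-image rate is orders of magnitude worse, as Krawetz suspects, that number grows accordingly.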

Apple: critics of continuous iPhone photo scanning are “screeching voices of the minority”

Apple also addressed the hypothetical possibility of a particular region in the world deciding to corrupt a safety organization in an attempt to abuse the system, noting that the system’s first layer of protection is an undisclosed threshold before a user is flagged for having inappropriate imagery. Even if the threshold is exceeded, Apple said its manual review process would serve as an additional barrier and confirm the absence of known CSAM imagery. Apple said it would ultimately not report the flagged user to NCMEC or law enforcement agencies and that the system would still be working exactly as designed. After yesterday’s news and today’s responses from experts, here’s a recap: Apple is going to scan all photos on every iPhone to see if any of them match against a dataset of photos – that Apple itself hasn’t verified – given to them by the authorities of countries in which this is rolled out, with final checks being done by (third party) reviewers who are most likely traumatized, overworked, underpaid, and easily infiltrated. What could possibly go wrong? Today, Apple sent out an internal memo to Apple employees about this new scanning system. In it, they added a statement by Marita Rodriguez, executive director of strategic partnerships at the National Center for Missing and Exploited Children, and one of the choice quotes: I know it’s been a long day and that many of you probably haven’t slept in 24 hours. We know that the days to come will be filled with the screeching voices of the minority. Apple signed off on that quote. They think those of us worried about invasive technologies like this and the power backdoors like this would give to totalitarian regimes all over the world are the “screeching voices of the minority”. No wonder this company enjoys working with the most brutal regimes in the world.

An open letter against Apple’s privacy-invasive content scanning technology

A large number of security and privacy experts, legal experts, and more, in an open letter to Apple: On August 5th, 2021, Apple Inc. announced new technological measures meant to apply across virtually all of its devices under the umbrella of “Expanded Protections for Children”. While child exploitation is a serious problem, and while efforts to combat it are almost unquestionably well-intentioned, Apple’s proposal introduces a backdoor that threatens to undermine fundamental privacy protections for all users of Apple products. The open letter contains tons of arguments, scenarios, and examples from experts about just how bad this technology is, and just how easily it can be abused.

The problem with perceptual hashes

Oliver Kuederle, who works with the kind of image hashing technology that Apple’s new system will use to continuously scan the photos on your iOS device, explains that it is far, far from foolproof: Perceptual hashes are messy. The simple fact that image data is reduced to a small number of bits leads to collisions and therefore false positives. When such algorithms are used to detect criminal activities, especially at Apple scale, many innocent people can potentially face serious problems. My company’s customers are slightly inconvenienced by the failures of perceptual hashes (we have a UI in place that lets them make manual corrections). But when it comes to CSAM detection and its failure potential, that’s a whole different ball game. Needless to say, I’m quite worried about this. This is just one of the many, many problems with what Apple announced yesterday.
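
To illustrate why collisions are inherent to this class of algorithm, here is a minimal average-hash (“aHash”) sketch, one of the simplest perceptual hashes. It is not the algorithm Apple or Kuederle’s company uses; it only shows how an entire photo is boiled down to 64 bits, which is why visually unrelated images can end up with identical or near-identical hashes.

```python
# Minimal average-hash ("aHash") sketch: shrink the image, compare each pixel
# to the mean brightness, and pack the results into a 64-bit integer.
# Requires Pillow (pip install Pillow). Not the algorithm Apple uses.

from PIL import Image


def average_hash(path: str, hash_size: int = 8) -> int:
    """Reduce the image to hash_size x hash_size grayscale pixels and emit one
    bit per pixel: 1 if the pixel is brighter than the mean, else 0."""
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for pixel in pixels:
        bits = (bits << 1) | (1 if pixel > mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Number of bits in which two hashes differ."""
    return bin(a ^ b).count("1")


if __name__ == "__main__":
    # Hypothetical file names, for illustration only.
    h1 = average_hash("photo_a.jpg")
    h2 = average_hash("photo_b.jpg")
    print(f"{h1:016x} vs {h2:016x}, distance = {hamming_distance(h1, h2)}")
```

Because all fine detail is thrown away before hashing, images that merely share coarse brightness structure – flat gradients, heavy crops, screenshots – can land on the same 64-bit value, which is exactly the false-positive class Kuederle describes.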

Apple’s plan to “Think Different” about encryption opens a backdoor to your private life

Apple has announced impending changes to its operating systems that include new “protections for children” features in iCloud and iMessage. If you’ve spent any time following the Crypto Wars, you know what this means: Apple is planning to build a backdoor into its data storage system and its messaging system. Child exploitation is a serious problem, and Apple isn’t the first tech company to bend its privacy-protective stance in an attempt to combat it. But that choice will come at a high price for overall user privacy. Apple can explain at length how its technical implementation will preserve privacy and security in its proposed backdoor, but at the end of the day, even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor. Basically, Apple is going to scan your iCloud photo library and compare hashes of your photos against hashes of known child pornography images. It’s hard to argue against this because it makes it seem as if you’re arguing against catching the sort of people that have such material. However, the issue with tools like this is not the ends – all of us are on the same side here – but the means. It’s more than obvious that this scanning is a gross invasion of privacy, but at the same time, you could easily argue that this is a bit of privacy we’d be willing to give up in order to aid in catching the worst elements of our society. The real problems stem from the fact that tools like this are simply never going to be foolproof. Software is incredibly unreliable, and while a random application crashing won’t ruin your life, an algorithm wrongfully labeling you as a pedophile most definitely will. On top of unintended consequences, malicious intent could be a major problem here too – what if some asshole wants to ruin your life, and sends you compromising photos, or otherwise sneaks them onto your device? And with Apple’s long history of working very closely with the most horrid regimes in the world, imagine what governments can do with a tool like this. On the ends that Apple is trying to get to here, we are all on the same side. The means to get there, however, need to be carefully considered.
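
To make the mechanism the EFF describes concrete, here is a deliberately simplified sketch of a matching loop. The real system uses a perceptual hash (NeuralHash) and a private set intersection protocol rather than plain file hashes and a visible blocklist; the SHA-256 stand-in, the hard-coded hash set, and the directory name below are all assumptions made to keep the sketch short.

```python
# Deliberately simplified sketch of hash matching against a blocklist.
# Real deployments use perceptual hashes (so re-encoding or resizing an image
# doesn't change its hash) and cryptographic protocols that hide the database;
# plain SHA-256 and a visible set are used here only for brevity.

import hashlib
from pathlib import Path

# Hypothetical blocklist of hex-encoded hashes supplied by an external authority.
KNOWN_BAD_HASHES = {
    "0" * 64,  # placeholder entry, not a real hash
}


def sha256_of_file(path: Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()


def scan_library(photo_dir: str) -> list[Path]:
    """Return the photos whose hashes appear in the blocklist."""
    return [
        photo
        for photo in Path(photo_dir).glob("*.jpg")
        if sha256_of_file(photo) in KNOWN_BAD_HASHES
    ]


if __name__ == "__main__":
    for match in scan_library("Photos"):  # hypothetical directory name
        print(f"Flagged: {match}")
```

Notice that everything interesting happens out of the user’s sight: whoever controls the blocklist controls what gets flagged, which is precisely the abuse scenario the open letter and the EFF warn about.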

Amazon just got Fakespot booted off Apple’s iOS App Store

Fakespot, known for its web browser extensions that try to weed out fake product reviews, suddenly no longer has an iPhone or iPad app — because Amazon sent Apple a takedown request, both Amazon and Fakespot confirm, and Apple decided to remove the app. The giant retailer says it was concerned about how a new update to the Fakespot app was “wrapping” its website without permission, and how that could be theoretically exploited to steal Amazon customer data. But Fakespot founder Saoud Khalifah tells The Verge that Apple abruptly removed the app today without any explanation. Apple didn’t respond to multiple requests for comment. Two abusive monopolists walk into a bar.