Thom Holwerda Archive

Build your own SPARC workstation with QEMU and Solaris

A great intro to a classic platform by way of emulation and optionally even adapting a real physical keyboard: Back in the late 80s and through the 90s, Unix workstations were super powerful, super cool, and super expensive. If you were making 3D graphics or developing applications, you wanted a high-performance workstation and Sun made some of the best ones. But unless you worked for a huge company, university, or government, they were probably too expensive. More than twenty years later, we have much more powerful and affordable computers, so let’s emulate the old systems and see what it was like to run some of the coolest computers you could buy in the 90s. This is another entry in the series from the same author as the recently linked virtual NeXT machine; the series also includes a virtual BeBox for experiencing BeOS.
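The article walks through the emulation setup in detail; as a minimal sketch of the QEMU side, assuming a hypothetical disk image name and a Solaris install ISO you have sourced yourself, the invocation might look something like this (exact flags vary by QEMU version):

```shell
# Create a blank disk image to install Solaris onto (filename is hypothetical)
qemu-img create -f qcow2 sparc-disk.qcow2 4G

# Boot an emulated SPARCstation 5 from the install CD:
# -M SS-5 selects the SPARCstation 5 machine model, -boot d boots from CD-ROM
qemu-system-sparc -M SS-5 -m 128 \
  -drive file=sparc-disk.qcow2,format=qcow2 \
  -cdrom solaris-install.iso \
  -boot d
```

Machine model, memory size, and graphics options all depend on which Solaris release you install; the linked article covers those details.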

Apple’s child protection features spark concern within its own ranks

Apple employees have flooded an Apple internal Slack channel with more than 800 messages on the plan announced a week ago, workers who asked not to be identified told Reuters. Many expressed worries that the feature could be exploited by repressive governments looking to find other material for censorship or arrests, according to workers who saw the days-long thread. Past security changes at Apple have also prompted concern among employees, but the volume and duration of the new debate is surprising, the workers said. Some posters worried that Apple is damaging its leading reputation for protecting privacy. It’s a complete 180 from Apple’s behaviour and statements (in western markets) – of course employees are going to be worried. I’ve been warning for years that Apple’s position on privacy was nothing more than a marketing ploy, and now Apple employees, too, get a taste of their own medicine that they’ve been selling in China and various other totalitarian regimes.

App store competition targeted by bipartisan Senate bill

Their legislation would bar the companies from certain conduct that would tend to force developers to use their app stores or payment systems. It also would obligate the companies to protect app developers’ rights to tell consumers about lower prices and offer competitive pricing. It would effectively allow apps to be loaded onto Apple users’ devices outside of the company’s official app store. There’s so much movement on this front, I highly doubt Apple and Google will be able to stop it. This is one of the very, very rare cases where both sides of the political spectrum seem to somewhat agree, and I hope they can make it stick. It’s definitely not enough, but it’s a step in the right direction. I’m an extremist – all source code should be freely available (not necessarily open source – just viewable), to give consumers and society as a whole the ability to ensure they’re not being spied on, lied to, or endangered by foreign entities or corporate trickery. If copyright is good enough for writers, artists, and musicians, it’s damn well good enough for programmers. With how vital computers and software have become – woven into the fabric of our society – we as people should be able to see and check what those threads are doing and where they’re going to and coming from. Corporations have shown time and time again that they are not trustworthy entities and that they do not have society’s best interests at heart, and we need tools to bring the balance of power back – black boxes of code are dangerous.

Code written for Windows 3.1 still works well today

So imagine my surprise when I dug around in a quarter-century-old archive to find a .zip file containing something that purported to be the original executable of Labyrinth. Surely such an ancient piece of code – written for Windows 3.1 – wouldn’t launch? Well, after a bit of fiddling with the Windows compatibility settings, I was shocked – and extremely pleased – to see that, yes, it most certainly did. It shouldn’t be surprising that a piece of good Windows code from 30 years ago still runs on Windows 10 today, and yet, it always is.

Essence: a new desktop operating system

An operating system I’ve been writing since ~June 2017. Although it’s a long shot (and very optimistic), I ultimately intend it to replace Linux and Windows as a desktop operating system. Very optimistic, but there’s quite a few things here already. The code is on GitLab, where you can find more information, too.

MorphOS 3.15 gets ram-handler bugfix

MorphOS 3.15’s ram-handler contains a bug that, with bad luck, may result in the RAM: root directory appearing to contain many duplicate entries. Multiple users had reported this over the years, but until recently the root cause of this issue had eluded us. Due to recent developments, the bug has finally been located and fixed (thanks to AngryTom for help!). The fixed ram-handler will be released as part of the future MorphOS 3.16 release. Meanwhile you can install the following patch that fixes the problem for MorphOS 3.15. I know this isn’t a major new release or anything, but it’s rather rare and interesting to see a small, standalone update like this being released for a small, alternative operating system. Usually, these get rolled into major new releases or nightlies, so I found this interesting.

Build your own NeXT with a virtual machine

In 1985 Steve Jobs resigned from Apple and founded NeXT Inc. in order to build the NeXT Computer. It was ahead of its time and had amazing features thanks to the NeXTSTEP operating system, most famously used at CERN by Sir Tim Berners-Lee to create the World Wide Web. NeXTSTEP later became OPENSTEP and when Apple acquired NeXT in 1997, they used it as the basis for Mac OS X and iOS. If you’ve done any Mac or iOS programming, you’ve seen the echoes of NeXTSTEP in the type names – NSObject, NSString, NSDictionary, and many others all come directly from NeXT (NS = NeXTSTEP). These computers cost about as much as a new car when they first came out, so they were out of reach for most people. What was it like to use a top of the line system in the early 90s? Let’s build our own and find out! Exactly as it says on the tin. A fun few hours.

Why does the Steam Deck run Linux? Blame Windows

Valve’s “Steam Deck” handheld PC has caused quite a stir among PC gaming geeks, but the biggest shakeup might not be its Nintendo Switch-like form factor. The software running inside of it is the real surprise. Why does the Steam Deck run Linux? Blame Windows. The Steam Deck and the software inside of it are the culmination of a nearly decade-long “hedging strategy” embarked upon by Valve chief Gabe Newell and company many moons ago, when Microsoft tried exerting more control over developers with Windows 8. But it’s also the next phase of Valve’s escape plan. Also, Windows is simply a terrible choice for the Steam Deck. The base model only has 64GB of storage, and Windows 10 will easily take up two-thirds of that.

One bad Apple

Dr. Neal Krawetz, one of the leading experts in the area of computer forensics research, digital photo analysis, and related topics, has penned a blog post in which he takes apart Apple’s recent announcement and the technology behind it. He actually has a lot of experience with the very problem Apple is trying to deal with, since he is the creator of FotoForensics, and files CSAM reports to the National Center for Missing and Exploited Children (NCMEC) every day. In fact, he files more reports than Apple, and knows all the ins and outs of all the technologies involved – including reverse-engineering Microsoft’s PhotoDNA, the perceptual hash algorithm NCMEC and Apple are using. The reason he had to reverse-engineer PhotoDNA is that NCMEC refused to countersign the NDAs they wanted Krawetz to sign, eventually not responding to his requests altogether. Krawetz is one of the more prolific reporters of CSAM material (number 40 out of 168 in total in 2020). According to him, PhotoDNA is not as sophisticated as Apple’s and Microsoft’s documentation and claims make it out to be. Perhaps there is a reason that they don’t want really technical people looking at PhotoDNA. Microsoft says that the “PhotoDNA hash is not reversible”. That’s not true. PhotoDNA hashes can be projected into a 26×26 grayscale image that is only a little blurry. 26×26 is larger than most desktop icons; it’s enough detail to recognize people and objects. Reversing a PhotoDNA hash is no more complicated than solving a 26×26 Sudoku puzzle; a task well-suited for computers. The other major component of Apple’s system, an AI perceptual hash called a NeuralHash, is problematic too. The experts Apple cites have zero background in privacy or law, and while Apple’s whitepaper is “overly technical”, it “doesn’t give enough information for someone to confirm the implementation”. Furthermore, Krawetz “calls bullshit” on Apple’s claim that there is a 1 in 1 trillion error rate.
After a detailed analysis of the numbers involved, he concludes: What is the real error rate? We don’t know. Apple doesn’t seem to know. And since they don’t know, they appear to have just thrown out a really big number. As far as I can tell, Apple’s claim of “1 in 1 trillion” is a baseless estimate. In this regard, Apple has provided misleading support for their algorithm and misleading accuracy rates. Krawetz also takes aim at the step where Apple manually reviews possible CP material by sending it from the device in question to Apple itself. After discussing this with his attorney, he concludes: The laws related to CSAM are very explicit. 18 U.S. Code § 2252 states that knowingly transferring CSAM material is a felony. (The only exception, in 2258A, is when it is reported to NCMEC.) In this case, Apple has a very strong reason to believe they are transferring CSAM material, and they are sending it to Apple — not NCMEC. It does not matter that Apple will then check it and forward it to NCMEC. 18 U.S.C. § 2258A is specific: the data can only be sent to NCMEC. (With 2258A, it is illegal for a service provider to turn over CP photos to the police or the FBI; you can only send it to NCMEC. Then NCMEC will contact the police or FBI.) What Apple has detailed is the intentional distribution (to Apple), collection (at Apple), and access (viewing at Apple) of material that they strongly have reason to believe is CSAM. As it was explained to me by my attorney, that is a felony. This whole thing looks, feels, and smells like a terribly designed system that is not only prone to errors, but also easily exploitable by people and governments with bad intentions. It also seems to be highly illegal, making one wonder why Apple would put this out in the first place. Krawetz hints at why Apple is building this system earlier in this article: Apple’s devices rename pictures in a way that is very distinct. (Filename ballistics spots it really well.)
Based on the number of reports that I’ve submitted to NCMEC, where the image appears to have touched Apple’s devices or services, I think that Apple has a very large CP/CSAM problem. I think this might be the real reason Apple is building this system.

Google working to bring the full Chrome browser to Fuchsia OS

Every good operating system needs a web browser, especially as more and more apps move to the web. To that end, Google is preparing to bring the full Google Chrome browser experience to Fuchsia OS. This was inevitable, of course. As the article notes, Fuchsia already has the Chrome engine to display web content if needed, and now they are bringing the whole actual browser over as well. Just another step in the long journey to replace the underpinnings of Android and Chrome OS.

Apple: critics of continuous iPhone photo scanning are “screeching voices of the minority”

Apple also addressed the hypothetical possibility of a particular region in the world deciding to corrupt a safety organization in an attempt to abuse the system, noting that the system’s first layer of protection is an undisclosed threshold before a user is flagged for having inappropriate imagery. Even if the threshold is exceeded, Apple said its manual review process would serve as an additional barrier and confirm the absence of known CSAM imagery. Apple said it would ultimately not report the flagged user to NCMEC or law enforcement agencies and that the system would still be working exactly as designed. After yesterday’s news and today’s responses from experts, here’s a recap: Apple is going to scan all photos on every iPhone to see if any of them match against a dataset of photos – that Apple itself hasn’t verified – given to them by the authorities of countries in which this is rolled out, with final checks being done by (third party) reviewers who are most likely traumatized, overworked, underpaid, and easily infiltrated. What could possibly go wrong? Today, Apple sent out an internal memo to Apple employees about this new scanning system. In it, they added a statement by Marita Rodriguez, executive director of strategic partnerships at the National Center for Missing and Exploited Children, and one of the choice quotes: I know it’s been a long day and that many of you probably haven’t slept in 24 hours. We know that the days to come will be filled with the screeching voices of the minority. Apple signed off on that quote. They think those of us worried about invasive technologies like this and the power backdoors like this would give to totalitarian regimes all over the world are the “screeching voices of the minority”. No wonder this company enjoys working with the most brutal regimes in the world.

An open letter against Apple’s privacy-invasive content scanning technology

A large number of security and privacy experts, legal experts, and more, in an open letter to Apple: On August 5th, 2021, Apple Inc. announced new technological measures meant to apply across virtually all of its devices under the umbrella of “Expanded Protections for Children”. While child exploitation is a serious problem, and while efforts to combat it are almost unquestionably well-intentioned, Apple’s proposal introduces a backdoor that threatens to undermine fundamental privacy protections for all users of Apple products. The open letter contains tons of arguments, scenarios, and examples from experts about just how bad this technology is, and just how easily it can be abused.

The problem with perceptual hashes

Oliver Kuederle, who works with the image hashing technology used by Apple’s new technology that’s going to scan the photos on your iOS device continuously, explains that it is far, far from foolproof: Perceptual hashes are messy. The simple fact that image data is reduced to a small number of bits leads to collisions and therefore false positives. When such algorithms are used to detect criminal activities, especially at Apple scale, many innocent people can potentially face serious problems. My company’s customers are slightly inconvenienced by the failures of perceptual hashes (we have a UI in place that lets them make manual corrections). But when it comes to CSAM detection and its failure potential, that’s a whole different ball game. Needless to say, I’m quite worried about this. This is just one of the many, many problems with what Apple announced yesterday.
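Neither PhotoDNA nor Apple’s NeuralHash is public, but the failure mode Kuederle describes is easy to demonstrate with the simplest perceptual hash of all, an average hash (aHash): downscale an image, then keep one bit per pixel depending on whether it is brighter than the mean. A toy sketch (the 8×8 “images” below are hand-built grids, not real photos):

```python
def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, set if brighter than the mean."""
    flat = [v for row in pixels for v in row]
    mean = sum(flat) / len(flat)
    return tuple(1 if v > mean else 0 for v in flat)

def hamming_distance(h1, h2):
    """Number of differing bits; small distances are treated as a 'match'."""
    return sum(a != b for a, b in zip(h1, h2))

# Two clearly different 8x8 grayscale "images": both have a bright left half
# and a dark right half, but with very different pixel values.
image_a = [[200] * 4 + [50] * 4 for _ in range(8)]
image_b = [[255] * 4 + [0] * 4 for _ in range(8)]

# They collide: identical hashes despite different content.
print(hamming_distance(average_hash(image_a), average_hash(image_b)))  # 0
```

Real perceptual hashes are far more elaborate, but they share this property: collapsing an image into a small bit vector necessarily maps distinct images onto the same hash, which is exactly where false positives come from.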

Airyx aims to bring some macOS to BSD

Airyx is a new open-source desktop operating system that aims to provide a similar experience and compatibility with macOS on x86-64 systems. It builds on the solid foundations of FreeBSD, existing open source packages in the same space, and new code to fill the gaps. Airyx aims to feel sleek, stable, familiar and intuitive, handle your daily tasks, and provide as much compatibility as possible with the commercial OS that inspired it. An ambitious but interesting effort, that seems to align quite well with helloSystem.

Personal computing on an Amiga in 2021

Solène created a week-long personal computing challenge around old computers. I chose to use an Amiga for the week. In this issue I write about my experience, and what modern computing lost when Commodore died. I also want to show some of the things you can do with an Amiga or even an emulator if you’d like to try. I’ve tried to get into the Amiga-like operating systems – MorphOS, AROS, Amiga OS 4 – but the platform just doesn’t suit me. I find them convoluted, incomprehensible, and frustratingly difficult to use. Not that it matters – I’m not here to ruin the Amiga community’s party – but if they want to sustain that community instead of having it die out as their user numbers dwindle due to old age, they might want to consider making their operating systems a little less… Obtuse.

Apple’s plan to “Think Different” about encryption opens a backdoor to your private life

Apple has announced impending changes to its operating systems that include new “protections for children” features in iCloud and iMessage. If you’ve spent any time following the Crypto Wars, you know what this means: Apple is planning to build a backdoor into its data storage system and its messaging system. Child exploitation is a serious problem, and Apple isn’t the first tech company to bend its privacy-protective stance in an attempt to combat it. But that choice will come at a high price for overall user privacy. Apple can explain at length how its technical implementation will preserve privacy and security in its proposed backdoor, but at the end of the day, even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor. Basically, Apple is going to scan your iCloud photo library, and compare cryptographic hashes of your photos to known photos containing child pornography. It’s hard to argue against this because it makes it seem as if you’re arguing against catching the sort of people who have such material. However, the issue with tools like this is not the ends – all of us are on the same side here – but the means. It’s more than obvious that this scanning is a gross invasion of privacy, but at the same time, you could easily argue that this is a bit of privacy we’d be willing to give up in order to aid in catching the worst elements of our society. The real problems stem from the fact that tools like this are simply never going to be foolproof. Software is incredibly unreliable, and while a random application crashing won’t ruin your life, an algorithm wrongfully labeling you as a pedophile most definitely will. On top of unintended consequences, malicious intent could be a major problem here too – what if some asshole wants to ruin your life, and sends you compromised photos, or otherwise sneaks them onto your device?
And with Apple’s long history of working very closely with the most horrid regimes in the world, imagine what governments can do with a tool like this? On the ends that Apple is trying to get to here, we are all on the same side. The means to get there, however, need to be carefully considered.
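The piece describes comparing “cryptographic hashes of your photos” to a database of known images. With a true cryptographic hash that comparison is exact, which is worth seeing concretely: change a single byte and the digest no longer matches at all (the “database” below is obviously a made-up stand-in):

```python
import hashlib

def digest(data: bytes) -> str:
    """SHA-256 digest of raw bytes, as a hex string."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical database of digests of known images
known = {digest(b"known-image-bytes")}

exact_copy = b"known-image-bytes"
recompressed = b"known-image-bytez"  # differs by a single byte

print(digest(exact_copy) in known)    # True: byte-for-byte copies match
print(digest(recompressed) in known)  # False: any change breaks the match
```

That brittleness is why, in practice, scanning systems lean on perceptual hashes that survive resizing and recompression, and those carry the false-positive problems discussed in the perceptual-hash item above.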

ReactOS improves amd64 support

The latest ReactOS newsletter has been published. Timo Kreuzer (tkreuzer) worked hard on various parts of the kernel and HAL, fixing issues here and there. Structured Exception Handling (SEH) support for the amd64 architecture was finished, and various bugs around the kernel were fixed. A major issue with interrupt handling in HAL was also fixed in May, which finally allowed a semi-stable boot in a virtual environment. There’s also work being done on support for multiple monitors, improved support for SMP, and more.

Google drops support for Google services on Android 2.3.7 and lower

As part of our ongoing efforts to keep our users safe, Google will no longer allow sign-in on Android devices that run Android 2.3.7 or lower starting September 27, 2021. If you sign into your device after September 27, you may get username or password errors when you try to use Google products and services like Gmail, YouTube, and Maps. Android 2.3.7 was released on 21 September, 2011. That’s ten years of support. I think that’s fair.