Editorial Archive

Corporatism and fascism are two sides of the same coin

Apple has removed WhatsApp and Threads from its app store in China, following an order from the country’s internet watchdog, which cited national security concerns.
↫ Juliana Liu at CNN

Over recent months, as Apple has had to change some of its business practices to comply with the European Union’s new Digital Markets Act, a still-ongoing process, Apple fans, spearheaded by John Gruber, have pushed for Apple to leave the European Union. They argue that the minor inconvenience of complying with some basic consumer and market protection laws amounts to a deeply unfair financial sacrifice, and that leaving the EU makes more sense. Gruber also goes to bat hard for poor Facebook, arguing that the company should leave the EU, too, over the DMA demanding that Facebook respect users’ privacy. Apple itself has been aggressively attacking the European Union in the media.

So anyway, today, Apple did what it has been doing for a very long time: bending over backwards for the totalitarian, genocidal regime in China. China tells Apple to remove applications, Apple complies. Each of the sixteen hundred other times Apple has complied with this horrible regime’s demands, Gruber has argued that all poor Apple can do is comply with local Chinese laws and demands, as leaving China over principles and morals would benefit nobody.

So we’re left with the rather peculiar situation where the response to some relatively minor consumer and market protection regulations is one of deep hostility, both from Apple and from its PR attack dogs, whereas the response to the demands of one of the most brutal, totalitarian, genocidal regimes in human history is one of “that’s life”. Such is the way of the Apple corporatist: a democratically drawn-up and widely popular law, enacted by an incredibly popular government, that causes some mild inconvenience for Apple is vilified with populist and nationalist anti-EU rhetoric, while the undemocratic, totalitarian decrees of a vicious genocidal dictator are met with effectively disinterested shrugs, since those decrees don’t really inconvenience Apple.

Corporatism and fascism are two sides of the same coin, from early 20th-century Europe, through the mid-20th-century United States, to the megacorporations of today. Despite yet another decree from China that goes far further than anything the DMA demands, we won’t be seeing any pushes from the Grubers of this world for Apple to leave China. We won’t be seeing copious amounts of malicious compliance from Apple. We won’t be treated to lengthy diatribes from Apple executives about how much they despise China and Chinese laws. All because China’s demands don’t harm Apple’s bottom line, but the DMA might. And for the corporatist, praying at the altar of money, the former is irrelevant, while the latter is sacrilege.

Setting up a YubiKey on Linux is a mess, and it really shouldn’t be

One of the things I’ve always wanted to experiment with on my computers is logging in and authenticating things like sudo requests with a hardware tool – a fingerprint reader, a smart card, or a USB hardware security device like a YubiKey. There’s really no solid reason for me to want this other than that it just feels cool and futuristic to me (yes, even in this, the year of our lord 2024). I have no state secrets, no secret Swiss bank accounts, no whistleblower material to protect, and my computers rarely leave the house – I just want it because it’s possible and cooler than typing in my password.

Due to the flexibility and feature set of the YubiKey, I think it’s the best choice to go for. A no-name USB fingerprint reader would probably be ugly, cumbersome to position, and of uncertain Linux support. A USB smart card reader would bring the same issues as the fingerprint reader, and combined with a smart card it seems like just a YubiKey with extra steps. I do have to admit the idea of sliding a smart card into a slot and having it authorise you sounds really, really satisfying. Anyway, YubiKeys come in all shapes and sizes, but I want one of the USB-A ones with a fingerprint reader built in, since I can plug it in at the bottom of my monitor, perfectly positioned to put my thumb on it to authenticate. This way, it’s easily accessible for logging into my desktop session, authorising sudo requests when I’m configuring things, logging into websites with Firefox, and so on.

But there’s a problem: setting up a YubiKey on Linux seems like a huge ordeal. Just look at the official instructions on the YubiKey website, or the instructions on the Fedora website, my distribution of choice. That’s absolutely insane, and nobody should be expected to understand any of this nonsense to use what is being marketed as a consumer product. It’s important to note that this is not a hardware, software, or driver issue – all the necessary support is there, and Linux can make full use of the functionality tools like the YubiKey offer. The problem is that you’re expected to set all of this up manually, package by package, configuration file by configuration file, PAM module by PAM module.

When I first looked into getting a YubiKey, I expected biometric and advanced authentication tools like these to be fully integrated into modern Linux distributions and desktop environments. I figured that once you plugged one of these tools into your PC, additional options would become available in GNOME’s or KDE’s user account settings, but apparently, this isn’t the case. This means that even if you manually set everything up using the official arcane incantations, your graphical user interface won’t be aware of any of it, and changing anything will mean going through those official arcane incantations again.

This is entirely unacceptable. The moment you plug in an advanced hardware security tool like a YubiKey, GNOME and KDE should recognise it, and the settings, tools, and setup ‘wizards’ relevant to it should become available. All the hardware and software support is there – and in 2024, biometric and advanced security devices like these should not be so complicated and unforgiving to set up. Smart cards and fingerprint readers have been supported by Linux for literally decades. Why isn’t this easier?

For now, I’m still in doubt about going through with buying a YubiKey. I definitely have the skills to get through this whole insane setup process, but I really shouldn’t have to.
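To make the complaint concrete, here is a rough sketch of what “package by package, PAM module by PAM module” actually means in practice – the sort of steps a graphical setup wizard would have to perform behind the scenes so a FIDO2/U2F key can authorise sudo via pam_u2f. It’s written in Python purely for illustration; the pam-u2f and pamu2fcfg packages, the ~/.config/Yubico/u2f_keys file, and the pam_u2f.so line come from Yubico’s and Fedora’s own documentation, but treat every package name, path, and flag here as an assumption to verify against your distribution before touching anything:

```python
#!/usr/bin/env python3
"""Illustrative sketch only: the manual steps a setup wizard would have to
automate so a FIDO2/U2F key (such as a YubiKey) can authorise sudo via
pam_u2f on a Fedora-like system. Verify everything against your distro's
documentation before running anything like this."""

import getpass
import subprocess
from pathlib import Path


def enrol_key(user: str) -> None:
    # 1. Install the PAM module and the enrolment tool (assumes Fedora's dnf
    #    and package names).
    subprocess.run(["sudo", "dnf", "install", "-y", "pam-u2f", "pamu2fcfg"], check=True)

    # 2. Enrol the key: pamu2fcfg prints a mapping line to stdout while you
    #    touch the key (its prompt goes to the terminal via stderr).
    mapping = subprocess.run(
        ["pamu2fcfg", "-u", user], check=True, stdout=subprocess.PIPE, text=True
    ).stdout

    # 3. Store the mapping where pam_u2f looks by default. Note this
    #    overwrites any keys enrolled earlier.
    keyfile = Path.home() / ".config" / "Yubico" / "u2f_keys"
    keyfile.parent.mkdir(parents=True, exist_ok=True)
    keyfile.write_text(mapping)

    # 4. The last step is editing /etc/pam.d/sudo by hand to add, near the top:
    #        auth sufficient pam_u2f.so cue
    #    which is exactly the kind of change a GUI should be making (and
    #    undoing) for you.
    print("Now add 'auth sufficient pam_u2f.so cue' to /etc/pam.d/sudo "
          "and test sudo in a second terminal before logging out.")


if __name__ == "__main__":
    enrol_key(getpass.getuser())
```

None of this is rocket science, which is rather the point: it’s a handful of well-defined steps that a desktop environment could wrap in a settings panel, instead of leaving each user to rediscover them from scattered documentation.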

Open source is about more than just code

As some of the dust around the xz backdoor is slowly starting to settle, we’ve been getting a pretty clear picture of what, exactly, happened, and it’s not pretty. This is a story of the sole maintainer of a crucial building block of the open source stack having mental health issues, which at least partly contributed to a lack of interest in maintaining xz. A coordinated campaign – consensus seems to point to a state actor – was then started to infiltrate xz, with the goal of inserting a backdoor into the project.

Evan Boehs has done the legwork of diving into the mailing lists and commit logs of various projects and the people involved, and it almost reads like the nerd version of a spy novel. It involves seemingly fake users and accounts aggressively pressuring the original xz maintainer to add a second maintainer; a second maintainer who mysteriously appears at around the same time, like a saviour. This second maintainer manages to gain the original maintainer’s trust, and within months, this mysterious newcomer more or less takes over as the new maintainer. As the new maintainer, this person starts adding the malicious code in question. Sockpuppet accounts show up to add code to oss-fuzz to try to make sure the backdoor won’t be detected. Once all the code is in place for the backdoor to function, more fake accounts show up to push for the compromised versions of xz to be included in Debian, Red Hat, Ubuntu, and possibly others. Roughly at this point, the backdoor is discovered entirely by chance, because Andres Freund noticed his SSH logins felt a fraction of a second slower and wanted to know why.

What seems to have happened here is a bad actor – again, most likely a state actor – finding and targeting a vulnerable maintainer and, through clever social engineering at both the personal level and the project level, gaining control over a crucial but unexciting building block of the open source stack. Once enough control and trust was gained, the bad actor added a backdoor to do… well, something. It seems nobody really knows yet what the ultimate goal was, but we can all make some educated guesses, and none of them are any good.

When we think of vulnerabilities in computer software, we tend to focus on bugs and mistakes that unintentionally create the conditions in which someone with malicious intent can do, well, malicious things. We don’t often consider the possibility of maintainers themselves being malicious, secretly adding backdoors for all kinds of nefarious purposes. The problem the xz backdoor highlights is that while we have quite a few ways to prevent, discover, mitigate, and fix unintentional security holes, we seem to have pretty much nothing in place to prevent intentional backdoors placed by trusted maintainers.

And this is a real problem. There are so many utterly crucial but deeply boring building blocks all over the open source stack – which pretty much the entire computing world makes use of – that it has become a meme, spearheaded by xkcd’s classic comic. The weakness in many of these projects is not the code, but the people maintaining that code, most likely through no fault of their own. There are so many things life can throw at you that would make you susceptible to social engineering – money problems, health problems, mental health issues, burnout, relationship problems, god knows what else – and the open source community has nothing in place to help maintainers of obscure but crucial pieces of infrastructure deal with problems like these.
That’s why I’m suggesting the idea of setting up a foundation – or whatever legal entity makes sense – dedicated to helping maintainers who face the kinds of problems the maintainer of xz did. A place where a maintainer who is dealing with problems outside of the code repository can go for help, advice, maybe even financial and health assistance if needed. Even if all this foundation offers someone is a person to talk to in confidence, it might mean the difference between burning out completely and recovering at least enough to then find other ways to improve one’s situation.

If someone is burnt out or in the middle of a mental health crisis, they could contact the foundation, tell their story, and say: hey, I need a few months to recover and deal with my problems, can we put out a call among already trusted members of the open source community to step in for me for a while? Keep the ship steady as she goes until I get back, or until we find someone to take over permanently? This way, the wider community will also know the regular, trusted maintainer is stepping down for a while, and that any new commits should be treated with extra care – solving the problem of some unknown maintainer of an obscure but important package suffering in obscurity, with the only hints buried in a low-volume mailing list well after something goes wrong.

The financial responsibility for such a safety net should undoubtedly be borne by the long list of ultra-rich megacorporations who profit off the backs of these people toiling away in obscurity. The financial burden would be pocket change to the likes of Google, Apple, IBM, Microsoft, and so on, but it could make a contribution to open source far greater than any code dump. Governments could probably be involved too, but that would most likely open up a whole can of worms, so I’m not sure it would be a good idea.

I’m not proposing this be some sort of glorified ATM where people can go to get some free money whenever they feel like it. The goal should be to help the people who form crucial cogs in the delicate machinery of computing live healthy, sustainable lives, so their code and contributions to the community don’t get compromised.

Desktop Linux has a Firefox problem

There’s no denying that the browser is the single most important application on any operating system, whether on desktops and laptops or on mobile devices. Without a capable, fast, and solid browser, the usefulness of an operating system drops dramatically, to the point where I’m quite sure virtually nobody is going to use an operating system for regular, everyday work if it doesn’t have one. Having an at least somewhat usable browser is what elevates an operating system from a hobby toy to something you could use for more than ten minutes as a fun novelty.

The problem is that making a capable browser is incredibly hard, as the browser has become a hugely capable platform all of its own. Undertaking the mammoth task of building a browser from scratch is not something a lot of people are interested in – save for the crazy ones – made worse by the fact that competing with the three remaining browser engines is basically futile due to market consolidation and monopolisation. Chrome and its various derivatives are vastly dominant, followed by Safari, if only because you can’t use anything else on iOS. And then there’s Firefox, trailing far behind as a distant third – and falling.

This is the environment desktop Linux distributions find themselves in. For the longest time now, desktop Linux has relied virtually exclusively on shipping Firefox – and the Mozilla suite before that – as its browser, with some users opting to download Chrome post-install. While both GNOME and KDE nominally invest in browsers of their own, GNOME Web and Falkon, their uptake is limited and their releases few and far between. For instance, none of the major Linux distributions ship GNOME Web as their default browser, and it lacks many of the features users have come to expect from a browser. Falkon, meanwhile, is updated only sporadically, often going years between releases. Worse yet, Falkon uses Chromium through QtWebEngine, and GNOME Web uses WebKit (both engines are updated separately from the browsers themselves, so browser releases are not always a solid metric!), so both are dependent on the goodwill of two of the most ruthless corporations in the world, Google and Apple respectively.

Even Firefox itself, though it’s clearly the browser of choice of distributions and Linux users alike, does not consider Linux a first-tier platform. Firefox is first and foremost a Windows browser, followed by macOS second, and Linux third. The love the Linux world has for Firefox is not reciprocated by Mozilla in the same way, and this shows in the various places where issues fixed and addressed on the Windows side are ignored on the Linux side for years or longer.

The best and most visible example of that is hardware video acceleration. This feature has been a default part of the Windows version since forever, but it wasn’t enabled by default on Linux until Firefox 115, released only in early July 2023. Even then, the feature is only enabled by default for users of Intel graphics – AMD and Nvidia users need not apply. This lack of video acceleration was – and for AMD and Nvidia users, still is – a major contributing factor to Linux battery life on laptops taking a serious hit compared to Windows on the same hardware. The road to even getting here has been a long, hard, and bumpy one.
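To give a sense of what that bumpy road looked like from the user’s side, here is a rough sketch – purely illustrative, not a recommendation – of the kind of about:config overrides Linux users traded on forums to coax VA-API decoding into working, written out as a small Python script that drops them into a user.js file. The preference names below (media.ffmpeg.vaapi.enabled, gfx.webrender.all) are real Firefox prefs, but whether they were needed, sufficient, or even honoured depended entirely on the release you happened to be running, and the profile path is an assumption you would have to adjust:

```python
#!/usr/bin/env python3
"""Illustrative sketch only: write a user.js that flips the about:config prefs
commonly suggested for enabling VA-API video decoding in Firefox on Linux.
The profile directory below is a placeholder; the pref names are real, but
which ones (if any) a given Firefox release actually honours has varied."""

from pathlib import Path

# Assumption: replace with your actual profile directory under ~/.mozilla/firefox.
profile = Path.home() / ".mozilla" / "firefox" / "xxxxxxxx.default-release"

prefs = {
    "media.ffmpeg.vaapi.enabled": True,  # ask Firefox to decode video through VA-API
    "gfx.webrender.all": True,           # force-enable WebRender compositing
}

# user.js uses JavaScript-style user_pref() calls; booleans are lowercase.
lines = [f'user_pref("{name}", {str(value).lower()});' for name, value in prefs.items()]
(profile / "user.js").write_text("\n".join(lines) + "\n")
print("Overrides written; restart Firefox and check about:support to see what took effect.")
```

Whether any of this actually produced working hardware decoding on a given setup was another matter entirely, which is exactly the problem described next.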
For years and years, getting video acceleration to work in Firefox on Linux was complicated and unreliable, with every release of the browser potentially changing which flags you needed to set, and sometimes it would just stop working for several releases in a row, no matter what you did. There’s a veritable encyclopaedia of forum messages, blog posts, and website articles with outdated instructions and Hail Mary-like suggestions for users trying to get it to work. Conventional wisdom changed with every release, and keeping track of it all was a nightmare.

It’s not just hardware-accelerated video decoding. Gesture support took much longer to arrive in the Linux version than it did in the Windows version – things like swiping to go back and forward, or pinching to zoom on images. Similarly, touchscreen support took longer to arrive in the Linux version of Firefox, too. Often, such features could be enabled with about:config incantations for years before becoming enabled by default, but that’s far from an ideal situation.

With desktop Linux trailing both Windows and macOS in popularity, there’s nothing unexpected or inherently malicious about this, and the point of the previous few paragraphs is not to complain about the state of Firefox for Linux, nor to suggest Mozilla transfer precious resources from the Windows and macOS versions to the Linux version. While I obviously wouldn’t complain if they did so, it wouldn’t make much sense. The real reason I’m highlighting these issues is this: if Firefox for Linux is already treated as a third wheel today, with Mozilla’s current financial means and resources, what would happen if Mozilla saw a drastic reduction in those means and resources?

Firefox is not doing well. Its market share has dropped radically over the years, and now sits at a meagre 3% on desktops and laptops, and a negligible 0.5% on mobile. Chrome and, to a lesser extent, Safari have trampled all over the venerable browser, to the point where it’s effectively an also-ran, kept alive largely by Linux and BSD users and a few more nerds on other platforms. I’m not saying this to disparage those who use Firefox – I’m one of them – but to underline just how dire Firefox’s current market position really is. This shrinking market share must already be harming the development and future prospects of Firefox, and it will only get worse if the slide continues.

The declining market share is far from the biggest problem, however. The giant sword of Damocles dangling above Firefox’s head is Mozilla’s oddly lopsided revenue stream: as most of us are probably aware, Mozilla makes most of its money from a single search deal with Google.

What’s Happening with User Interfaces?

Like many of you, I've been watching the big changes in user interfaces over the past few years, trying to make sense of them all. Is there a common explanation for the controversies surrounding the Windows 8 UI and Unity? Where do GNOME 3, KDE, Cinnamon, and MATE fit in? This article offers one view.

U.S. Voting Technology: Problems Continue

In the United States, state and local authorities are in charge of voting and the country uses more than a half dozen different voting technologies. As a result, the country can't guarantee that it accurately counts national votes in a timely fashion. This article discusses the problem and potential solutions to the U.S. voting dilemma.

Tech Company Futures, pt. 2

My previous article analyzed some tech companies and their prospects: Microsoft, Intel, HP, Dell, Oracle, Apple, and Google. This article discusses IBM, Amazon, Yahoo!, Cisco, and BMC Software. The goal is to spark a useful discussion. What is your opinion of these companies? Do they have viable strategies for the future?

How Much Should an OS Vendor Own?

I was reading today about how Linux Mint developers altered the Banshee music player source code to redirect affiliate revenue from Amazon music orders to themselves instead of to the Banshee project. They've reportedly made less than $4, which has caused a kerfuffle among those paying attention to that corner of the world. But it raises a larger point that has been swirling around for a couple of decades: an OS vendor has a lot of power to influence, and even monetize, its user base. Where should it draw the line?

The Personal Computer Is Dead

The PC is dead. Rising numbers of mobile, lightweight, cloud-centric devices don’t merely represent a change in form factor. Rather, we’re seeing an unprecedented shift of power from end users and software developers on the one hand, to operating system vendors on the other – and even those who keep their PCs are being swept along. This is a little for the better, and much for the worse.

Tech Stocks!

My previous article described how you can use your tech knowledge to profit from the stock market – if you combine it with financial analysis and careful research. This article analyzes several tech stocks. The goal is to start a useful discussion. What is your opinion of these companies? Even if you don't invest, this matters if you are employed in IT: you're betting your career on the companies in whose products you specialize, and you don't want to pick losers.

How Adobe Flash Lost Its Way

Despite early successes on the Web, the latter years of Flash have been a tale of missed opportunities, writes Fatal Exception's Neil McAllister. 'The bigger picture is that major platform vendors are increasingly encouraging developers to create rich applications not to be delivered via the browser, but as native, platform-based apps. That's long been the case on iOS and other smartphone platforms, and now it's starting to be the norm on Windows. Each step of the way, Adobe is getting left behind,' McAllister writes. 'Perhaps Adobe's biggest problem, however, is that it's something of a relic as developer-oriented vendors go. How many people have access to the Flash runtime is almost a moot point, because Adobe doesn't make any money from the runtime directly; it gives it away for free. Adobe makes its money from selling developer tools. Given the rich supply of free, open source developer tools available today, vendors like that are few and far between. Remember Borland? Or Watcom?'