Every so often people yearn for a lost (1980s or so) era of ‘single user computers’, whether these are simple personal computers or high end things like Lisp machines and Smalltalk workstations. It’s my view that the whole idea of a 1980s style “single user computer” is not what we actually want and has some significant flaws in practice.
↫ Chris Siebenmann
I think the premise of this entire article is flawed, and borders on being a strawman argument. I honestly don’t think there are many people out there who genuinely and seriously want to use an ’80s home computer for all their computing tasks, but this article seems to think that there are. Virtually every single person expressing interest in and a desire for classic computers does so out of nostalgia, as a learning experience, or as a hobby. They’re definitely not interested in using any of those ’80s machines to do their banking or to collaborate with their colleagues.
Additionally, the problems and issues people have with modern computing platforms are not that they are too complex, but that they are no longer designed with the user in mind. Windows, macOS, iOS; they’re all first and foremost designed to extract money from you through ads, upsells, nag screens, and similar anti-user features, and it’s those things that people are sick of. Coincidentally, they are all things we didn’t have to deal with back in the ’80s and ’90s. In other words, remove the user-hostility from modern operating systems, and people wouldn’t complain about them so much.
Which seems rather obvious, doesn’t it?
It’s why using a Linux desktop like Fedora is such a breath of fresh air. There are no upsells for cloud storage or streaming services, no restrictions on what I can and cannot install to protect some multitrillion-euro company’s revenue streams, no ads and nag screens infesting my operating system – it’s just an operating system waiting for me to tell it what to do, and then it does it. It’s wild how increasingly revolutionary that’s becoming.
Whenever I am forced to interact with Windows 11 or whatever the current version of macOS is, I feel such a profound and deep sadness for what they’ve become, and it seems only natural to me that this sadness is fueling a longing for back when these systems weren’t so user-hostile.
I think I read that article very differently…
What I got from it is that much of a modern computer runs on the premise that multiple users share the device. They are not designed for a single user (even your phone is a multi-user device with profiles and the like).
Also, that the single user no longer has control and access in the same way they once did, due to complexity. I remember back in the classic Mac OS days I’d spend hours playing with Extensions Manager or Conflict Catcher, either to resolve a conflict or to try to optimise performance.
If that didn’t achieve what I wanted, I’d use ResEdit and make changes to the application itself.
Don’t get me wrong, that involved a lot of understanding, but modern machines make that degree of control impossible. Even in Linux, extensions/daemons run on your system with the user having little to no awareness of the fact. When I booted up Mac OS 7/8/9, I could see each and every extension loaded. There were no “secret” apps or services.
This can go on for ages.
My personal daily driver is FreeBSD with Openbox, but I still need to boot Windows to be able to operate my photo scanners with full functionality. I also finally managed to overcome the learning curve and migrated from Photoshop to GIMP. For work, I do most of it from FreeBSD, but sometimes (like now) I boot Windows 11 to be able to replicate my colleagues’ experiences.
I am happy with my FreeBSD experience: start from zero, add as you go. I don’t use webcams or Bluetooth, so I’ve never ever loaded the related drivers/kernel modules. Openbox with a few plugins gives me all the functionality I need in terms of keyboard shortcuts, widgets and window management, with minimum resource usage. It is the most responsive computer I own. I know what it is running at all times. Battery life is way better than on Windows (be it 7, 10 or 11) on my ThinkPad W530.
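To give a rough idea of what “start from zero, add as you go” means in practice, here is a minimal sketch of the kind of configuration involved; the module and service names below are only examples and depend on your hardware and FreeBSD release:

# /etc/rc.conf: only what is listed here gets loaded or started at boot
kld_list="i915kms"        # for example, the graphics driver for this laptop
# webcamd_enable="YES"    # webcam support stays disabled unless explicitly enabled

# /boot/loader.conf
# ng_ubt_load="YES"       # the Bluetooth module is simply never loaded

Running kldstat afterwards shows exactly which modules are loaded, and nothing more.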
It is mad how things went downhill with this idea of reinventing the wheel all the time.
– GIMP is now at version 3. At least as tested under Windows, it’s a huge regression in terms of functionality. Clicking a blank space in the scroll bars used to scroll one page of content; now it moves the bar to the absolute position of the click relative to the bar’s travel. It drives me mad when I am trying to make sure I don’t overshoot while cleaning dust out of 25,000×20,000-pixel 8×10 film scans.
– Windows 11’s Task Manager is absolutely bonkers. What is the deal with the HUGE labels down the left side versus the sensible labeled tabs it used to have? Shrink the window to minimize wasted screen real estate and the labels disappear, leaving the most generic, pointless icons ever.
– Let’s not even get into how much effort it takes to clean Windows up to the absolute minimum of data leakage.
– The user experience of most OSes and window managers is just an inconsistent, inconsiderate and inefficient mess of useless animations, nonsensical semi-transparencies, icons that don’t say anything, and an ‘each developer knows best’ approach to UI/UX, leading to applications that behave differently and use inconsistent widgets.
– To hell with hamburger menus.
– My Canon PRO-1000 photo printer worked flawlessly over Wi-Fi for two years. I never lost a single print job. Since October, it just doesn’t work properly, and I don’t know if the problem is Windows or a driver. There are no logs, the error messages don’t say anything, and I need to run a 5m USB cable across the room. =(
– Why change everything ALL THE TIME?
Windows peaked with 2000. It was a clean, consistent, glorious experience. A little better window management, and it would have been the gold standard.
From classic Mac OS, I miss the level of spatial memory it had. Close a window? The next time you opened it, it would appear in the same place, be it in the OS or in applications. Move icons around, customize a view? It would remember.
I don’t know… I used to look forward to the new versions of things because of the new features and improvements, and now I fear every time an update is coming because I don’t know what will break or what feature will be removed.
Computers are tools. They should do the job we need, period. I still type on a 1987 Model M keyboard. How many keyboards must be produced per person in the world? Are we really creating any wealth by digging stuff that already exists out of the ground and using it for 3 years? Couldn’t we get our developers working on new, exciting things rather than redesigning the Start menu for the 5000th time?
/rant
Shiunbird,
Software companies have a major existential problem. Software and operating systems that have a fixed target in mind will eventually reach that target, or at least get close enough that further improvements become increasingly marginal. This translates to poor sales. Customers can’t be bothered to replace something that’s already working, to the point where Microsoft had to force-feed Windows upgrades. For better or worse, planned obsolescence is the business solution to this.
Consider a home appliance manufacturer. If they made their products too reliable, too repairable, etc., it would only eat into future sales. Long term, they have to have a strategy to keep customers coming back. Product breakages are positively correlated with future profits. This realization is really bad for our pocketbooks and our planet, but under capitalism planned obsolescence is an advantage.
It’s the same with software. Microsoft, Google, Apple, Adobe, etc. all know that they can never just finish a product, even though most customers would love for an OS to stop changing once it gets good. The problems created yesterday create incentives to upgrade today and tomorrow. For tech giants who already have billions of customers, perfection = financial ruin, because there are only so many new customers they can sell to. They generally won’t come out and say it, but it’s absolutely a factor in why we’re seeing such high levels of software enshittification. These companies know the crap they’re putting out doesn’t please customers, but pleasing customers is a secondary goal that takes a back seat to profits.
The financial motivations don’t apply to FOSS as much. I would argue there’s a bit of copycat syndrome instead. When Microsoft came out with the drastic UI changes for Metro, it didn’t matter that users hated it; several FOSS projects followed suit with flat interfaces and controls that are harder to use and see. Ironically enough, the relative lack of resources going into XFCE could actually be beneficial, because developers have to use their time wisely and don’t have as much time to chase fads. I wish more developers would take this to heart, although I confess I’m probably guilty of it too. When you’re the new guy on a project, there can be a strong temptation to redesign everything around your own preferences and style.
Nobody is going to convince me that WIMP is not the best GUI model in a desktop/laptop computer.
+1 for WIMP, but there are two huge threats to that:
1. The hamburger menu. This doesn’t belong on a desktop that has a standard menu bar. It especially should NOT be placed in an arbitrary location in the window. I would be OK with it on a phone or in apps without a menu bar, but then it would have to be a standardized alternative to the menu bar, consistent between apps and corresponding directly to what would be in the menu bar.
2. The Ribbon. I haven’t been able to find what I need in Microsoft Office since Office 2004. A text search in an app or a Google search for a feature is no substitute for just browsing through the options in the menu bar.
In both cases, I can’t memorize all those icons I haven’t used yet. Just give me the text descriptions in a menu. And even though we have more space on screen these days, why put those stupid floppy-disk save and copy/paste icons in a ribbon? Just put copy/paste, along with the keyboard shortcuts, in the menu bar.
mattsaved,
I agree with your examples.
I struggle with modern scroll bars and window resizing. I hate how small a target these have become. Screens have become so wide for the sake of Hollywood movies, but the average document and website doesn’t benefit from it; we end up with a lot of horizontal whitespace. So why do scrollbars need tiny hitboxes, or to disappear entirely? It is an accessibility disaster. And for what, making room for even more whitespace? Meanwhile my bank decommissioned their old desktop website in favor of a low-density one designed for mobile, which means even more unused pixels on the desktop. Twitter, Facebook, Google, YouTube, it’s all the same… we sacrificed scrollbars to make room for more whitespace. It’s quite the feat for modern designers to make accessibility worse while simultaneously decreasing information density as screens have gotten more pixels.
I TOTALLY agree with you on the scrollbars.
Indeed. I regard the release of OS X Lion and its introduction of vanishing scrollbars and similar UI “innovations” as the beginning of the end. Sort of like “patient zero” in a zombie movie.
“They’re definitely not interested in using any of those ’80s machines to do their banking or to collaborate with their colleagues.”
Aside from downloading transactions, ’80s machines worked very well for personal banking, but the bigger issue is the point about collaboration. On an ’80s machine, or even a ’90s machine, a computer user might sit down and *focus* on writing a paper or creating an image or doing some other act of creation. It’s harder to do that in a world of constant messaging and notifications intruding on the process. Many people have no idea what it is like to create something by themselves as the product of focused, uninterrupted, individual work.
That’s how enshittification works: once you’ve got users and partners locked in, you start maximizing how much profit you can extract from them.
This is why even Windows users like me, who have no intention of moving to desktop Linux, are watching what happens in SteamOS land: SteamOS is an insurance policy I’d rather not have to cash in (since it’s not 100% compatible with Windows), but it’s one I am glad exists in case Windows becomes insufferable.
The only way out of the current duopoly is for win32 to become a common de facto standard.
I liked the early 2000s, when Silicon Graphics offered dual-seat operation with the Duo option on the Octane2: https://www.infania.net/misc1/sgi_techpubs/techpubs/007-4506-001.pdf One workstation, two displays, two keyboards, two mice.
Stop equating something you personally don’t like with it being “user hostile.”
Modern systems are orders of magnitude more accessible to larger numbers of users than they ever were in the 80s and 90s.
The radical shift is that those users are not geeks like they needed to be back then, because those old computing systems were not particularly user friendly and required some level of “enthusiasm” to put up with all their warts.
Now computing is a commodity and used by the public at large as another tool to get ish done. Whereas back in the day, a lot of people dealt with computing as a tool to geek out about the tool itself.
It is what it is. Geeks are no longer the main audience of the tool/product when it comes to computing.
I get what he’s saying. The ad obsession is actively user hostile. Resetting user preferences to more monetizable ones during some system updates is actively user hostile. Making it difficult to ensure “we’re rebooting you to apply an update” doesn’t kick in at the wrong time is at least neglectful, if not actively user hostile.
If he were talking about what you seem to be focused on, the complaint would be that the iPhone-ification of computing as encouraged in things like About Face 3 (A big-name UI design book from 2007 which points to things like “Maybe users don’t need to think about files. It works for iPhoto”) results in a hollowing out of the center of the skill continuum by setting up skill cliffs and “you must be at least this motivated to ride” traps for transitioning from novice to power user to expert.
(i.e. Doing the lazy thing and making UIs friendly for novices by designing them as playpens that users have trouble climbing out of. The part of that trend which came into existence when we moved away from unbrickable, boot-into-BASIC 8-bit micros is the whole reason the Raspberry Pi was invented.)
The vast majority of computer users in the 80s through the mid 90s were office workers, receptionists, accountants, and similar mundane positions. They were not geeks, they were not enthusiastic about the machines they used, but they somehow managed to get by just fine.
I don’t disagree that computing devices these days, especially mobile devices, are easier than ever for anyone to understand. However, that is a product of the overall shift towards truly personal computing that started with the advent of the World Wide Web in the early-mid 90s. Before that, home computers were either the parents’ way to bring their work home, or a game machine for the kids (sometimes both in one machine). A smartphone without the Internet is no longer smart, it’s just a phone.
@Morgan True, I should have added that users back then were either geeks or had no choice but to use the devices.
The point is that users back then put up with stuff that could be viewed as user hostile in its own right, out of enthusiasm or because they had no choice but to put up with it.
Whereas right now, stuff is so much more commoditized and access to computing has been extremely normalized. There is a huge volume of work on how to make computing accessible to consumers. And in a sense, the numbers don’t lie.
Xanady Asem,
I would point out that people today still have no choice but to use the devices for work. That has not changed, and it does not imply that enshittification of technology was prevalent in the past. Do you have a specific example that shows it hasn’t gotten worse?
My mother can now go to the store, pick up a laptop/tablet/phone, and have it all up and running without my help. QED
Xanady Asem,
My mother *did* go to the computer store, used Prodigy and WordPerfect, and played Carmen Sandiego. Far from it being too complex for her, she’s the one who taught us how to use the computer. This wasn’t unusual for the 90s; I think most of the families we knew had a computer. It seems like you have a major gap in your knowledge of the period, and rather than asking people, you are filling it in with assumptions.
I always make the mistake of replying to you in good faith, only to be reminded of who you are and why you’re here.
Bye.
Xanady Asem,
Meh, that’s a cheap deflection. Your QED wasn’t very conclusive. The fact is regular people did buy and use computers, and they are the reason computers and online services exploded in popularity in the 90s. Why do you think the same type of people using Windows 11 computers today would not have been able to use Windows computers in the 90s? The overall way we interact with desktop computers is largely the same. I’m not saying nothing has changed, but the usability gap is not nearly as big or difficult as you’ve been claiming.
Edit:
If you want to refine your point to say that DOS was more difficult I’d be more inclined to agree.
I couldn’t care less what the computing hobbyist peanut gallery from the early 90s has to say about technologies in the mid 2020s.
Xanady Asem,
That’s fine, nobody’s asking you to care about 90s computer users. However, saying you don’t care doesn’t negate misrepresentations.
Xanady Asem,
Price was a pain point keeping technology out of the hands of the public, but as things became more affordable there was a ton of interest from ordinary users who weren’t particularly “geeks”. Families used computers for entertainment and education, and services like AOL and Prodigy grew their market share by targeting non-savvy users. By the mid-90s, computers were really exploding in the public space. Back then, geeks like you and me were not the main audience. IMHO we’ve been in the minority for as long as computers have been mainstream, and before that the main demographic was probably normal businesses doing spreadsheets and word processing rather than people who want to geek out about tools.
Why not simply… let people be? If there are people who hate Windows (I am one of them), well, they can choose to go the macOS way, or the Linux way, or the *BSD way, or they can get experimental. Their choice. Same for people who hate macOS. Same for Linux/UNIX haters.
For me, macOS is the answer. I hate Windows, and I hate Linux: the former is just annoying and makes almost any computer slow; the latter is not “a breath of fresh air”, but a pain in the neck, with people apparently fighting for some sort of supremacy and claiming a delusional moral high ground.
I really like this web page, except when we go from “OS news” to opinionated pieces that passive-aggressively try to push ideas onto others. Report on technology, on OSes, but leave your biases behind and be objective: let people decide for themselves.
Remiks_1981,
I agree that everyone should use whatever works best for them; no need to be judgemental. There are some die-hard elitists in our ranks, but Linux doesn’t have a monopoly on that. Not to make a big deal about it, but the supremacy agenda is often culturally associated with Apple products, which makes your point a bit ironic.
https://nypost.com/2024/10/07/lifestyle/are-iphone-users-petty-youll-be-surprised-how-many-wont-date-android-fans-survey/
You’ve got the comments section to balance things out 🙂