When macOS Ventura was announced earlier this month, its system requirements were considerably stricter than those for macOS Monterey, which was released just eight months ago as of this writing. Ventura requires a Mac made in 2017 or later, dropping support for a wide range of Monterey-supported Mac models released between 2013 and 2016.
This certainly seems more aggressive than new macOS releases from just a few years ago, where system requirements would tighten roughly every other year or so. But how bad is it, really? Is a Mac purchased in 2016 getting fewer updates than one bought in 2012 or 2008 or 1999? And if so, is there an explanation beyond Apple’s desire for more users to move to shiny new Apple Silicon Macs?
Unlike in the Windows world (at least, up until Windows 11) and the Linux/BSD world, Macs are more like smartphones or tablets in that support for them is regularly cut off well before the point they could no longer run the latest version of macOS. This has both advantages and disadvantages we don’t need to regurgitate here, but it’ll be interesting to see if the Apple Silicon era will accelerate the culling of older Macs.
Most of the money used to keep macOS afloat comes from hardware sales. It’s astonishing that they keep such long support lines at all, covering hardware made as far back as 2013, and don’t make decisions like this one, an early cutoff, more often.
In the end, this is just a business decision to kill Intel on their platforms faster. Keeping an entire port of the OS alive for an architecture on which you don’t intend to sell a single new product is expensive.
It’s not like the hardware will become a paperweight, though; whoever owns it can install another OS.
I went back some time ago and looked at the PowerPC to Intel transition: from the time the first Intel macOS was released to the time the last PowerPC macOS was retired was around five years (2004-2009). As much as the M1/M2 Macs are a work in progress for some things, I wouldn’t buy an Intel Mac today and expect support for anything in four years. At least with the Intel Macs, you have a fine selection of other OSes to choose from until the hardware is EoL.
I have a couple Intel Mac minis (mid-2010 Core2 Duo) that I still use. Until recently one was my workstation (read: web browser and scripting dev) and one is a play “server” for trying stuff around the house. You can install Fedora on them with no sweat. They’ll probably end up as a two-node Kube cluster before long.
I have both Manjaro and EndeavourOS running on Mac hardware as well, old and new, and they work great. The oldest machine I have is a 2008 iMac and it is shocking how well it works actually considering that it was completely unusable with macOS ( opening Microsoft Outlook took minutes ). I like the screen and so I used it as a daily driver for a while ( under Manjaro ) and even things like Zoom and Microsoft Teams worked great. Certainly presentations, spreadsheets, documents, and emails were no problem. It is the machine I learned Docker on actually. I even finished Diablo on it ( DevilutionX ).
I was about to buy my first Mac (a PPC G4 PowerBook) right before they announced they were moving to Intel. It’s a great thing that I held off a bit, as I got laid off from my job right before I would have dumped the cash into it.
This time around I waited until they released the M1 and bought it. Though I still hate Finder; I wish they’d make it actually behave like any other file manager I use 😛
@Leech
Try Path Finder from cocoatech.com
I’ve used it since 2007. Can’t use a Mac without it.
Its dual-pane mode is amazing too.
I hope Intel Macs won’t be declared dead anytime soon. How else would I be able to run macOS in my VM?
The timer will start ticking when the Mac Pro (note: still sold, still using an Intel CPU at the time I am writing this) gets replaced or retired.
After the timer starts, you will most likely get 2 versions, just like PPC users did.
The last PowerBook came with Tiger and got one single major update, to Leopard.
Hackintosh is on life support and will soon be dead, at least until ARM desktops take off.
BTW I wouldn’t bet even for Apple Silicon Macs to be supported with upgrades for too long either. The industry’s solution to the fact Moore’s Law is slowing will be planned obsolescence. Even if the fabs keep to the letter of Moore’s “law”, power consumption per transistor is not going down at Moore’s Law cadence lately. And power per transistor is what matters most in literally all form factors today, not cost per transistor or transistor density. So, planned obsolescence to the rescue (of corporate revenue).
This is why the EU is trying to mandate 7 years of security updates for phones (feature updates are harder to legislate due to system requirements), so at least we get security updates and we won’t have to bin perfectly working hardware for security reasons:
https://www.gsmarena.com/the_eu_might_require_oems_to_provide_7_years_of_updates_and_spare_parts_for_its_phones-news-50824.php
Hope it goes beyond phones.
The funny thing is that “Moore’s Law” is cited incorrectly almost all the time. It says nothing about power consumption or processor speeds; rather, it talks about the number of transistors in microchips.
And over the last 50+ years, it has been holding very steady:
https://en.wikipedia.org/wiki/Moore%27s_law
https://upload.wikimedia.org/wikipedia/commons/thumb/0/00/Moore%27s_Law_Transistor_Count_1970-2020.png/1920px-Moore%27s_Law_Transistor_Count_1970-2020.png
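As a rough sanity check of that chart, take two approximate endpoints, the Intel 4004’s ~2,300 transistors in 1971 and a ~50-billion-transistor chip circa 2020 (both ballpark readings from the chart, not exact figures), and the implied doubling period works out to about two years:

```python
import math

# Approximate endpoints read off the transistor-count chart (illustrative figures).
t0_year, t0_count = 1971, 2_300            # Intel 4004
t1_year, t1_count = 2020, 50_000_000_000   # ~50-billion-transistor chip circa 2020

# Number of times the count doubled, and the average years per doubling.
doublings = math.log2(t1_count / t0_count)
years_per_doubling = (t1_year - t0_year) / doublings

print(f"{doublings:.1f} doublings, one every {years_per_doubling:.2f} years")
```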
As I’ve said, that’s the letter of the “law”. In practice, the process gains nowadays come from lower power per transistor figures, because cooling systems and tricks like frequency boost have been pushed to their limits in all form factors.
sukru,
That is true. Still, for a period of time the effects of Moore’s law really did apply to performance every generation, even though that technically isn’t Moore’s law. I do agree with kurkosdr’s point that there were major incentives to upgrade naturally. The diminishing gains we’ve seen with respect to performance compared to years ago mean that companies may have to rely more on other ways of creating new demand. To an extent the addition of more cores has helped, but I think we’re already well into the diminishing-returns curve for typical consumer applications, where most cores remain unused. This is even the case for gaming, which is usually the most demanding load for consumers. Same goes for memory: we can get oodles of it, but unless you have a particularly low-end system it’s rarely the bottleneck that it used to be. Some of us do benefit from these things, but less so the average consumer.
Alfman, kurkosdr,
I would say this is mostly about money.
Previously, as chip making got better, the price also went down.
So, I can say things like: “My Raspberry Pi 4 is not only 100 times cheaper than my old computers, it has more RAM than their HDDs!”
Recently, however, the advancements have not come with monetary savings, at least on the mid to high end.
Yes, the 3090 Ti is much more expensive, power hungry, and more difficult to cool. And yes, Threadripper is essentially over for the enthusiast market.
However, an “i5” Intel 12500K will run circles around an “i9” 9900K from three generations prior. Low to mid computing still benefits from the cost savings, even though the high end is eaten by the server and professional segments.
Alfman, circling back to your point: I agree, even a very low-end CPU got good enough that most desktop or laptop users no longer need to pay the premium for a Ryzen 9 or an i9.
To me, it makes good business sense that Apple would do this. They have a unique advantage currently with their new hardware, that advantage is only realized when everybody ports their software to the new platform, and the incentive to migrate will follow the user base. The faster Apple gets everybody off Intel the better. My read on the culture of the macOS installed base is that this will be seen as a welcome move and that anybody wishing to stay behind will have few defenders.
That said, I am one of those being left behind. I have several Apple machines, both MacBook Pros and iMacs. Only the one that my wife uses is still running macOS at this point, as I have migrated the rest to Linux. My motivation for doing so was that the older machines were not getting updates. It is not just the OS: as soon as you cannot run the latest OS, some of the other applications you use start to become incompatible, as they require newer OS versions. Even Microsoft Office requires a fairly recent macOS.
The reason that all my machines are running Linux now is firstly consistency, but also that I have found that the hardware just works better with Linux ( minus some WiFi annoyances with disconnects that I had to work around ). It turns out that I really like Apple hardware but no longer find their operating system as superior as I once did. Linux is my preferred OS these days, but I also use Windows quite a lot and like it just fine. I still like macOS as well but, outside of one or two applications, I do not seem to miss it when it is gone.
“To me, it makes good business sense that Apple would do this. They have a unique advantage currently with their new hardware, that advantage is only realized when everybody ports their software to the new platform, and the incentive to migrate will follow the user base. The faster Apple gets everybody off Intel the better. My read on the culture of the macOS installed base is that this will be seen as a welcome move and that anybody wishing to stay behind will have few defenders.”
Agreed, for many users it’s in Apple’s interest to get everyone to the ARM stuff quickly. It’ll be interesting to see how developers react. At least where I am, a lot of development is centered around containers, and as of yet there isn’t a good way I know of to run x86 containers on ARM Macs. The current strategies (Docker Desktop, Podman machine) all rely on a Linux VM with some plumbing to connect the container tools running on the Mac to the container service in the VM. You can’t run x86 VMs on an ARM Mac effectively. You can run an ARM Linux VM and use ARM container images, but that doesn’t really help if you’re deploying to x86 servers.
We’ll see how it plays out, but for me I’d rather have an x86 Linux system for building locally than have to faff about using Cloud VMs running x86 Linux to build images.
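For what it’s worth, Docker can be asked to pull and run the amd64 variant of an image via the `--platform` flag, which Docker Desktop on Apple Silicon executes under QEMU emulation (slow, but workable for light testing). A minimal sketch of choosing that flag based on the host architecture; the helper function here is made up for illustration, not part of any real tool:

```python
# Hypothetical helper (illustrative only): build a `docker run` command that
# forces the x86-64 image variant when the host is arm64, so Docker Desktop
# runs it under emulation instead of failing to find an ARM image.
def docker_run_cmd(image, host_arch, args=()):
    cmd = ["docker", "run", "--rm"]
    if host_arch in ("arm64", "aarch64"):
        # On Apple Silicon (`uname -m` reports arm64), request the amd64 image.
        cmd += ["--platform", "linux/amd64"]
    return cmd + [image, *args]

# On an M1/M2 Mac this yields:
#   docker run --rm --platform linux/amd64 ubuntu:22.04 uname -m
print(" ".join(docker_run_cmd("ubuntu:22.04", "arm64", ["uname", "-m"])))
```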
I agree. Even the virtualization companies seem to be focusing on running other ARM operating systems on Apple Silicon rather than allowing x86.
I think you can do it with UTM though:
https://mac.getutm.app/
UTM is basically QEMU, which emulates almost everything under the hood. You can use it to run an x86 VM on an Apple Silicon Mac and then run something like Podman on top of that.
Hardware is going to change, and the change will be driven primarily by right-to-repair laws. People are pretty poor judges when it comes to predicting how, but you can be sure it won’t reduce the cost of newer hardware!
You might find you can extend the life of older kit, but they’ll almost certainly add incompatible new must-have features that you just can’t live without, at which point the dust will settle on the old gear!