Apple has dropped legacy frameworks with relative ease in the past, though. How exactly did that happen?
CPU changes. Once when macOS went from PowerPC to Intel, and again when macOS went from 32-bit to 64-bit. Each time that transition happened, Apple was able to say “OK, this legacy stuff just isn’t going to be there on the new architecture”. And since you had to recompile apps anyway to make them run on the new architecture, developers kind of shrugged and said “Well, yeah. That’s what I would have done too”. It made sense.
So are we about to see 128-bit Intel processors anytime soon to facilitate this change? I doubt it.
OK then, what about a new architecture?
Oh. Hello 64 bit ARM.
The Macintosh platform is going to transition to Apple’s own ARM64 architecture over the coming years. The most succinct explanation as to why comes from Steven Troughton-Smith:
> Opening ARM-based Macs to the iOS ecosystem to make one unified Apple platform, knowing what we know about Marzipan, makes so much sense that it becomes difficult to imagine it any other way. Apple finds itself completely unable to build the computers it wants to build with Intel.
>
> Windows has already made the move to ARM, and macOS will be joining it over the coming years. There is a major architectural shift happening in desktop computing, and there are quite a few companies who have to worry about their long-term bottom line: Intel, AMD, and NVIDIA.
> “There is a major architectural shift happening in desktop computing, and there are quite a few companies who have to worry about their long-term bottom line: Intel, AMD, and NVIDIA.”
I don’t think NVIDIA has to worry about their bottom line for the coming years. They make GPUs that are commonly paired with x86 hardware, sure, but they also make ARM chips and supply mobile gaming GPUs. Even though Apple probably won’t do business with them for a while, I doubt Microsoft and various other device makers feel the same. It’s not like NV is dependent on the x86 architecture…
Likewise, professional video editors and other workstation users / “prosumers” will probably just use external GPUs if internal ones aren’t an option. That would accelerate their workflows on ARM-based Macs (which would presumably ship with lesser GPUs), assuming external GPUs continue to be supported there.
NVIDIA would still be a major player in such a scenario. So the notion that NV will have to worry seems rather… odd, to me.
This also assumes that an Apple transition to ARM would have a meaningfully negative effect on x86. I don’t really think Apple moving its Macs to ARM over time is going to kill any bottom lines, just as Apple transitioning its Macs to Intel didn’t kill Power chips (from my ill-informed, un-researched understanding).
(I acknowledge that the situation is different too, but this all supposes x86 would be abandoned in a major way.)
I could be wrong, but it feels like the (x86) enthusiast market is a bit bigger than the Mac market. I’m just speculating and haven’t really done any research into this; just noting my impression, I guess.
Anyway, all that said, I’m definitely not against Apple’s ARM chips making it to Macs. I’d like that competition. Plus, since Windows supports ARM, I guess Boot Camp might even still be a thing, somehow. I hope to see the same sort of PC enthusiast market that exists for home-built x86 computers on the ARM side, too — such that I can just find a compatible motherboard and ARM processor, slap ’em in a case, plug in a GPU and RAM, etc., and be ready to go. All that needs to exist is a standard, support for that standard, and a market for it. If that happens, NVIDIA *really* won’t be in trouble.