Since the disastrous launch of the RTX 50 series, NVIDIA has been unable to escape negative headlines: scalper bots are snatching GPUs away from consumers before official sales even begin, power connectors continue to melt, with no fix in sight, marketing is becoming increasingly deceptive, GPUs are missing processing units when they leave the factory, and the drivers, for which NVIDIA has always been praised, are currently falling apart. And to top it all off, NVIDIA is becoming increasingly insistent that media push a certain narrative when reporting on their hardware.
↫ Sebin Nyshkim
Out of all the issues listed here – and there are many, and each is bad enough on its own – it’s the frame generation and related pressure campaigns on reviewers that get on my nerves the most. Technologies like DLSS (rendering at a lower internal resolution and upscaling the result) and frame generation (injecting fake “AI” frames to jack up the frame rate) can be fine technologies when used at the consumer’s discretion to trade some blur and artefacting for improved perceived performance, but we’ve now reached a point where NVIDIA will only boast about performance figures with these technologies enabled, downsides be damned.
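To make that concrete, here’s a rough back-of-the-envelope sketch (the numbers are illustrative, not NVIDIA’s) of why a frame-generation fps figure says very little about responsiveness: the interpolated frames multiply what the fps counter shows, but input latency still tracks the frames the game actually renders.

```python
# Illustrative sketch: frame generation multiplies displayed frames,
# but responsiveness is still tied to the frames the engine really renders.

rendered_fps = 60        # frames the game engine actually produces per second (assumed)
fg_multiplier = 4        # e.g. one rendered frame plus three interpolated ones (assumed)

displayed_fps = rendered_fps * fg_multiplier      # the number on the marketing slide
render_frame_time_ms = 1000 / rendered_fps        # input latency scales with this, not displayed_fps

print(f"Marketed figure: {displayed_fps} fps")                                   # 240 fps
print(f"Responsiveness still ~{render_frame_time_ms:.1f} ms per rendered frame") # ~16.7 ms
```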
If that wasn’t misleading enough, the company is also pressuring reviewers who don’t enable these technologies and instead focus on real frames, real resolutions, and thus real performance. If you don’t comply, you’re not getting the next crop of GPUs in early access. It’s the kind of shit Apple pulls all the time, and we need less of it, not more.
Just don’t buy NVIDIA. They’re already a terrible choice if you’re running anything other than Windows, and the company’s recent behaviour and serious missteps have made the case for AMD or Intel even more obvious.
I bought an Intel dedicated GPU, but I think even a recent integrated one would have been ok. I mostly care about video editing when it comes to acceleration, and Intel is very good at that. Since Nov 2023 the Linux drivers have supported DaVinci Resolve, so I’m good with that (even if Resolve still lists them as unsupported, they work). Even with Blender, a fast CPU plus the Intel card is good enough for simple stuff. I don’t play 3D games, they make me dizzy (I’m one of those people who throw up after 5 minutes in a 3D environment, whether it’s games, 3D movies, or VR).
With AMD ceding the high end and NVIDIA pissing everybody off, it seems like there could really be a moment here for Intel to insert a Battlemage B780 and make a dent.
> Just don’t buy NVIDIA.
Why not buy previous gen instead?
“It’s not about the money, it’s about sending a message”
NVIDIA is worse than Apple or Microsoft at forced obsolescence.
Example: the Lenovo P52 supports Windows 11, but will not be supported by NVIDIA after the 580-series driver.
Why play the game any more?
To be scrupulously fair, citing only that cutoff is misleading. Nvidia maintains ‘legacy’ driver branches to support cards dropped from the current series. Legacy drivers may get bug and security fixes, but no new features. If you need CUDA 12.9 and only have a 10xx-series card, you’re gonna be shit outta luck. But that doesn’t mean the hardware stops working altogether once 12.9 becomes final.
They are not.
They do release drivers for old cards, but they are untested and full of bugs.
I genuinely wonder if there wasn’t some collusion here. Still usable laptops that were going to be supported by Linux and not Windows 11 posed a threat to the business model.
Now suddenly those same laptops are going to be made obsolete because of NVidia, forcing those people who might have switched to Linux to go out and buy a new Windows 11 laptop.
Because you don’t know whether a used GPU has been used to mine cryptocurrency (a process that is heavily taxing on the VRAM chips even if the GPU has been underclocked).
Because used stuff always carries a risk and may come with zero remaining warranty (or come with some months of remaining warranty which doesn’t cover user-induced damage such as water damage anyway), and as such, used is not everyone’s cup of tea.
I didn’t mean buying second-hand.
Nvidia has thought of that, so they stopped RTX 40-series production two months before the RTX 50-series launched (save for the bottom-end RTX 4060):
https://www.pcgamer.com/hardware/processors/nvidia-has-reportedly-killed-production-of-all-rtx-40-gpus-apart-from-the-4050-and-4060-as-affordable-50-series-gpus-could-arrive-earlier-than-expected/
Is there any new RTX 40-series stock left for the RTX 4060 Ti and higher? With demand being high and all…
Currently I’m using a 3060 in my gaming rig and I’ve been thinking about upgrading for some time. Additionally, the bad reviews about the RTX 50 series’ poor support for older technologies convinced me that the current line isn’t worth my interest. Back in 2022 I paid twice as much for the 3060 as the widely available 4060 costs today.
It was actually your response that got me interested in buying it NOW.
Mostly agree, and on desktop probably just buy AMD unless you want the top of the top. But what about laptops? On paper AMD has the best integrated GPUs, generations above the competition and able to compete even with discrete laptop GPUs, but they announce and announce new processors that never arrive in real laptops, and when they do it’s late, and in incredibly expensive laptops…
If you have to pay more than 2000 bucks for a laptop with an integrated GPU just to do light gaming, you may as well decide you’re going to pay 1200 for one with a discrete Nvidia GPU and just endure dual boot and their terrible drivers on Linux. Steam on Linux is fantastic, but AMD needs to get their shit together when it comes to laptops.
They play these games because (right now) their lead is unassailable.
In (desktop grade) cards they clearly lead AI, games and productivity (rendering).
AMD and Intel were left behind, and the market is now so advanced it’s basically impossible for anyone new to enter.
If Nvidia is the only one supplying those “so advanced” cards, it’s hardly a “market” is it? It’s a monopoly. Regardless, I doubt that there are many secrets inside Nvidia GPUs that AMD or Intel aren’t aware of, it’s more that they don’t see the need to compete, as it’s a pretty brutal “market”.
It’s unfortunate that a few years ago Nvidia “won” in the eyes of gamers (and Nvidia fanboys). Frame rates stopped mattering, prices stopped mattering, benchmarks stopped mattering.
It all came down to this lazy story of “If you don’t have an Nvidia card, you’re not serious about gaming”. Nvidia latched on to that narrative and has been riding it ever since… actual technology be damned.
* AMD cards work *great* on Linux without bloatware binary drivers, but somehow it’s Linux’s fault for not “supporting the closed Nvidia ecosystem well enough”
* Intel GPUs are up and coming, but who would run an Intel GPU to game on?
If you’re reading this, and have been listening to the Nvidia-only people… try a different GPU.
I’m Nvidia-only because of the software I use. Daz Studio only does Iray on Nvidia. It’s a requirement; otherwise it renders using the CPU only!
Gamer here. Haven’t had an Nvidia GPU for, well, 15 years, because, well, unless you’re insane and going for the max, AMD wins on price point (and 2, maybe 3 upgrades are still cheaper than Nvidia’s high-end offering).
Yes, ok, Nvidia on my laptop, and there’s even less chance of replacing it with AMD now (it does need replacing this year or so), because there are no (well, hardly any) AMD options (except stepping down to non-dedicated graphics, which is still going to be a downgrade). But I do not use it much and do not care much. It’s a holiday machine.
My 9070 XT was fairly priced and works well. I am not chasing the insane, bad-quality high fps of some nutters. They are also silly.
> If you’re reading this, and have been listening to the Nvidia-only people… try a different GPU.
These people are not here.
Nvidia really doesn’t want you to know that there’s no significant performance uplift in DLSS-less rendering compared to the RTX 40-series; you only get a 20% bump in DirectX 12 tests (which sounds okay until you realize the GPU gulps down 30% more watts and costs significantly more). The performance uplift for older APIs is even smaller.
Which is to be expected: the RTX 50-series is built on the same process node as the RTX 40-series, so all they have is essentially a factory overclock plus whatever architectural improvements they could come up with to extract more performance. Not that it hurt Nvidia, they still sold out. But they also want to pretend the RTX 50-series is a worthy upgrade for RTX 40-series owners, so of course they’ll strongarm reporters to get the reporting they want.
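Taking the figures in the comment above at face value, a quick perf-per-watt sanity check (a sketch built on those claimed numbers, not on any measurements of mine):

```python
# Ballpark figures from the comment above: ~20% more performance for ~30% more power.
# That means efficiency (performance per watt) actually goes down.

perf_gain = 1.20    # claimed DX12 uplift vs. RTX 40-series
power_gain = 1.30   # claimed extra power draw

perf_per_watt_ratio = perf_gain / power_gain
print(f"Relative perf/W: {perf_per_watt_ratio:.2f}")   # ~0.92, i.e. roughly 8% worse efficiency
```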
The best explanation I have for NVIDIA’s current behavior is that they let AI considerations drive the design of the 50xx series and then got caught flatfooted trying to sell the 50xx series to gamers (for whom it is not really an upgrade). If you’ve got an RTX 3060 or 4060 Ti with 8-12 GB VRAM and upgrade to, say, a 5060 Ti with 16GB VRAM, you’re not gonna see any performance improvement ’cause your VRAM isn’t what’s bottlenecking you.
But for someone running e.g. SDXL or WAN Video or LLM models, 16GB of VRAM on a 5060 Ti is a big deal. You could slap 16GB of VRAM on a potato and outperform a hypothetical 8GB 4090 if you’re working image tiles at e.g. 2048×2048. Basically, it looks like what happens when any already-large company grows too fast: you get products optimized for one job pressed into service for another, with all the slapdash mid-cycle design tweaks, quality-control failures, software lag, and marketing issues you’d expect when that happens.
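To put rough numbers on the VRAM point, here’s a back-of-the-envelope sketch; the parameter counts and byte sizes are illustrative assumptions, not vendor specs, but they show why capacity alone can decide whether a card is usable for these workloads at all:

```python
# Rough VRAM sizing for compute workloads (illustrative assumptions, not vendor specs).

def weight_vram_gb(params_billion: float, bytes_per_param: float) -> float:
    """Approximate VRAM needed for the model weights alone, in GiB."""
    return params_billion * 1e9 * bytes_per_param / 1024**3

# Example: a 7B-parameter model in fp16 (2 bytes/param) vs. 8-bit quantization.
print(f"7B @ fp16: ~{weight_vram_gb(7, 2):.1f} GB")   # ~13 GB, before activations or KV cache
print(f"7B @ int8: ~{weight_vram_gb(7, 1):.1f} GB")   # ~6.5 GB
```

A 7B-parameter model in fp16 needs roughly 13 GB for the weights alone, before activations or a KV cache, so it simply doesn’t fit in 8 GB no matter how fast the card is, while 16 GB handles it comfortably.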
So while the best thing would be for Intel and AMD to get their shit together on GPU compute applications, the second best would be for NVIDIA to learn from this mess and fork their GPU products into lower-powered, high-VRAM GPUs for CUDA applications and higher-powered, whatever-VRAM GPUs for gamers.
As the linked article says, GPUs for gaming and visualization have already become a sub-20% concern for Nvidia:
https://www.visualcapitalist.com/nvidia-revenue-by-product-line/
And with Nvidia selling datacenter GPUs faster than they can make them, don’t expect them to assign their best and brightest people to GPUs for gaming and visualization. They weren’t caught flatfooted, they just don’t care much anymore.
In fact, they barely have an incentive to evolve their datacenter GPUs (for example use a newer process node), because again, they’re selling what they already have as fast as they can manufacture it.
Also, I disagree about the VRAM situation. Just because an 8GB GPU runs games now doesn’t mean it will 3-4 years from now. Remember when the 8GB of a 3070 Ti was more than enough? Now it’s barely adequate.
IDK, maybe they should stop letting their AI build and program all of their products…
NVDA may be a terrible company, but their Linux drivers work just fine.
As usual I do not get the shock and rage against a corporate entity. They do what they do to profit; they are not here to look after you morally or financially; you are an object to be harvested. Why would anyone expect anything different?
Much of the rage is just sabre rattling.
Nvidia, Apple, AMD, IBM, MS, Dell, HP, etc., etc., etc. Different names, different products, same behaviour. Be smart about how you deal with them: make use of the products they provide in a manner that works for you, to your advantage; if a product doesn’t work for you, don’t buy it; you generally can’t change the product or the corporation.
In a world of user-unfriendly, barely functional, breaks-every-other-day-and-twice-on-Sunday shit that is the computing industry, Nvidia delivers solid, best-in-class products time after time. They are reliable and a joy to use.
Would be cool if there was less whining from the entitled classes. I’m not going to settle for less, and the constant crying wolf about minor issues as if they affect anything but a tiny sliver of the user base just makes me laugh now.
We’ve found the one Nvidia fanboy!
And there’s the typical hivemind response from the small but loud AMD brigade
Ask yourself why Nvidia would suddenly condition users to accept frame latencies of 50-100ms. There are 3+ billion ‘gamers’ worldwide and that number won’t fall. Nvidia now drives the biggest compute farms, and basically all of that is gaming-derived GPU silicon. The global ‘AI’ market is currently estimated at roughly $100-200 billion, while ‘gaming’ is above $300 billion, so the most obvious pivot, should AI pop even a bit, is back to square one. But while they’re at it, they would reroute today’s mass-fragmented hardware/software ownership into cloud streaming, altogether killing the ‘gaming’ hardware category and making every single device with a screen and Bluetooth a gaming device, for streaming. The catch: as long as people are muscle-memory ‘used to’ sub-50ms latency, cloud streaming is a so-so ordeal; but once reconditioned, they might as well accept 60-500fps AI-interpolated video streams. Nvidia wants to kill two birds with one stone.
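For what it’s worth, a rough latency budget (every number here is an assumption for illustration, not a measurement) shows why that reconditioning would matter: interpolated frames can push the displayed fps as high as you like, but the input-to-photon delay of a streamed session is set by rendering plus network plus codec overhead.

```python
# Illustrative latency budget: local rendering vs. a cloud-streamed, frame-interpolated pipeline.

def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

local_60fps_ms = frame_time_ms(60)   # ~16.7 ms between real frames on local hardware
cloud_rtt_ms = 30.0                  # assumed network round trip
encode_decode_ms = 15.0              # assumed capture/encode/decode overhead

cloud_total_ms = frame_time_ms(60) + cloud_rtt_ms + encode_decode_ms
print(f"Local 60 fps: ~{local_60fps_ms:.0f} ms per frame")
print(f"Cloud stream: ~{cloud_total_ms:.0f} ms input-to-photon, "
      f"regardless of how many interpolated frames are displayed")
```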