AMD is gearing up to launch its next-generation Radeon RX 7000-series GPUs next month, and today the company shared more details about the cards’ pricing, performance levels, and the new RDNA 3 GPU architecture that will power all of its graphics cards for the next couple of years.
The launch begins at the high end, with the Radeon RX 7900 XTX and RX 7900 XT. AMD will launch both of these GPUs on December 13, with the 7900 XTX starting at $999 and the XT starting at $899 (cards made by AMD’s partners will surely push these prices upward a bit). Both of these price tags undercut Nvidia’s RTX 4000 series, which starts at $1,599 for the top-tier GeForce RTX 4090 and $1,199 for the RTX 4080.
Graphics cards have become insanely expensive. While AMD’s prices undercut NVIDIA’s, they’re still bonkers expensive. And that’s assuming you’ll even be able to find them at these prices in the first place.
Subject line is crapola. But that’s clickbait, I suppose.
They are now serving two masters: home users and business users.
Back in the day, when I bought a GeForce 4 or an ATI Radeon 8500, the only viable application was desktop gaming. (Okay, also desktop video acceleration; high-bitrate video would not decode properly on the CPU.)
Today, the GPUs from both companies can be used for machine learning, multimedia content authoring (3D rendering, video effects post-processing), crypto mining, HPC applications, and probably other uses I do not know about. And even though Nvidia tried to differentiate these markets with drivers for a long while, it gave up.
So, that $999 card can also serve a professional using Adobe After Effects or Blender, letting them avoid paying for a “workstation” (FirePro/Quadro) or a “datacenter” version.
Add to this gamers having larger budgets than before (we are no longer 18-year-olds; many of us have jobs with disposable income), and the price makes sense.
Not that I agree with it, though.
Most people also don’t need the $1000 versions of these cards, and would be quite happy with the $300 versions. They just don’t know that, because marketing is very powerful.
Agreed. Between Steam, GOG, and actual CD/DVD media, I can play every game I own at 1080p60 or better on my RX 5700/Ryzen 5 3600 machine.
And for those who say “but what about 4K?”: I don’t even have a 4K monitor, so it’s a moot point. Gameplay and responsiveness mean more to my old, tired eyes than a bump in resolution.
To give you an idea of how useless playing at (excessively) high resolution can be: https://www.youtube.com/watch?v=1y5jEK-72JQ
On the other hand, AMD’s top cards this round seem pretty much ideal for turning on all the effects and maintaining high frame rates at 1440p ultrawide, which is roughly 60% of the pixels of a 4K monitor (quick pixel math below).
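To back up that comparison, here is the quick pixel math, assuming the common 3440×1440 ultrawide resolution:

```python
# Pixel counts behind "1440p ultrawide is roughly half of 4K".
uw_1440p = 3440 * 1440  # common ultrawide 1440p resolution (assumed)
uhd_4k = 3840 * 2160    # 4K UHD

print(f"Ultrawide 1440p: {uw_1440p:,} pixels")
print(f"4K UHD:          {uhd_4k:,} pixels")
print(f"Ratio: {uw_1440p / uhd_4k:.0%}")  # ~60%
```

So the GPU is pushing about 40% fewer pixels than at 4K, which is why maxed-out settings stay feasible.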
Same here with my R5 3600 and my new RX 6650 XT, which I picked up for…you guessed it, $300 USD. Although I will be moving up to an R5 5700G after Xmas, simply because they are cheap and I like the thought of having integrated graphics to offload video rendering while I’m playing a game, which sounds nice.
As for 4K: I play some competitive online games, and 4K is the kiss of death when every ms counts.
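To put “every ms counts” in numbers, here is a back-of-the-envelope frame-time comparison; the frame rates are made-up illustrative figures, not benchmarks of any particular card:

```python
# Frame-time arithmetic: frame_time_ms = 1000 / fps.
# The fps values below are hypothetical, chosen only to show the scale.

def frame_time_ms(fps: float) -> float:
    """Time to render one frame, in milliseconds."""
    return 1000.0 / fps

fps_1080p = 240.0  # assumed frame rate at 1080p
fps_4k = 110.0     # assumed frame rate at 4K on the same card

delta = frame_time_ms(fps_4k) - frame_time_ms(fps_1080p)
print(f"1080p: {frame_time_ms(fps_1080p):.1f} ms/frame")
print(f"4K:    {frame_time_ms(fps_4k):.1f} ms/frame")
print(f"Extra latency per frame at 4K: {delta:.1f} ms")
```

Roughly 5 ms more per frame may not sound like much, but it stacks on top of input and display latency, which is why competitive players stay at lower resolutions.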
CaptainN,
That was not always the case.
When the 1070 was announced, it was just two weeks behind the 1080 and priced at a reasonable $379. It could even be found cheaper, or bundled with games, even in those early days.
https://en.wikipedia.org/wiki/GeForce_10_series
And it was much faster, cooler, and overall better than the GTX 970 it replaced (again, the 970 was available early on).
The RTX 2070 was not too bad either. It was not much faster than the 1070, but it came with ray tracing hardware and was only slightly more expensive ($499). And let us not forget those were the peak days of the crypto craze, when chips were really difficult to find.
The RTX 3070? Same story: a very short launch window and still the $499 price.
And now the RTX 4070? Nowhere to be found. The cheapest option, the 4080 12GB, has already been cancelled. The new cards use massively more power, will not fit in many existing PC chassis, and will probably require a PSU upgrade for most people.
This is a real break from tradition. For the last 4+ generations, Nvidia’s “enthusiast” tier was reasonably priced. Not anymore.
The enthusiast tier has never been reasonable; it was always the x80/x90 Ti or Titan. Those have traditionally had ridiculous pricing, because the main value proposition that target market is looking for is the highest generational performance, unconstrained by limited disposable income.
The x70 has never been the enthusiast tier as far as I am concerned.
I’ve been buying the 70 tier for a while and have had very good results. Over the years: the 970, 1070, and 3070 (I skipped the 2070). They have consistently been “good enough” for 1080p or 1440p gaming with ultra settings, sometimes even at 4K.
Why would this be unreasonable? The benchmarks also show the pattern:
https://www.techspot.com/review/2124-geforce-rtx-3070/
That still doesn’t change the fact that the 070 series is not an enthusiast tier.
javiercero1,
We might have different definitions of “enthusiast”, and that is okay. Some would say it is the top tier; I would put an “expert/pro” tier above that.
Anyway, for the high-mid range, the x70 was the best choice. It offered all the architectural changes (like RT) but was still affordable.
The enthusiast tier has always meant the highest performance of a generation.
The x70 has always been the low end of the premium tier or the high end of the value tier.
The 4070 for this generation will likely be the SKU formerly known as the 4080 12GB.
The GPU market is changing because the value tier has now mostly been conceded to consoles and iGPUs, so NVIDIA is focusing on the high-margin segments. At this point it also seems that NVIDIA is trying to unload its 3000-series stock, so it will announce the other tiers of the 4000 series later than it did during previous RTX product launches.
But GPU launches have always had all sorts of cadences and roll-outs. It’s not like this is a product necessary for life or well-being, so I don’t understand the emotional reaction to a product launch.
Premium tier graphics cards have never been cheap.
Most people are also not going to purchase the premium tier, since most games are quite playable with mid-range products at 1080p/1440p, and most people are not doing machine learning or parallel-processing development.
Halo products have always been there to steer most consumers towards the brand’s cheaper offerings.
javiercero1,
As above: “where is the mid tier for the RTX 4000 series?”
For the latest wave of games (A Plague Tale: Requiem, for instance), the 3070 can barely keep up at 1080p. What will happen next year?
The value tier hasn’t been launched.
You are picking a pathological title at ultra settings. For the average case, a mid-tier card works just fine at 1080p. In fact, premium-tier boards are a complete waste at that resolution.
Again that is my point.
For the last several generations, they launched the upper-mid tier (1070, 2070, 3070) a few weeks after the top models, and those cards were priced very well.
Today, there is not even a roadmap for a “mid” tier (the 1060 equivalent), let alone the upper mid (the x70s).
I don’t know where you think there are roadmaps for consumer products; those are usually not disclosed, so as not to cannibalize current products.
BTW, in other generations the halo product was launched first as well. The timing depends on what NVIDIA thinks AMD’s performance level is, since NVIDIA has historically tended to have a half-generation advantage, and both companies set their pricing structures accordingly.
NVIDIA is conceding the low-end tiers to consoles and iGPUs. And since it isn’t going to get much revenue from those streams, it is going the Apple route and concentrating on the high-margin/premium tier.
Similarly, AMD is conceding the high-performance and pro tiers to NVIDIA and competing on pricing.
The value-tier GPUs for this generation will come early next year, as both vendors are focusing their constrained supply chains on meeting volume in the high-margin streams first.
I don’t know what your point was honestly.
javiercero1,
I am not sure why we are going in circles.
At least since the 970 days, the xx70 cards have launched either at the same time as, or within a few weeks of, the top-tier ones. Sometimes they even launched earlier.
https://en.wikipedia.org/wiki/List_of_Nvidia_graphics_processing_units#GeForce_900_series
The answer is simple: go AMD. I’ve seen the RX 6700 going for under $400 USD, and it can play the new Plague Tale easily on high or ultra, your choice.
Let’s face it: Nvidia has been insane since the crypto craze started, and it will be several years before sanity returns to team green.
During the chip shortage/crypto boom, graphics cards were being resold on eBay at whatever price the market would bear, with no effort by Nvidia/AMD. They saw what those prices were and were understandably upset that scalpers got the lion’s share of the profit. They now know how many people will pay $1,000+ for a card, and they have repositioned their pricing and offerings to reflect that knowledge.
Bill Shooter of Bul,
I would argue there were two additional factors, which no longer hold.
1) As you said, crypto meant people were looking at these cards as investments (there were websites that calculated how many months it would take for a card to pay for itself; see the sketch at the end of this comment). Now that GPU mining has collapsed, I suspect not many people will look at them as investments anymore.
(Except professionals who use them for content authoring, machine learning, etc.)
2) Stimulus payments. Even though many people genuinely needed them, for a non-trivial number of people this was “extra money”, so overpaying a scalper was not a big deal. Today, not only is the stimulus gone, we are actually in an economic downturn.
So, I believe they are making a mistake if they think the $1,000-per-GPU price will hold. Yes, some early adopters will buy at that price, but they already had the “Titan” series for that target demographic.
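For anyone who never saw those mining payoff calculators, here is a minimal sketch of the arithmetic they performed; every input (card price, daily revenue, power draw, electricity rate) is a hypothetical placeholder, not real market data:

```python
# Minimal sketch of a GPU mining payoff calculator.
# All inputs are hypothetical placeholders, not real figures.

def payoff_days(card_price_usd: float,
                daily_revenue_usd: float,
                power_draw_watts: float,
                usd_per_kwh: float) -> float:
    """Days until mining income covers the card's purchase price."""
    daily_power_cost = (power_draw_watts / 1000.0) * 24.0 * usd_per_kwh
    daily_profit = daily_revenue_usd - daily_power_cost
    if daily_profit <= 0:
        return float("inf")  # the card never pays for itself
    return card_price_usd / daily_profit

# Made-up example: a $999 card earning $3/day, drawing 300 W at $0.15/kWh.
days = payoff_days(999.0, 3.00, 300.0, 0.15)
print(f"Payoff in roughly {days:.0f} days ({days / 30:.1f} months)")
```

Once mining revenue drops to zero, the daily profit goes negative and the “investment” framing collapses entirely.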
sukru,
My barometer of consumer demand is no better than yours; however, you are focusing on the demand side when the supply side is just as important.
https://www.tomshardware.com/news/tsmc-warns-clients-of-up-to-9-price-hike-in-2023
These things are all linked together in feedback loops with potentially complex interactions that are difficult to predict. I agree with you that prices are higher than historical norms, but so far I’m not seeing much indication of price deflation beyond the market corrections we’ve already seen as the short-term shortages eased. These inflated consumer prices could end up being the new normal, not just in tech, but for all goods and services.
Alfman,
I am probably just frustrated. And it is possible my next card will be an AMD or an Intel.
As for the economy, you are right that this is only one side. However, again going by a single data point (myself), it also affects demand. When people are confronted with a steep price for their next card, they will either:
1) Keep using the 3070 for a while
2) Switch to AMD/Intel
3) Accept paying 3x the price of their previous card
Nvidia might have done some market analysis, but time will tell how many people do (3), how many do (2), and how many of those who switched come back in a future generation.
sukru,
I think many consumers are frustrated. I’m hoping Intel is able to bring better competition in the coming years; we’ll have to wait and see.
I’m personally skipping this generation because I already invested in the overpriced 3080 Ti, but if my upgrade cycle landed on this generation, I probably would consider these GPUs, since I want to be able to use CUDA, for better or worse.
Both companies are engaged in obvious price-gouging collusion.