AnandTech has published its review of AMD’s surprise new high-end Radeon VII graphics card, and the results should be cause for some cautious optimism among PC builders.
Overall then, the Radeon VII puts its best foot forward when it offers itself as a high-VRAM prosumer card for gaming content creators. And at its $699 price point, that’s not a bad place to occupy. However for pure gamers, it’s a little too difficult to suggest this card instead of NVIDIA’s better performing GeForce RTX 2080.
So where does this leave AMD? Fortunately for the Radeon rebels, their situation is improved even if the overall competitive landscape hasn’t been significantly changed. It’s not a win for AMD, but being able to compete with NVIDIA at this level means just that: AMD is still competitive. They can compete on performance, and thanks to Vega 20 they have a new slew of compute features to work with. It’s going to win AMD business today, and it’s going to help prepare AMD for tomorrow and the next phase that is Navi. It’s still an uphill battle, but with Radeon VII and Vega 20, AMD is now one more step up that hill.
While not a slam-dunk, the Radeon VII definitely shows AMD can get at least close to NVIDIA’s RTX cards, and that should make all of us quite happy – NVIDIA has had this market to itself for far too long, and it’s showing in the arrogant pricing the company maintains. While neither RTX cards nor this new Radeon VII make me want to replace my GTX 1070 – and its custom watercooling parts – it at least makes me hopeful that the coming years will be more competitive.
So basically, this card is for the extremely rare type of user that cares BOTH about 3D creation AND high end gaming, using the same PC for both. (facepalm)
I really want to like their graphics cards, especially after Nvidia screwed up with the RTX line, but nothing has changed. It still gives roughly the same performance at the same price, but uses a lot more power doing it and is noisier, with the risk that you will eventually miss the RTX features it lacks. So to me, unless you are in the group I mentioned above, this seems dead in the water. It really seems like they need a very different architecture to get the power consumption down, so they aren’t forced into hugely expensive memory because there is no room in the power budget for standard graphics RAM.
I still hope they can manage to beat Nvidia in the midrange, but I am not holding my breath. At least they have the CPU market going for them.
The high power usage is apparently due to a large overvolt at stock… which AMD seems to be planning to address with Navi (there has been news about a new power-control framework being developed for Linux to support a new power-management subsystem on the cards).
If you check several review sites, they show that undervolting their cards by around 5-10% let them beat the RTX 2080’s power consumption without changing clocks. This also bodes well for Navi, as it means 7 nm is power efficient.
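As a rough first-order sanity check of why a modest undervolt pays off so much (this is not from any review, and it ignores static/leakage power and real voltage-frequency curves): dynamic power in CMOS scales roughly with frequency times voltage squared, so trimming voltage at unchanged clocks cuts power disproportionately. A minimal Python sketch with made-up illustrative numbers:

# Rough first-order model only: dynamic power scales with f * V^2.
# The numbers below are illustrative assumptions, not measured Radeon VII figures.
stock_power_w = 300.0       # assumed board power at stock voltage
undervolt_fraction = 0.08   # e.g. an 8% undervolt at unchanged clocks

# With frequency held constant, P_new / P_old is roughly (V_new / V_old)^2.
scaled_power_w = stock_power_w * (1.0 - undervolt_fraction) ** 2

print(f"~{scaled_power_w:.0f} W, roughly "
      f"{(1 - scaled_power_w / stock_power_w) * 100:.0f}% below stock")
# An 8% undervolt yields ~15% lower dynamic power in this crude model,
# which is in the same ballpark as the undervolting results described above.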
It’s not so rare for people who game to also want to use their PC for more serious things. So I’m not sure why you care about that. The point stands that it also has the best FP64/$ value on the market right now.
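For context on the FP64-per-dollar claim, here is a back-of-the-envelope comparison. The throughput figures are approximate theoretical peaks (assuming roughly 13.4 TFLOPS FP32 at a 1:4 FP64 rate for the Radeon VII and roughly 10 TFLOPS FP32 at a 1:32 FP64 rate for the RTX 2080) and the prices are launch MSRPs, so treat this as a sketch rather than a benchmark:

# Back-of-the-envelope FP64-per-dollar comparison using approximate peak figures.
# Real application throughput will land well below these theoretical numbers.
cards = {
    # name:              (FP32 TFLOPS, FP64 rate, launch price in USD)
    "Radeon VII":        (13.4, 1 / 4, 699),
    "GeForce RTX 2080":  (10.1, 1 / 32, 699),
}

for name, (fp32_tflops, fp64_rate, price_usd) in cards.items():
    fp64_gflops = fp32_tflops * fp64_rate * 1000
    print(f"{name}: ~{fp64_gflops:.0f} GFLOPS FP64, "
          f"~{fp64_gflops / price_usd:.1f} GFLOPS per dollar")

On those assumptions the Radeon VII ends up roughly an order of magnitude ahead per dollar, which is the gap the comment above is pointing at.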
The more important question is, what’s the ETH hashrate?
xero1,
The market dropped out for mining cryptocurrencies on GPUs at home. When the speculative value was through the roof, it could be justified, but the market took such a dive that the electricity costs can easily exceed the value of the currency. It’s one of the reasons Nvidia had bleak numbers last year. The future of crypto mining is clearly ASIC technology in data centers, but I’m even skeptical that will remain profitable if real people don’t start embracing these currencies. For all the hype over the years, I still don’t think it’s all that practical/scalable. A lot of people lose money because the services/exchanges are not FDIC insured and can disappear overnight.
From just this month:
https://news.bitcoin.com/canadian-exchange-insolvent-after-ceo-dies-with-keys-to-145m-of-cryptocurrency/
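To make the electricity-versus-coin-value point concrete, here is a toy profitability model; every figure in it (revenue per MH/s, hashrate, power draw, electricity price) is a hypothetical placeholder rather than real market data:

# Toy GPU-mining profitability model with hypothetical placeholder numbers.
# Substitute current network payouts and your local electricity rate for anything real.
usd_per_mh_per_day = 0.02        # assumed revenue per MH/s per day
hashrate_mh = 90                 # assumed card hashrate in MH/s
card_power_w = 250               # assumed power draw while mining
electricity_usd_per_kwh = 0.15   # assumed electricity price

daily_revenue = hashrate_mh * usd_per_mh_per_day
daily_electricity = card_power_w / 1000 * 24 * electricity_usd_per_kwh

print(f"revenue ${daily_revenue:.2f}/day vs electricity ${daily_electricity:.2f}/day "
      f"-> net ${daily_revenue - daily_electricity:.2f}/day")
# With these placeholders the card nets about $0.90/day before hardware cost;
# halve the coin price and the margin disappears entirely.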
Also, the more important question is: does it run Crysis? 🙂
Yeah, I was joking. A real question for everyone now, though: what are the odds of getting MI50 drivers working on this card? Getting Nvidia Quadro or AMD Pro drivers working on non-pro hardware has happened before.
xero1,
I doubt they’d risk undercutting their pro line too much; those are $6000+ cards.
https://www.amd.com/en/graphics/servers-radeon-instinct-mi
Makes me wish I had a lot more money.
I’d be interested to hear from anyone working on these things!
@post by Alfman 2019-02-07 5:39 pm
Such are the risks of tulip manias…
With the final reveal of a 1:4 FP64 (double-precision) rate instead of 1:16, this card will do a lot for scientific-computing projects like BOINC and certain mining applications.
The tragedy of AMD graphics cards is that they are very strong compute performers, but their software stack is just so far behind NVIDIA’s that it’s a pity.
Let us not forget AMD FineWine, which, for those that do not know, means AMD cards tend to perform better over time, because AMD takes longer than Nvidia to figure out how to squeeze maximum performance out of its drivers.
If you haven’t heard of FineWine, go look it up on YouTube. Several tech channels there crunched the numbers on how much AMD cards gained and over what timeframe: with AMD cards you didn’t see them reach their maximum performance until 2-3 years after initial release, whereas Nvidia cards tended to have the most squeezed out of them by the 6-month mark.
TLDR? If these cards are close to a 2080 now, they’ll be beating the 2080 in a year, and most likely beating it by a good clip in two, simply because it will take that long for AMD’s engineers to get the drivers maxed out.