Linked by Thom Holwerda on Mon 14th Aug 2017 21:27 UTC
AMD

AMD isn't only getting back in the game on processors - they've also finally unveiled Vega, the new line of Radeon graphics cards. AnandTech benchmarked the two cards, and concludes:

Unfortunately for AMD, their GTX 1080-like performance doesn't come cheap from a power perspective. The Vega 64 has a board power rating of 295W, and it lives up to that rating. Relative to the GeForce GTX 1080, we've seen power measurements at the wall anywhere between 110W and 150W higher, all for the same performance. Thankfully for AMD, buyers are focused on price and performance first and foremost (and in that order), so if all you're looking for is a fast AMD card at a reasonable price, the Vega 64 delivers where it needs to: it is a solid AMD counterpart to the GeForce GTX 1080. However, if you care about the power consumption and the heat generated by your GPU, the Vega 64 is in a very rough spot.

On the other hand, the Radeon RX Vega 56 looks better for AMD, so it's easy to see why in recent days they have shifted their promotional efforts to the cheaper member of the RX Vega family. Though a step down from the RX Vega 64, the Vega 56 delivers around 90% of Vega 64's performance for 80% of the price. Furthermore, when compared head-to-head with the GeForce GTX 1070, its closest competition, the Vega 56 enjoys a small but nonetheless significant 8% performance advantage over its NVIDIA counterpart. Whereas the Vega 64 could only draw to a tie, the Vega 56 can win in its market segment.

Vega 56's power consumption also looks better than Vega 64's, thanks to binning and its lower clockspeeds. Its power consumption is still notably worse than the GTX 1070's, by anywhere between 45W and 75W at the wall, but on both a relative and an absolute basis it's at least closer. Consequently, just how well the Vega 56 fares depends on your views on power consumption. It's faster than the GTX 1070, and even if retail prices merely match the GTX 1070's rather than undercut them, for some buyers looking to maximize performance for their dollar that will be enough. But it's certainly not a very well-rounded card once power consumption and noise are factored in.

So, equal performance to Nvidia's competing cards at slightly lower prices (we hope), but at a big cost: far higher power consumption (and thus, I assume, heat?). For gaming, Nvidia is probably still the best choice on virtually every metric, but the interesting thing about Vega is that there's every indication it will do better on other, non-gaming tasks.

It's still early days for Vega.

Very suspicious ....
by cade on Tue 15th Aug 2017 10:23 UTC

The results (and the thinking behind them) should at least be normalized against each GPU's GFLOPS rating.


Use TechPowerUp's GPU database (https://www.techpowerup.com) and look at the numbers.

Vega's high TDPs are the result of its much higher compute (GFLOPS) rating.

If you're going to compare GPUs for gaming potential, then you also have to factor in the GFLOPS (compute) rating, and whether the game software itself leverages the compute performance of the GPU. You can still end up CPU-bound with a fast CPU coupled to a GTX 1080, both because of bandwidth limitations of the CPU/GPU interface and because the game engine may rely too much on the CPU for floating-point calculations instead of offloading enough of them to a GPU compute pipeline.

When factoring in GPU compute performance, the Vega 64 should be compared with the GTX 1080 Ti.

The Vega 56's compute performance is significantly better than that of the GTX 1080.
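To make that normalization concrete, here is a minimal sketch in Python. The TFLOPS and board-power figures are approximate published specs (the kind of numbers TechPowerUp's database lists), and the relative performance scores are back-of-the-envelope readings of the review quoted above (Vega 64 roughly tied with the GTX 1080, Vega 56 at ~90% of Vega 64 and ~8% ahead of the GTX 1070) - treat the output as illustrative only:

# Rough sketch of the suggested normalization: divide a relative
# gaming-performance score by each card's peak FP32 throughput and
# by its board power. Spec figures are approximate, not measured.
cards = {
    # name          (rel. perf, peak FP32 TFLOPS, board power in W)
    "RX Vega 64":  (1.00, 12.7, 295),
    "GTX 1080":    (1.00,  8.9, 180),
    "RX Vega 56":  (0.90, 10.5, 210),
    "GTX 1070":    (0.83,  6.5, 150),
}

for name, (perf, tflops, watts) in cards.items():
    # Normalize: performance per TFLOP and per 100W of board power.
    print(f"{name:10}  perf/TFLOP: {perf / tflops:.3f}"
          f"   perf/100W: {100 * perf / watts:.2f}")

On those rough numbers, the Vega cards extract less gaming performance per TFLOP than their NVIDIA counterparts, which is consistent with the point that raw compute throughput and gaming throughput are not the same metric.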

With the direct-to-the-GPU, bare-metal paradigm pushed by graphics APIs like Vulkan, Metal, Direct3D 12, and Mantle, the GPU's floating-point compute performance will play a more important role in overall performance (relative to the CPU).


The ideal is a game engine that adequately saturates both the rendering and compute "data lanes"; that would provide a good basis for reviewing GPU hardware in the context of gaming.

Another complication is that games are optimized to different extents for different hardware (AMD versus NVIDIA).

At least with mining cryptos you have the potential to tap into the full compute power of the card, which makes the GPU investment worthwhile.

For the more complicated gaming scenario, how much are you willing to pay for the newest GPU when game engines may not access the full render/compute potential of the video card?

I suppose computers run closer to optimal as mining rigs than as gaming rigs, since the former deals with much simpler code than the latter (a mining program versus a game engine).

At the end of the day, both AMD and NVIDIA video cards will give a good gaming experience; the technology is there and working. However, I find it foolish to get nit-picky, or even too serious, about game benchmarks on personal computers, which are designed for a wide variety of tasks and not optimized for gaming alone.

Gaming consoles are another matter, since they are meant to be optimized for gaming, and any associated benchmarks should be taken more seriously. However, consoles have their own issues of upgradability and shelf life.


RE: Very suspicious ....
by Kochise on Tue 15th Aug 2017 11:02 in reply to "Very suspicious ...."

The ideal scenario would have the hardware and software actually working, without having to make hypotheses. The consumer doesn't care whether the card is more powerful on paper. Keep faith out of computer pragmatism.
