Linked by Thom Holwerda on Mon 9th Nov 2009 21:29 UTC
Over the past few years, there have been persistent rumours that NVIDIA, the graphics chip maker, was working on an x86 chip to compete with Intel and AMD. Recently, these rumours gained some traction, but NVIDIA's CEO just shot them down, and denied the company will enter the x86 processor market.
RE: a no go
by SamAskani on Tue 10th Nov 2009 01:07 UTC in reply to "a no go"

"... only on very specific tasks, and even there nothing so great that would give it a solid advantage"

Next time, please check the Nvidia CUDA website or gpgpu.org before making such a comment.

From my experience, for many problems a CUDA program running on a simple GTX 260 is a factor of 80-100x faster than a machine with two quad-core 5020s (using OpenMP), and that is even before doing substantial tuning for coalesced memory access, which makes a huge difference in performance. If two orders of magnitude in performance is not a solid advantage, please tell Intel they can throw Larrabee in the garbage bin, since they apparently don't need to compete with Nvidia in the GPGPU arena.
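To make the coalescing point concrete, here is a minimal sketch (kernels I made up for illustration, not code from any real port):

__global__ void axpy_coalesced(int n, float a, const float *x, float *y)
{
    // Thread i handles element i, so consecutive threads in a warp
    // read consecutive floats: the hardware coalesces the whole
    // warp's loads into a few wide memory transactions.
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];
}

__global__ void axpy_strided(int n, int stride, float a, const float *x, float *y)
{
    // Same arithmetic, but neighbouring threads touch addresses far
    // apart, so nothing coalesces and effective bandwidth collapses.
    // (Only the access pattern matters here; with stride > 1 this
    // covers fewer elements and is purely illustrative.)
    int i = (blockIdx.x * blockDim.x + threadIdx.x) * stride;
    if (i < n)
        y[i] = a * x[i] + y[i];
}

The two kernels do the same arithmetic; only the access pattern differs, and on current hardware that alone can cost you most of the memory bandwidth.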

In more than 20 years doing scientific computing I haven't witnessed such a jump in performance from just plugging in a card and doing some recoding. Even though the learning curve is significant, you very quickly get programs running much faster than before. We have ported around 20 programs in the last 3 months and the impact has been dramatic. Many programs that required 4 to 8 2x quad-core nodes (MPI+OpenMP) in a cluster to get results promptly are being replaced by their GPGPU counterparts, and this frees the cluster for the very long simulations.
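For a hedged idea of what the recoding looks like (a toy loop I made up, not one of the programs mentioned above), an OpenMP loop like

    #pragma omp parallel for
    for (int i = 0; i < n; ++i) out[i] = 2.0f * in[i];

becomes a kernel with one thread per element, plus explicit host/device copies:

#include <cuda_runtime.h>
#include <cstdlib>

__global__ void scale(int n, float a, const float *in, float *out)
{
    // One GPU thread per array element replaces one CPU core per chunk.
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        out[i] = a * in[i];
}

int main()
{
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);

    float *h_in = (float *)malloc(bytes);
    float *h_out = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) h_in[i] = (float)i;

    // The main new moving parts of a port: device buffers and copies.
    float *d_in, *d_out;
    cudaMalloc((void **)&d_in, bytes);
    cudaMalloc((void **)&d_out, bytes);
    cudaMemcpy(d_in, h_in, bytes, cudaMemcpyHostToDevice);

    // 256 threads per block, enough blocks to cover all n elements.
    scale<<<(n + 255) / 256, 256>>>(n, 2.0f, d_in, d_out);
    cudaMemcpy(h_out, d_out, bytes, cudaMemcpyDeviceToHost);

    cudaFree(d_in); cudaFree(d_out);
    free(h_in); free(h_out);
    return 0;
}

The numerical core usually moves over almost verbatim; the time goes into picking the launch configuration and, as said above, tuning the memory accesses.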

So Nvidia effectively has a substantial advantage here, at least until Larrabee arrives to reshape the market.

The big missing feature in Nvidia's lineup is ECC support, but that is going to be addressed in the next generation of GPGPUs coming next year.
