Linked by Thom Holwerda on Mon 9th Nov 2009 21:29 UTC
3D News, GL, DirectX
Over the past few years, there have been persistent rumours that NVIDIA, the graphics chip maker, was working on an x86 chip to compete with Intel and AMD. Recently, these rumours gained some traction, but NVIDIA's CEO has just shot them down, denying that the company will enter the x86 processor market.
Thread beginning with comment 393685
RE[2]: a no go
by lucas_maximus on Mon 9th Nov 2009 22:58 UTC in reply to "RE: a no go"
lucas_maximus
Member since:
2009-08-18

Windows has damn good GUI acceleration. The Desktop Window Manager, which is responsible for accelerating the Windows 7 GUI, is taking up 26 MB of RAM according to Task Manager; Skype is using 28 MB.

Windows 7 has pretty good GUI acceleration which doesn't eat up my memory and is damn responsive on this integrated graphics card.

Anyhow, where does this myth come from that you need a graphics card accelerating the GUI or performance degrades horribly? Unless your app is doing software 3D rendering or leaning heavily on the graphics card for additional processing (Adobe CS4), any CPU from the last six years can drive at least 1280x1024 without any degradation you are likely to notice.

Reply Parent Score: 2

RE[3]: a no go
by lemur2 on Mon 9th Nov 2009 23:15 in reply to "RE[2]: a no go"
lemur2 Member since:
2007-02-17

Windows has damn good GUI acceleration. The Desktop Window Manager, which is responsible for accelerating the Windows 7 GUI, is taking up 26 MB of RAM according to Task Manager; Skype is using 28 MB. Windows 7 has pretty good GUI acceleration which doesn't eat up my memory and is damn responsive on this integrated graphics card. Anyhow, where does this myth come from that you need a graphics card accelerating the GUI or performance degrades horribly? Unless your app is doing software 3D rendering or leaning heavily on the graphics card for additional processing (Adobe CS4), any CPU from the last six years can drive at least 1280x1024 without any degradation you are likely to notice.


Windows, OS X, KDE4 and, I believe, recently also GNOME desktops use GPU-based 2D graphics acceleration.

I have an ATI HD2400 graphics card (a very low end card), but even this very modest and inexpensive graphics card speeds up the KDE4 desktop quite a bit through 2D acceleration. I am using the radeon open source graphics driver, which doesn't have 3D functionality as yet, and it won't have until Linux kernel 2.6.32 comes out.

PS: When kernel 2.6.32 comes out, since ATI GPUs are far faster than Intel GPUs, after a little while ATI GPUs will become the best option for Linux. They will have open source drivers integrated with the kernel, like Intel GPUs, but unlike Intel they will also have performance on par with nVidia GPUs.

Anyway ... the speed of desktop graphics is nicely enhanced by 2D hardware GPU acceleration when it comes to operations such as scrolling or resizing/moving windows, and also for "bling" enhancements like pop-up notifications, highlights, widget animations, transparency, fade-ins and fade-outs, and shadows.
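To see the kind of per-pixel work a compositor offloads to the GPU, here's a rough Python sketch (purely illustrative, not how any real desktop implements it) of the standard alpha-blend formula behind transparency and fade effects:

```python
def alpha_blend(src, dst, alpha):
    """Standard 'over' blend: out = alpha*src + (1 - alpha)*dst,
    applied per colour channel, with values in 0..255."""
    return tuple(round(alpha * s + (1 - alpha) * d)
                 for s, d in zip(src, dst))

# One pixel of a 50%-translucent white window over a black desktop:
pixel = alpha_blend((255, 255, 255), (0, 0, 0), 0.5)  # -> (128, 128, 128)

# A full 1280x1024 frame is ~1.3 million of these blends per redraw --
# embarrassingly parallel work a GPU's 2D engine handles for free.
```

The same blend is applied to every pixel of every translucent surface on every redraw, which is why even a cheap GPU makes fades and shadows feel free while a pure-CPU compositor has to burn cycles on them.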

Edited 2009-11-09 23:25 UTC

Reply Parent Score: 3

RE[4]: a no go
by phoenix on Tue 10th Nov 2009 03:34 in reply to "RE[3]: a no go"
phoenix Member since:
2005-07-11

Windows, OS X, KDE4 and, I believe, recently also GNOME desktops use GPU-based 2D graphics acceleration.


Doesn't Desktop Effects/Compiz require 3D acceleration, considering it uses compositing support?

Reply Parent Score: 2

RE[4]: a no go
by kryogenix on Wed 11th Nov 2009 16:23 in reply to "RE[3]: a no go"
kryogenix Member since:
2008-01-06

Windows, OS X, KDE4 and, I believe, recently also GNOME desktops use GPU-based 2D graphics acceleration.


You're wrong. Vista (w/ Aero), OS X (Quartz Extreme) and KDE4 use 3D acceleration to enhance GUI performance and reduce CPU load. OS X was the first in this regard, starting with OS X 10.2 Jaguar. 2D acceleration has been used since the '80s.

I have an ATI HD2400 graphics card (a very low end card), but even this very modest and inexpensive graphics card speeds up the KDE4 desktop quite a bit through 2D acceleration. I am using the radeon open source graphics driver, which doesn't have 3D functionality as yet, and it won't have until Linux kernel 2.6.32 comes out.


I dare you to find me a card made in the last 15 years that is a simple dumb framebuffer with no 2D acceleration. 2D acceleration helps, but without 3D acceleration, your card is crippled in my book. I will not use a card if I can't get 3D acceleration and no, I am NOT a gamer.

PS: When kernel 2.6.32 comes out, since ATI GPUs are far faster than Intel GPUs, after a little while ATI GPUs will become the best option for Linux. They will have open source drivers integrated with the kernel, like Intel GPUs, but unlike Intel they will also have performance on par with nVidia GPUs.


I don't know, even though the NVIDIA drivers are closed source, they perform pretty damn well. Every open source 3D driver I've used has been underperforming buggy crap compared to the closed source drivers. Even the older Radeon 9200 drivers sucked compared to the official drivers. I just wish they'd hurry up with the 64-bit FreeBSD drivers.

Anyway ... the speed of desktop graphics are nicely enhanced by 2D hardware GPU acceleration


And they have been since the dawn of personal computing. Nothing new there.

Reply Parent Score: 1

RE[3]: a no go
by kryogenix on Wed 11th Nov 2009 16:14 in reply to "RE[2]: a no go"
kryogenix Member since:
2008-01-06

Anyhow, where does this myth come from that you need a graphics card accelerating the GUI or performance degrades horribly? Unless your app is doing software 3D rendering or leaning heavily on the graphics card for additional processing (Adobe CS4), any CPU from the last six years can drive at least 1280x1024 without any degradation you are likely to notice.


Don't think GUI acceleration, think OpenCL and parallel processing. A badass video card coupled with a nice quad-core CPU is like having a Cray on your desk.

The GPU ain't going anywhere. Even the integrated shared-memory 9400M in my MacBook is useful with OpenCL.
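For anyone unfamiliar with what that means in practice: OpenCL programs are written as a small "kernel" that runs once per element of the data, and the GPU launches thousands of those work items in parallel. Here's a toy sketch of that execution model in plain Python (this is not the actual OpenCL API; a real runtime launches the work items concurrently on the device instead of looping):

```python
# Toy sketch of the OpenCL execution model: the same tiny "kernel"
# runs once per element (one "work item" per global ID), which is
# why a GPU with hundreds of ALUs chews through it in parallel.

def saxpy_kernel(gid, a, x, y, out):
    """One work item: out[gid] = a * x[gid] + y[gid]."""
    out[gid] = a * x[gid] + y[gid]

def run_kernel(kernel, global_size, *args):
    # Stand-in for a real runtime's kernel launch; here we just
    # loop sequentially to show the per-element structure.
    for gid in range(global_size):
        kernel(gid, *args)

x = [1.0, 2.0, 3.0, 4.0]
y = [10.0, 20.0, 30.0, 40.0]
out = [0.0] * 4
run_kernel(saxpy_kernel, len(x), 2.0, x, y, out)
# out is now [12.0, 24.0, 36.0, 48.0]
```

Because each work item is independent, the same kernel scales from a laptop's integrated 9400M to a monster discrete card without changing the code.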

Reply Parent Score: 1

RE[4]: a no go
by Brendan on Fri 13th Nov 2009 07:10 in reply to "RE[3]: a no go"
Brendan Member since:
2005-11-16

Hi,

The GPU ain't going anywhere.


While I mostly agree with what you're saying, the GPU *is* going somewhere. If you look at both AMD's and Intel's roadmaps, the GPU is going into the same chip as the CPU. To start with it'll just be low-end and/or low-power systems, but that's just a start.

With the memory controller now built into Intel and AMD CPUs, there's a major performance advantage to putting the GPU "on-chip" too; and if NVIDIA isn't careful it could end up with no way to compete: too slow to compete with on-chip GPUs in high-end systems, too expensive for budget systems, and taking up too much space on ultra-portable motherboards.

-Brendan

Reply Parent Score: 2