Notebooks with dual GPUs have been shipping for a while now, but switching between the fancy discrete GPU and the low-power integrated one hasn’t exactly been painless. Today, NVIDIA introduced a technology called Optimus, which makes the switching process automatic and transparent. In Windows, that is.
Despite many laptops already shipping with two graphics processors – a low-power integrated one and a powerful discrete one – software support for switching between the two has been a bit problematic. On Windows Vista and Windows 7, you needed to manually switch between the two via the power settings or the graphics tray icon. Upon switching, the screen would go blank for a few moments, et voilà.
On Mac OS X, the situation is a million times worse. Even though the hardware is fully capable of switching between the two GPUs “live”, Apple has never implemented support for it in software, requiring you to log out and back in to use the discrete GPU, and again when switching back. Then again, Mac OS X still doesn’t support SLI either, so little surprise there.
I’m not entirely sure what the situation is like on Linux, but from these recent Phoronix stories it would seem that there’s only experimental support for this. That experimental support still requires you to kill and restart the X server in order to switch, so it’s all rather crude. However, given the pace of Xorg and Linux development, the situation could already be different today. On top of that, it might be that some other project out there has already yielded better results – feel free to enlighten me.
So, Windows currently has the easiest switching method, but it’s still not ideal. What you really want is for the software and hardware to know when it’s time to power on the discrete GPU, so that users don’t have to worry about that sort of thing. When you load up a 3D-intensive program (a game, probably), the discrete GPU should automatically take over, without noticeable delay or flickering; when you close the game and go back to browsing, the discrete GPU should power down. Preferably, Hybrid SLI should be used so that both GPUs can work together for optimal performance – when needed.
NVIDIA’s Optimus technology delivers just that kind of functionality. It detects when an intensive 3D application is started, and powers on the discrete GPU accordingly – without user intervention, without screen flicker, without logging in or out. The only weak spot here is that the system relies on application profiles that are silently downloaded from NVIDIA – in other words, if you happen to run an application or game without a profile, you might still need to power on the discrete GPU manually.
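To make the profile mechanism concrete, here is a minimal sketch of the idea in Python. All names here are hypothetical – the real lookup happens inside NVIDIA’s driver, and the actual profile format isn’t public – but it illustrates why an application without a profile ends up on the integrated GPU:

```python
# Conceptual sketch of profile-driven GPU selection. Hypothetical names;
# Optimus' real routing is done by the driver, not user-level code.

# A "profile" maps an executable name to the GPU that should render it.
PROFILES = {
    "game.exe": "discrete",
    "video_editor.exe": "discrete",
    "browser.exe": "integrated",
}

def select_gpu(executable: str) -> str:
    """Pick a GPU for a newly launched application.

    Applications with a profile get the GPU their profile names; anything
    unknown falls back to the integrated GPU, which is why unprofiled
    games may need a manual override.
    """
    return PROFILES.get(executable, "integrated")

print(select_gpu("game.exe"))     # discrete
print(select_gpu("unknown.exe"))  # integrated
```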
The system works by basically having the discrete GPU send its framebuffer contents to that of the integrated GPU as soon as the former is powered on. This eliminates the blank/flickering screen during the switch. In previous iterations of dual-GPU laptops, both GPUs were connected directly to the display using multiplexers, which required the system to switch between two separate pipelines, causing flicker.
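The flicker-free handoff can be sketched with a toy model (illustrative only – real framebuffers and the copy engine involved are far more complex): the display always scans out the integrated GPU’s framebuffer, so enabling the discrete GPU merely changes who fills that buffer, and the display source never switches.

```python
# Toy model of the copy-based handoff: the display always reads the
# integrated GPU's framebuffer, so there is no pipeline switch to cause
# blanking. Hypothetical, simplified classes for illustration.

class GPU:
    def __init__(self, name: str):
        self.name = name
        self.framebuffer = [0] * 4  # tiny stand-in for pixel memory

    def render(self, frame: list[int]) -> None:
        self.framebuffer = list(frame)

integrated = GPU("integrated")
discrete = GPU("discrete")

# Light load: the integrated GPU renders straight into the scanout buffer.
integrated.render([1, 1, 1, 1])

# Heavy load: the discrete GPU renders, then its frame is copied into the
# integrated GPU's framebuffer, which the display keeps scanning out.
discrete.render([9, 9, 9, 9])
integrated.framebuffer = list(discrete.framebuffer)

print(integrated.framebuffer)  # [9, 9, 9, 9] -- display source never changed
```

Contrast this with the older multiplexer approach, where the display itself was rewired from one GPU’s output to the other’s – that rewiring is the source of the blank screen.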
From the press release and the Optimus web page it isn’t clear whether Optimus also supports Hybrid SLI (using both GPUs at once), but I’m assuming that NVIDIA thought of that too, despite not mentioning it explicitly.
“Consumers no longer have to choose whether they want great graphics performance or sustained battery life,” said Rene Haas, general manager of notebook products at NVIDIA, “NVIDIA Optimus gives them both – great performance, great battery life and it simply works.”
A small number of new ASUS laptops (four, to be exact) support the new technology. Hopefully, the Linux NVIDIA driver will support this sooner rather than later, so we can enjoy Optimus on Linux as well. On the Mac OS X side of things – it all depends on Apple. Cupertino still doesn’t support SLI, and has so far refused to make switching as (relatively) seamless as it is on Windows (the hardware supports it), so I wouldn’t be surprised if it takes a while for Apple to catch up.