Linked by Thom Holwerda on Tue 9th Feb 2010 16:18 UTC
Notebooks with dual GPUs have been shipping for a while now, but switching between the fancy discrete GPU and the low-power integrated one hasn't exactly been painless. Today, NVIDIA introduced a technology called Optimus, which makes the switching process automatic and transparent. In Windows, that is.
Mac OS X Drivers
by mckill on Tue 9th Feb 2010 16:34 UTC
mckill
Member since:
2007-06-12

You make it sound like Apple writes its own Nvidia drivers. It doesn't; it's all Nvidia.

Reply Score: 1

RE: Mac OS X Drivers
by PlatformAgnostic on Tue 9th Feb 2010 16:57 UTC in reply to "Mac OS X Drivers"
PlatformAgnostic Member since:
2006-01-02

Support is still needed higher up in the graphics stack to make this work.

Reply Score: 2

RE[2]: Mac OS X Drivers
by Thom_Holwerda on Tue 9th Feb 2010 17:02 UTC in reply to "RE: Mac OS X Drivers"
Thom_Holwerda Member since:
2005-06-29

I can't find the link at the moment, but NVIDIA once said that all the support is there, and it's now only Apple that can bring it to Mac OS X.

Reply Score: 1

Xorg development
by manjabes on Tue 9th Feb 2010 17:18 UTC
manjabes
Member since:
2005-08-27

"However, with the pace of Xorg and Linux development, the situation could already be different today."

No, it is not. Be honest. Xorg is still struggling with multi-display support. Heck, it STILL, after 20+ years of development, struggles with a single goddamn display. The last two times (~1 yr each) I tried using a Linux-based system full-time, the several-a-day Xorg crashes convinced me all over again that my time was worth too much to spend tinkering with the steaming pile that is Xorg. Still the same, every time I try. And that's on a system that's supposed to work well with Linux. But not even the mighty open-source ecosystem has managed to fix this. Time is only spent crying about how hardware manufacturers create shitty (and/or closed-source) drivers.

Linux, yes, is developing at a fast pace. But not Xorg.

Edited 2010-02-09 17:20 UTC

Reply Score: 4

RE: Xorg development
by CoolGoose on Tue 9th Feb 2010 18:10 UTC in reply to "Xorg development"
CoolGoose Member since:
2005-07-06

I'm really trying to figure out how the heck you have problems with Xorg on a single display, or even on two (if you count proprietary drivers).

Any kind of "special" hardware?

Reply Score: 1

RE[2]: Xorg development
by Bill Shooter of Bul on Tue 9th Feb 2010 18:45 UTC in reply to "RE: Xorg development"
Bill Shooter of Bul Member since:
2006-07-14

Yeah, I haven't had X crash on me since '97. Is it because I'm lucky, or because I only use open source drivers?

Reply Score: 9

RE[3]: Xorg development
by CoolGoose on Tue 9th Feb 2010 18:58 UTC in reply to "RE[2]: Xorg development"
CoolGoose Member since:
2005-07-06

I'm only using the Nvidia proprietary ones on two PCs, one of which is my daily work machine.

Reply Score: 2

RE[2]: Xorg development
by manjabes on Wed 10th Feb 2010 14:12 UTC in reply to "RE: Xorg development"
manjabes Member since:
2005-08-27

"Any kind of "special" hardware?"

I wouldn't consider an Intel integrated graphics chip that special.
It is also rather ironic that I specifically bought a new computer with an Intel graphics chip rather than a discrete nVidia one, just so that it would work great with Linux. I even suppressed my inner urge to play games on my computer, just for that. And as it turns out, now I'm on Windows with a sub-par graphics chip anyway.

Kinda explains my poisonous attitude, though...

Reply Score: 2

RE[3]: Xorg development
by boomn on Thu 11th Feb 2010 17:36 UTC in reply to "RE[2]: Xorg development"
boomn Member since:
2010-02-11

Xorg itself probably isn't unstable in this case; Intel's driver is.

Intel certainly has a good reputation for releasing open source drivers for its chips, but some recent cards have apparently been accompanied by poorly written drivers.

Reply Score: 1

RE: Xorg development
by Tuxie on Tue 9th Feb 2010 18:11 UTC in reply to "Xorg development"
Tuxie Member since:
2009-04-22

"Time is only spent on crying how hw manufacturers create shitty (AND/OR closed-source) drivers."

Guess why?

Xorg could create a thousand killer features and write millions of lines of perfectly bug-free code that does everything anyone could ever dream of, but it still wouldn't mean shit to most end users as long as they rely on shitty, closed-source drivers that want to do things their own way and don't keep up with the development pace.

Yet you put all the blame on Xorg...

Are you really that surprised that people are frustrated?

Reply Score: 6

RE[2]: Xorg development
by google_ninja on Thu 11th Feb 2010 20:07 UTC in reply to "RE: Xorg development"
google_ninja Member since:
2006-02-05

He has an Intel card, so it's shitty, open-source drivers for shitty, open hardware.

Reply Score: 2

RE: Xorg development
by cerbie on Wed 10th Feb 2010 06:59 UTC in reply to "Xorg development"
cerbie Member since:
2006-01-02

Don't forget the crappy open source drivers, and driver-level features that should be core X features.

X is a minefield of crappy politics, even without closed source drivers, which would be very easy to deal with (stop supporting old crap, and let nVidia and everyone else catch up, if they want).

Also, isn't it funny how a Windows+nVidia news item becomes about X and FOSS?

Edited 2010-02-10 07:00 UTC

Reply Score: 3

RE: Xorg development
by jabjoe on Wed 10th Feb 2010 11:28 UTC in reply to "Xorg development"
jabjoe Member since:
2009-05-06

Urg, please people, read up on Xorg and what is happening.
There is a lot happening. There is crying about closed source slowing things down, but there is still lots of action. The biggest item is drivers being moved out of X into the kernel; that's what the big fuss over KMS, DRM and Gallium3D is about. NVidia isn't joining in because it doesn't fit with their cross-platform driver development, as it would make the Linux driver completely different from the drivers on other platforms. But this matters less and less, because Nouveau is really picking up speed now and is increasingly becoming a viable alternative. The Nouveau drivers are keeping up with Xorg development. Once the drivers are out of X, the code base shrinks massively and X won't need to run as root anymore. All of this makes it easier to write X reimplementations or alternatives, like Wayland.
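
As a concrete illustration of what "drivers in the kernel" buys userspace (a sketch of mine, not code from the post): with KMS/DRM, an ordinary program can talk to the display hardware through libdrm, with no X server in the loop at all. This assumes a KMS-capable driver exposing /dev/dri/card0:

    /* list_connectors.c -- build with:
     *   gcc list_connectors.c $(pkg-config --cflags --libs libdrm)
     */
    #include <stdio.h>
    #include <fcntl.h>
    #include <unistd.h>
    #include <xf86drm.h>
    #include <xf86drmMode.h>

    int main(void)
    {
        /* the device node is created by the in-kernel DRM driver */
        int fd = open("/dev/dri/card0", O_RDWR);
        if (fd < 0) { perror("open"); return 1; }

        drmModeRes *res = drmModeGetResources(fd);
        if (!res) { fprintf(stderr, "no KMS support\n"); close(fd); return 1; }

        /* walk the connectors (VGA, LVDS, DVI, ...) the kernel knows about */
        for (int i = 0; i < res->count_connectors; i++) {
            drmModeConnector *conn = drmModeGetConnector(fd, res->connectors[i]);
            if (!conn) continue;
            printf("connector %u: %s\n", conn->connector_id,
                   conn->connection == DRM_MODE_CONNECTED ? "connected"
                                                          : "disconnected");
            drmModeFreeConnector(conn);
        }
        drmModeFreeResources(res);
        close(fd);
        return 0;
    }

This is exactly the kind of work that previously required a root-privileged X server poking hardware from userspace.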

Xorg is not lacking multi-display support; I've been using it at home for years. Normally it's all done with XRandR now, apart from the NVidia closed drivers, and as I said, that won't matter for much longer, as they won't be needed.
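
For what the XRandR point looks like in practice (again a sketch of mine, using the standard libXrandr calls), a RandR 1.2 client can enumerate outputs and their connection state at runtime:

    /* list_outputs.c -- build with: gcc list_outputs.c -lX11 -lXrandr */
    #include <stdio.h>
    #include <X11/Xlib.h>
    #include <X11/extensions/Xrandr.h>

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy) { fprintf(stderr, "cannot open display\n"); return 1; }

        /* RandR 1.2: per-output objects instead of one static screen */
        XRRScreenResources *res =
            XRRGetScreenResources(dpy, DefaultRootWindow(dpy));

        for (int i = 0; i < res->noutput; i++) {
            XRROutputInfo *out = XRRGetOutputInfo(dpy, res, res->outputs[i]);
            printf("output %s: %s\n", out->name,
                   out->connection == RR_Connected ? "connected"
                                                   : "disconnected");
            XRRFreeOutputInfo(out);
        }
        XRRFreeScreenResources(res);
        XCloseDisplay(dpy);
        return 0;
    }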

Reply Score: 2

A million times worse?
by Envying1 on Tue 9th Feb 2010 18:38 UTC
Envying1
Member since:
2008-04-22

From what you have described, I don't see why Apple is a million times worse! Without knowing whether Apple has implemented it or not, and what it would take for them to do so, your complaint is baseless.

Making the switch easily controllable from the hardware side really is a step Nvidia should have taken, so that all the other companies and groups could then implement it on the OS side.

Edited 2010-02-09 18:44 UTC

Reply Score: 1

Always a kludge.
by Drumhellar on Tue 9th Feb 2010 18:44 UTC
Drumhellar
Member since:
2005-07-12

I've always thought using a dual-GPU setup was kinda kludgey.

Wouldn't it be better to enable/disable separate cores on the GPU individually based on usage, rather than have two cores, one of which is completely on and the other just idling?

AMD and Intel CPUs can do this.
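
Purely to sketch the idea (every name below is invented; real power gating would live in the GPU's firmware or driver, not in code like this), such a governor could scale the number of active core clusters with load, the way CPU frequency governors already do:

    /* toy_governor.c -- hypothetical per-cluster GPU power gating.
     * read_total_utilization() and set_cluster_power() are made-up
     * stand-ins for whatever the hardware would actually expose. */
    #include <stdbool.h>

    #define NUM_CLUSTERS 8

    extern double read_total_utilization(void);             /* hypothetical, 0.0-1.0 */
    extern void   set_cluster_power(int cluster, bool on);  /* hypothetical */

    void governor_tick(void)
    {
        double load = read_total_utilization();
        /* keep at least one cluster powered for the desktop */
        int want = 1 + (int)(load * (NUM_CLUSTERS - 1));

        for (int c = 0; c < NUM_CLUSTERS; c++)
            set_cluster_power(c, c < want);  /* gate the idle clusters */
    }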

Reply Score: 3

Linux Optimus not gonna happen
by _txf_ on Tue 9th Feb 2010 19:11 UTC
_txf_
Member since:
2008-03-17

It would be an absolute minefield to try to get this working with Xorg. Intel drivers are now in the kernel; Nvidia drivers are not. As I understand it (I could be wrong), the Nvidia driver loads the IGP drivers so that it is able to use standard API calls to pass stuff around.

Then there is the fact that, as good as the Nvidia driver is on Linux in stability and performance, in infrastructure (KMS, RandR 1.2, responsive PowerMizer) it is awful.

Sigh, I guess it is time for me to move to OS X, as it will get this eventually, possibly in the next MacBook Pro refresh. I like having a massive amount of battery time, but sometimes I really need power. Until now Linux has been fairly comparable; it won't be anymore (at least not for a while).

Edited 2010-02-09 19:24 UTC

Reply Score: 3

RE: Linux Optimus not gonna happen
by jabjoe on Wed 10th Feb 2010 11:32 UTC in reply to "Linux Optimus not gonna happen"
jabjoe Member since:
2009-05-06

Watch the Nouveau driver development. It already addresses some of your issues with the closed Nvidia drivers, which is part of why there is such pressure to replace them.

Reply Score: 1

RE[2]: Linux Optimus not gonna happen
by lemur2 on Wed 10th Feb 2010 in reply to "RE: Linux Optimus not gonna happen"
lemur2 Member since:
2007-02-17

"Watch the Nouveau driver development. It already addresses some of your issues with the closed Nvidia drivers, which is part of why there is such pressure to replace them."


xf86-video-nouveau
http://nouveau.freedesktop.org/wiki/
http://nouveau.freedesktop.org/wiki/FeatureMatrix

xf86-video-ati
http://www.x.org/wiki/radeon
http://www.x.org/wiki/RadeonFeature

and xf86-video-intel
http://intellinuxgraphics.org/

All three of the main open source video drivers for Linux support RandR 1.2 and KMS.

http://www.phoronix.com/scan.php?page=article&item=927
"Some of the common free software drivers that support RandR 1.2 include Nouveau, nv, xf86-video-ati, Avivo, and last but not least RadeonHD."

"RandR is the "Resize and Rotate" extension in X.Org and the v1.2 update introduces new functionality such as dynamic hot-plugging support for display devices."

http://intellinuxgraphics.org/dualhead.html
"This guide is targeted for people who want to use extended desktop mode on two outputs. Clone mode should work out-of-box with a normal configuration."

"With RandR 1.2, you can setup dual head and add/remove monitor dynamically (i.e. on-the-fly, without restarting X)."
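
To illustrate the "add/remove monitor dynamically" part (my own sketch against the standard libXrandr API), a client can subscribe to RandR screen-change events and react when a monitor is plugged in, with no X restart:

    /* watch_outputs.c -- build with: gcc watch_outputs.c -lX11 -lXrandr */
    #include <stdio.h>
    #include <X11/Xlib.h>
    #include <X11/extensions/Xrandr.h>

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy) return 1;

        int event_base, error_base;
        if (!XRRQueryExtension(dpy, &event_base, &error_base)) return 1;

        /* ask the server to notify us of screen configuration changes */
        XRRSelectInput(dpy, DefaultRootWindow(dpy), RRScreenChangeNotifyMask);

        for (;;) {
            XEvent ev;
            XNextEvent(dpy, &ev);
            if (ev.type == event_base + RRScreenChangeNotify) {
                XRRUpdateConfiguration(&ev);  /* refresh Xlib's cached geometry */
                printf("screen configuration changed\n");
            }
        }
    }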


Linux desktops have configuration settings for multiple-monitor set-ups:

http://ourlan.homelinux.net/qdig/?Qwd=./KDE4_desktop&Qif=Resize_and...

My, but there is a lot of FUD being spread these days concerning Linux and its open source drivers.

Edited 2010-02-10 13:18 UTC

Reply Score: 1

Comment by jal_
by jal_ on Wed 10th Feb 2010 17:13 UTC
jal_
Member since:
2006-11-02

"The system works by basically having the discrete GPU send its framebuffer contents to that of the integrated GPU as soon as the former is powered on."


I know this is taken from the PCWorld article (I can't find any mention in the NVIDIA press releases), but it seems kinda silly: even though the fast GPU is "discrete", wouldn't it use main memory for its framebuffer? In which case there's no need to copy anything.
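
For illustration only (my sketch; the names are invented, and in reality this would be a DMA transfer over PCIe set up by the driver, not a CPU loop), the copy the article describes amounts to blitting the discrete GPU's framebuffer into the IGP's scanout buffer row by row:

    /* Hypothetical: dgpu_fb points at the discrete GPU's framebuffer
     * (in its local VRAM), igp_fb at the IGP's scanout buffer in system
     * memory. Pitches are in pixels. */
    #include <stdint.h>
    #include <string.h>

    void blit_to_igp(const uint32_t *dgpu_fb, uint32_t *igp_fb,
                     int width, int height, int dgpu_pitch, int igp_pitch)
    {
        for (int y = 0; y < height; y++)
            memcpy(igp_fb  + (size_t)y * igp_pitch,
                   dgpu_fb + (size_t)y * dgpu_pitch,
                   (size_t)width * sizeof(uint32_t));
    }

If the discrete GPU really did render into main memory, as the comment suggests, this step would be pointless, which is exactly the question being raised.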

Reply Score: 1