Linked by Thom Holwerda on Wed 4th Dec 2013 18:06 UTC
Linux

"Joining the Linux Foundation is one of many ways Valve is investing in the advancement of Linux gaming," Mike Sartain, a key member of the Linux team at Valve said. "Through these efforts we hope to contribute tools for developers building new experiences on Linux, compel hardware manufacturers to prioritize support for Linux, and ultimately deliver an elegant and open platform for Linux users."

Mark my words: Valve will do for Linux gaming what Android did for Linux mobile. Much crow will be eaten by naysayers in a few years.

Android did for Linux?
by Ithamar on Wed 4th Dec 2013 19:15 UTC
Ithamar
Member since:
2006-03-20

You mean fork Linux (the kernel), use as little GPL software as possible, and minimise feeding changes back to upstream?

I sincerely hope (and looking at TFA, suspect) Valve will do much better ;-)

Reply Score: 5

RE: Android did for Linux?
by kurkosdr on Wed 4th Dec 2013 20:09 UTC in reply to "Android did for Linux?"
kurkosdr Member since:
2011-04-11

You mean fork Linux (the kernel), use as little GPL software as possible, and minimise feeding changes back to upstream?

You mean avoid dealing with the Linux upstream cabal in order to ship the product on time, scrapping the 20-year-old cruft sitting above the Linux kernel (X.org, the mess that is Linux audio) while building a more streamlined userland better suited to smartphones, and getting their patches rejected by the upstream cabal (wakelocks)?

Nokia tried to play by the rules with MeeGo, and look what happened to them. Everyone was shipping their new mobile OSes by Q4 2010; Nokia wanted to put MeeGo in the Nokia N8, but MeeGo wasn't ready, so the N8 shipped with freaking Symbian. Nokia themselves admitted "technical difficulties" with MeeGo as the reason for the huge delays. By the time they had hammered out the issues, the investors were getting impatient and Elop happened. Android would have been yet another perpetually-in-beta no-show if it had been based on "real Linux".

Reply Score: 4

RE[2]: Android did for Linux?
by Ithamar on Wed 4th Dec 2013 21:33 UTC in reply to "RE: Android did for Linux?"
Ithamar Member since:
2006-03-20

You mean avoid dealing with the Linux upstream cabal in order to ship the product on time, scrapping the 20-year-old cruft sitting above the Linux kernel (X.org, the mess that is Linux audio) while building a more streamlined userland better suited to smartphones, and getting their patches rejected by the upstream cabal (wakelocks)?


I have enough real-life experience to know that _being_ in upstream at product release might not be feasible (though it is possible), but not working with upstream at all, even after the fact, is quite different.

What was released on the first device had been years in the making, and in that time a lot could have been done with upstream. And even failing that, doing it after the fact is still very doable.

Nokia tried to play by the rules with MeeGo, and look what happened to them. Everyone was shipping their new mobile OSes by Q4 2010; Nokia wanted to put MeeGo in the Nokia N8, but MeeGo wasn't ready, so the N8 shipped with freaking Symbian. Nokia themselves admitted "technical difficulties" with MeeGo as the reason for the huge delays. By the time they had hammered out the issues, the investors were getting impatient and Elop happened. Android would have been yet another perpetually-in-beta no-show if it had been based on "real Linux".


Nokia was shipping "tablet" products before Android even existed ;) The fact that they screwed up says more about Nokia management than anything else....

Reply Score: 8

RE[2]: Android did for Linux?
by oiaohm on Wed 4th Dec 2013 23:44 UTC in reply to "RE: Android did for Linux?"
oiaohm Member since:
2009-05-30

kurkosdr, even the Google developers behind wakelocks ended up openly acknowledging the major defects in them. People like you also overlook that Motorola submitted its quickwakeup patches around the same time, and that there was already an existing power-management framework in the Linux kernel (earlier Linux mobile-phone projects are partly responsible for that).

So it was not just Linux upstream. The Android makers were fighting among themselves over how wakelocks should be implemented. The flamefest was not simply the Linux community vs. Google; it was Google vs. Motorola vs. other Android makers vs. the Linux community. The good thing that came out of it was a stack of test cases.

http://www.elinux.org/Android_Power_Management - long and brutal.

https://lwn.net/Articles/479841/ Linux 3.5 in the end merged almost all the features of both solutions, compatible with the framework already in the kernel, and it passes all the failure cases that could ruin you under Google's wakelocks, Motorola's quickwakeup, or suspend blockers.
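
For reference, the in-kernel API that came out of that merge looks roughly like this; a hedged sketch only, with the driver around the calls invented for illustration (__pm_stay_awake()/__pm_relax()/wakeup_source_register() are the real kernel interfaces):

```c
/* Sketch of the mainline wakeup-source API. The wakeup-source calls
 * are real kernel interfaces; the demo driver is invented. */
#include <linux/module.h>
#include <linux/pm_wakeup.h>

static struct wakeup_source *ws;

static void handle_wakeup_event(void)
{
	__pm_stay_awake(ws);   /* keep the system awake while work is pending */
	/* ... process the event that woke us ... */
	__pm_relax(ws);        /* let autosleep proceed again */
}

static int __init demo_init(void)
{
	ws = wakeup_source_register("demo");  /* 3.x-era signature */
	if (!ws)
		return -ENOMEM;
	handle_wakeup_event();
	return 0;
}

static void __exit demo_exit(void)
{
	wakeup_source_unregister(ws);
}

module_init(demo_init);
module_exit(demo_exit);
MODULE_LICENSE("GPL");
```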

Sorry, being based on real Linux does not have to mean a no-show. In fact it never did.

www.openmoko.org, kurkosdr, pre-dates the existence of MeeGo or Android. Openmoko is a full GNU/Linux phone, X11 included, released on time and functional.

Nokia was trying to make MeeGo source-compatible with Symbian applications.

Even with all the crud of X11, a mobile phone was more than possible.

So how can you say Android would have been yet another perpetually-in-beta no-show if it had been based on real Linux? Google was building its own new framework in the Openmoko style; that would have worked, were it not for another reason.

Android was designed to be less restrictively licensed. Google did a lot of things to avoid restrictive licensing; some of them got it sued by Sun. The major barrier to Openmoko-style Linux was phone makers' fear of the GPL at the time.

Sailfish OS, the descendant of MeeGo, is going forward quite well now, particularly since it does not carry the compatibility requirements Nokia was asking for.

Just go and look at what Nokia did to Symbian. When they worked out that the Linux kernel could not be bent to emulate Symbian, they tried wrapping Qt over it.

Nokia was not exactly playing by the rules with MeeGo either. Magical patch submissions without a decent explanation of why caused a lot of them to bounce.

Nokia's big problem with MeeGo was attempting to make two incompatibly designed OSes use each other's application source code without alteration, which cost more time than rewriting particular applications from scratch would have.

Overall, most of Google's Android patches did merge. There were just a few key ones that did not.

Reply Score: 9

RE[3]: Android did for Linux?
by Slambert666 on Thu 5th Dec 2013 06:48 UTC in reply to "RE[2]: Android did for Linux?"
Slambert666 Member since:
2008-10-30

Yeah, dude.
That diatribe is an unholy mix of rewriting history, strawman attacks and plain old making stuff up...

www.openmoko.org, kurkosdr, pre-dates the existence of MeeGo or Android. Openmoko is a full GNU/Linux phone, X11 included, released on time and functional.

Nokia was trying to make MeeGo source-compatible with Symbian applications.

Just go and look at what Nokia did to Symbian. When they worked out that the Linux kernel could not be bent to emulate Symbian, they tried wrapping Qt over it.

Openmoko was not very good, and excessively buggy compared to other systems at the time.

The OP is completely correct in the assessment that had Nokia chosen not to cooperate with upstream, the project would have been finished earlier and might have been successful.

Your argument that Nokia failed because they were trying to make the "linux kernel compatible with QT" is just plain strange...

Reply Score: 4

RE[4]: Android did for Linux?
by oiaohm on Thu 5th Dec 2013 07:31 UTC in reply to "RE[3]: Android did for Linux?"
oiaohm Member since:
2009-05-30

Your argument that Nokia failed because they were trying to make the "linux kernel compatible with QT" is just plain strange...


You have misread. Nokia attempted to make MeeGo source-compatible with Symbian application source code. That project turned into a complete failure and made sections of the MeeGo ABI stupidly complex.

When this failed, Nokia attempted to move all Symbian developers to Qt. Basically, two complete disasters.

Most of Nokia's delays with MeeGo were not upstream issues; they came from attempting to be sideways-compatible between MeeGo and Symbian.

Openmoko, funnily enough (go read the reviews), was not that buggy, mostly because all the applications on it had been built specifically for it.

Android has the same thing: all its applications are built specifically for it.

MeeGo had these huge porting efforts; in other words, a recipe for digging yourself into a hole.

With MeeGo, Nokia did not follow what Openmoko and Android did. That is why Nokia ended up in hell.

We have seen the same problem before, with Lindows. Remember how it was going to use Wine to port every single application? Of course that turned into a nightmare from hell as well.

Reply Score: 2

RE[5]: Android did for Linux?
by leech on Thu 5th Dec 2013 15:48 UTC in reply to "RE[4]: Android did for Linux?"
leech Member since:
2006-01-10

Everyone so far is forgetting Maemo. The real reason 'MeeGo' had so many issues is that it had so many different origins.

So basically Nokia went from Maemo being GTK-based, to working with Intel to merge Maemo and Moblin into MeeGo, then pushing Qt as the basis (since Nokia had purchased Trolltech), and on top of that the aforementioned Symbian/MeeGo/Qt combination.

I know the version of Qt Creator for the Harmattan SDK has build buttons for Windows/Mac/Linux/Symbian/Harmattan. Newer ones add Android and BlackBerry, and Jolla has modified it for their SailfishOS SDK. So for the most part the compatibility layer was working (at least as far as I can tell; I haven't gotten that far in my studies yet).

Another big kicker to the "MeeGo isn't ready yet!" line is that full-on MeeGo was rpm-based (from Moblin), whereas MeeGo Harmattan, as on the N9/N950, is deb-based (from Maemo).

Android didn't use GTK or Qt or even the standard libc. All they did was grab the Linux kernel.

There was a blog post that I can't seem to find again about how much better the N9's power management is compared to Android's, due to the way its wakelocks are more dynamic or something.

Either way, Nokia definitely was trying to work more with the Linux developers, and their code managed to get into the kernel.

Reply Score: 3

RE[6]: Android did for Linux?
by shmerl on Fri 6th Dec 2013 18:04 UTC in reply to "RE[5]: Android did for Linux?"
shmerl Member since:
2010-06-08

The real reason 'MeeGo' had so many issues is that it had so many different origins.


Rather, it had many issues because the project never got the chance to fix them before both Nokia and Intel ditched it. Any transition is disruptive, and Maemo switching to MeeGo was disruptive as well. GTK to Qt and dpkg to rpm were technicalities, but ones that required effort to integrate. And while MeeGo was ditched, it was forked as Mer, which uses Qt, rpm and the rest of the MeeGo heritage successfully.

Edited 2013-12-06 18:04 UTC

Reply Score: 2

RE[3]: Android did for Linux?
by diegoviola on Thu 5th Dec 2013 22:57 UTC in reply to "RE[2]: Android did for Linux?"
diegoviola Member since:
2006-08-15

Qt.

Get it right.

Reply Score: 3

RE[3]: Android did for Linux?
by zima on Tue 10th Dec 2013 00:31 UTC in reply to "RE[2]: Android did for Linux?"
zima Member since:
2005-07-06

Sailfish OS, the descendant of MeeGo, is going forward quite well now.

Sailfish is beta quality at best, unstable... and shipping. That approach didn't play out well for MeeGo on the N9, either.

Reply Score: 2

Comment by shmerl
by shmerl on Wed 4th Dec 2013 19:55 UTC
shmerl
Member since:
2010-06-08

Valve will do for Linux gaming what Android did for Linux mobile.


I hope Valve's SteamOS will be better than Android was on mobile. Valve didn't create a split and hopefully will be using a regular glibc Linux stack, including the graphics (Wayland etc.), so it will benefit all Linux distros at large in the long run. Android, on the other hand, created a hard split and benefits only itself.

Reply Score: 4

RE: Comment by shmerl
by 1c3d0g on Wed 4th Dec 2013 20:29 UTC in reply to "Comment by shmerl"
1c3d0g Member since:
2005-07-06

If the Linux folks can't even agree on something as basic as a f*cking graphics server (X, Wayland, Mir, etc.), I say don't hold your breath. Let's not even get started on the audio stack.

Don't get me wrong, I like choices too, but at the end of the day, IF you want to be taken seriously and have success with your endeavours, one project must be chosen above all and all efforts focused on improving that particular choice, instead of the bickering and infighting that's going on right now.

Reply Score: 2

RE[2]: Comment by shmerl
by woegjiub on Wed 4th Dec 2013 21:22 UTC in reply to "RE: Comment by shmerl"
woegjiub Member since:
2008-11-25

The Linux folks *are* in agreement, if you just ignore one company: Canonical.

X is to be succeeded by Wayland, due to X's inherent flaws.

systemd modernises system administration while maintaining modularity.

The main thing needed now is for Pulse to usurp the use cases of JACK and Pulse-less ALSA.

The one thing Canonical are doing right is switching to Qt5 with QML, but GTK3 isn't that bad, and the situation is not too different from MS's plethora of GUI options.

Reply Score: 6

RE[3]: Comment by shmerl
by vicdavery on Thu 5th Dec 2013 08:48 UTC in reply to "RE[2]: Comment by shmerl"
vicdavery Member since:
2012-11-30

Overall I do agree that the Linux audio stack is a mess, and embarrassing after all this time.

However, I would much rather see Pulse disappear and be replaced by JACK.
Pulse has caused nothing but problems since day 1. In my opinion JACK is far superior, not just on latency but also on flexibility.
I can't ever see Pulse being able to cover the low-latency area and support the pro-audio segment.

Reply Score: 2

RE[4]: Comment by shmerl
by woegjiub on Thu 5th Dec 2013 11:32 UTC in reply to "RE[3]: Comment by shmerl"
woegjiub Member since:
2008-11-25

Pulse does have more momentum, and more features for end users as opposed to pro audio (it makes it so easy to set per-application volume, stream audio, etc.).

That's the main reason I thought it would be the better candidate for consuming the other; it doesn't really matter which one does it, there simply needs to be an overhaul of the stack, Wayland/systemd-style.

Out with the old, and in with something designed for all modern use cases, with the knowledge of how a modern stack should look.
The effort will be gargantuan, but most people seem to acknowledge it's necessary, even with the vast improvements that have been made over how things were even half a decade ago.

Edited 2013-12-05 11:34 UTC

Reply Score: 2

RE[2]: Comment by shmerl
by oiaohm on Thu 5th Dec 2013 00:13 UTC in reply to "RE: Comment by shmerl"
oiaohm Member since:
2009-05-30

If the Linux folks can't even agree on something as basic as a f*cking graphics server (X, Wayland, Mir, etc.), I say don't hold your breath. Let's not even get started on the audio stack.


OK, what "etc."? X11, due to its age and design, without question has to die. Wayland is from the X.org project and is the designated successor to X11. Wayland has the support of all but one of the major Linux desktop environments and of all the major video card makers.

Mir/Unity/Ubuntu is the only breakaway. It has no support from anyone other than Ubuntu, video card makers included. The odds of long-term success are low.

So, 1c3d0g, what other "etc."s? There is nothing else.

Let's move on to audio. Are you aware that back in 1996 there were 12 POSIX audio servers in use on Linux, all incompatible with each other? I mean 100 percent incompatible. Today we are down to 3: PulseAudio, JACK and AudioFlinger (Android only). They support very different needs, and PulseAudio and JACK have a cooperation interface.

Yes, a cooperation interface between sound servers was unthinkable in 1996. Why? Because the developers had your foolish logic: "you must choose one" equals "you don't have to cooperate with competing projects".

1c3d0g, there is very little real bickering and infighting. The Mir business is mostly Ubuntu not wanting to admit that the path they have taken is a no-go.

One project being chosen above all others does not happen in the FOSS world. What happens is that the weaker one slowly shows itself and gets crushed.

PulseAudio is in fact a merge of technology from 6 different sound-server projects.

1c3d0g, think about it this way: would it matter if we had 100 different sound servers and 100 different graphics servers, if application developers only had to worry about 1 ABI/API that worked on them all? A universal ABI/API requires cooperation and test cases, not this "you must choose one" point of view. And cooperation, as happened with PulseAudio, can lead to projects merging and the count shrinking.

Yes, I would like to take the heads of the Linux graphics and audio worlds, lock them in a room with no way out, and not let them leave until they reached universal consensus.

1c3d0g, basically you have to stop repeating garbage. The Linux world is more in consensus today than it was 5 or 10 years ago. OK, the noise being generated is louder, but the noise from the Wayland camp is so loud precisely because they don't want that consensus disrupted.

Reply Score: 4

RE[3]: Comment by shmerl
by Brendan on Thu 5th Dec 2013 04:41 UTC in reply to "RE[2]: Comment by shmerl"
Brendan Member since:
2005-11-16

Hi,

These are (were?) symptoms of a larger problem.

It doesn't matter if it's X vs. Wayland, or KDE vs. GNOME, or Qt vs. GTK, or Python 2 vs. Python 3, or various different package managers, or changes in "/dev", or different init/runlevel scripts, or different cron daemons, or....

It's a loose collection of pieces that happen to work together sometimes (if distro maintainers throw massive quantities of effort into fixing all the breakage, and most software handles the hassle of many different alternative dependencies); it is not a single consistent set of pieces that were all designed to work together. There is nobody in charge who can say "this is the standard from now on", or "these are the APIs/libraries that will remain supported for the next n years", or "yes, this alternative is technically better, but not enough to justify the compatibility problems it will cause, and therefore it will not be included in any distribution".

- Brendan

Reply Score: 4

RE[4]: Comment by shmerl
by Fergy on Thu 5th Dec 2013 06:00 UTC in reply to "RE[3]: Comment by shmerl"
Fergy Member since:
2006-04-10

[q]I hate linux, linux is stupid.

- Brendan[/q]

Reply Score: 2

RE[5]: Comment by shmerl
by lucas_maximus on Thu 5th Dec 2013 08:16 UTC in reply to "RE[4]: Comment by shmerl"
lucas_maximus Member since:
2009-08-18

Way to go Fergy ... what a great argument.

This is the problem with the Linux community: it cannot and will not listen to any criticism, no matter how valid (and no, I am not spending my free time contributing when most projects are quite caustic).

Edited 2013-12-05 08:18 UTC

Reply Score: 1

RE[6]: Comment by shmerl
by Alfman on Thu 5th Dec 2013 09:11 UTC in reply to "RE[5]: Comment by shmerl"
Alfman Member since:
2011-01-28

lucas_maximus,

"This is the problem with the Linux community, cannot and will not listen to any criticism no matter how valid."

Well, some people don't like hearing criticism of the communities they belong to. However, it's no more true for Linux than for other communities: you get the same kind of flak for criticizing Microsoft and Apple.


For what it's worth, I think Linux should adopt a stable ABI. I've given it a lot of thought, and to my mind a good compromise exists: make the ABIs stable across minor versions and allow them to break between major ones.

Keep in mind that even though the Windows kernel maintains a long-term kernel API/ABI, Microsoft has nevertheless broken many existing drivers using those stable APIs through new kernel restrictions (especially with Vista/7). Even with Win8 I discovered that the DameWare mirror driver broke coming from Win7. So from a practical user's point of view, Windows users sometimes go through similar kernel-driver breakage (regardless of the underlying cause).


So, for a Linux example: a driver could be compatible with a specific major version, like Linux 3.x, and a new driver build would be needed for Linux 4.x. This gives Linux most of the benefits of stable interfaces. Additionally, Linux would not get stuck with long-term cruft in legacy interfaces that no longer make sense or aren't optimal for exposing the features of new hardware. Hardware manufacturers could stop worrying about individual kernel builds, only the major ones.
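
A sketch of what that could look like for an out-of-tree module, assuming a hypothetical per-major-series stable interface (LINUX_VERSION_CODE and KERNEL_VERSION() are real kernel macros; everything else here is illustrative):

```c
/* Hypothetical: one driver build per major kernel series, as the
 * compromise above proposes. The version macros are real; the
 * "stable per-series interface" is invented for the sketch. */
#include <linux/module.h>
#include <linux/version.h>

#if LINUX_VERSION_CODE >= KERNEL_VERSION(4, 0, 0)
/* Built against the (hypothetical) stable 4.x driver interface. */
static int mydrv_attach(void) { return 0; }
#elif LINUX_VERSION_CODE >= KERNEL_VERSION(3, 0, 0)
/* Same driver, built against the 3.x interface. */
static int mydrv_attach(void) { return 0; }
#else
#error "mydrv: unsupported kernel series"
#endif

static int __init mydrv_init(void) { return mydrv_attach(); }
static void __exit mydrv_exit(void) { }

module_init(mydrv_init);
module_exit(mydrv_exit);
MODULE_LICENSE("GPL");
```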

I think this is a very reasonable approach, but alas I am not in charge.

Reply Score: 3

RE[7]: Comment by shmerl
by lucas_maximus on Thu 5th Dec 2013 09:30 UTC in reply to "RE[6]: Comment by shmerl"
lucas_maximus Member since:
2009-08-18

Tell Linus that.

Reply Score: 4

RE[7]: Comment by shmerl
by oiaohm on Thu 5th Dec 2013 10:35 UTC in reply to "RE[6]: Comment by shmerl"
oiaohm Member since:
2009-05-30

Alfman, let's take a Linux example of an equally broken driver: ATI, in the case where the Big Kernel Lock was removed.

Was it possible to get the ATI drivers working again on Linux, with the same binary blobs that were used in the past? Yes, it was. Why? Because there were interfaces allowing wrappers to be placed on top of the driver to accommodate the new kernel limitation. Did this require AMD or ATI to fix the issue on abandoned hardware? No, it did not.

That is the big problem going from Windows 7 to Windows 8: how do you fix things to get your hardware working again?

The binary blob with a source wrapper, which Linux classes as its minimum acceptable driver, allows drivers to keep operating longer. Even Nvidia's 8K page-size assumption, which caused failures on kernels built with 4K pages, could have been wrapped over if Nvidia had never fixed the driver.

There is case after case where mostly-binary-blob Linux drivers have been brought back to life by altering the wrapper code over them.

Hardware makers also don't worry that much about particular kernels with Linux. If one kernel version does not work, they tell users to run a different one for a while.

The Nvidia and AMD wrapped binary blobs are compatible with more than just 3.x; they in fact support the 2.6 and 2.4 kernels as well. So yes, all the way back to 2001.

Any API on the stable list that is exported from kernel space to userspace cannot be broken; it must keep functioning exactly the way it used to.

The decision to forbid pure-blob drivers keeps the kernel cleaner. If there is an issue with a blob on Linux, the wrapper that fixes it only has to be loaded when the driver requiring it is in use.

Before you can consider binary-only drivers with Linux, something else needs to be done: implement the means to re-link a binary blob, so that wrappers can be inserted the way they currently are.

Yes, your Windows 7 to Windows 8 problem also comes about because Microsoft's Windows driver tools have no way to relink a fix object into an existing driver.

Linus will not allow the change unless the same quality of support for old, maker-abandoned hardware can be maintained after the change.

Alfman, the Linux kernel back in 2.2 did have binary driver support. The feature was intentionally disabled, for a list of reasons:
1. Different gcc versions passed arguments differently.
2. Alterations to in-kernel APIs would leave no way to repair a driver, forcing either the kernel to grow with emulation parts or binary drivers without wrappers to be forbidden.

If binutils ld had the means to relink a binary with wrappers where required, both 1 and 2 could be addressed.

Alfman, like it or not, the Linux stance against binary drivers is grounded in operational and build-tool limitations.

Reply Score: 2

RE[5]: Comment by shmerl
by Brendan on Fri 6th Dec 2013 07:15 UTC in reply to "RE[4]: Comment by shmerl"
Brendan Member since:
2005-11-16

Hi,

[q]I hate linux, linux is stupid.

- Brendan[/q]


More like, I've been using Linux for over 10 years and know that (especially for graphics) it is far from perfect.

- Brendan

Reply Score: 3

RE[4]: Comment by shmerl
by twitterfire on Thu 5th Dec 2013 07:49 UTC in reply to "RE[3]: Comment by shmerl"
twitterfire Member since:
2008-09-11

And let's not forget that Linux lacks a stable API and a stable ABI. Writing software for Linux is chasing a moving target: what works today will be broken tomorrow.

I wonder why Valve didn't choose FreeBSD as a base, besides the poorer hardware support.

Reply Score: 2

RE[2]: Comment by shmerl
by olejon on Fri 6th Dec 2013 20:55 UTC in reply to "RE: Comment by shmerl"
olejon Member since:
2012-08-12

If the Linux folks can't even agree on something as basic as a f*cking graphics server (X, Wayland, Mir, etc.), I say don't hold your breath. Let's not even get started on the audio stack.

Don't get me wrong, I like choices too, but at the end of the day, IF you want to be taken seriously and have success with your endeavours, one project must be chosen above all and all efforts focused on improving that particular choice, instead of the bickering and infighting that's going on right now.


The thing is that Valve can decide the future of Linux. If they choose to support only Wayland, it will force everyone to use it. They have a lot of power, and it may benefit users and get rid of some of the unnecessary fragmentation.

Reply Score: 2

RE: Comment by shmerl
by Ultimatebadass on Wed 4th Dec 2013 20:33 UTC in reply to "Comment by shmerl"
Ultimatebadass Member since:
2006-01-08

Wayland


There would have to be GOOD (comparable-to-Windows quality) Wayland graphics drivers for both AMD and Nvidia (and Intel, though those first two are more important) for that to happen.

SteamOS is not your average Linux distro. It NEEDS to be good from the first release; otherwise it's going to be a laughing stock. I hope Valve understands that. They are trying to pull people away from Windows gaming, and if it sucks, that's not going to happen.

Reply Score: 4

RE[2]: Comment by shmerl
by Novan_Leon on Wed 4th Dec 2013 20:52 UTC in reply to "RE: Comment by shmerl"
Novan_Leon Member since:
2005-12-07

I actually think the initial reception of the SteamOS/Steam Machine platform will be very poor.

Valve has always followed the slow-and-steady approach, but they always have a well-defined vision and the patience to stand behind it until it comes to fruition. Steam was originally considered a failure, a step back, but after a couple of years people began to warm up to it as Valve slowly but surely continued to improve it. I expect something similar to happen with SteamOS/Steam Machines.

Edited 2013-12-04 20:54 UTC

Reply Score: 5

RE[3]: Comment by shmerl
by Ultimatebadass on Wed 4th Dec 2013 21:34 UTC in reply to "RE[2]: Comment by shmerl"
Ultimatebadass Member since:
2006-01-08

The difference is that Steam in its infancy had no real competition. In the OS market there are already better alternatives for people to turn to, so if it fails to impress it's not going to last. We'll see; I'm anxious to see what kind of waves (if any) it can generate among AAA publishers.

Reply Score: 2

RE[3]: Comment by shmerl
by WereCatf on Wed 4th Dec 2013 22:28 UTC in reply to "RE[2]: Comment by shmerl"
WereCatf Member since:
2006-02-15

I actually think the initial reception of the SteamOS/Steam Machine platform will be very poor.


I have to agree. It'll probably at first only be adopted by Linux supporters and the kind of gamers who are already into tech and aren't afraid of using not-quite-ready stuff. It's not a bad place to start, though, since it means you're much more likely to get genuinely useful feedback, bug fixes and whatnot in return, without Average Janes and Joes overwhelming your support lines with nonsensical stuff. In my experience Linux is still severely lacking in all sorts of scenarios, and it'll take time for Valve to come around and fix it all, but the feedback may help them focus on the most important things first or find the most effective solutions.

SteamOS and Steamboxes aren't meant as replacements for Windows or regular PCs; they're meant to supplement them. So Valve isn't really in any particular need to hit the ground running, and can just focus on things in the long run.

Reply Score: 5

RE[4]: Comment by shmerl
by Ultimatebadass on Thu 5th Dec 2013 09:45 UTC in reply to "RE[3]: Comment by shmerl"
Ultimatebadass Member since:
2006-01-08

It's not a bad place to start


I'd argue that it is. This is why we're still waiting (well, not really) for that mythical "year of desktop Linux". Most people don't like using half-assed things, and being "free" is no argument for this particular target group.

Valve is a big-ish company, not some group of enthusiasts pushing out a new Linux distro. Relying on customers for beta-testing is a shit way to handle things.

Reply Score: 3

RE[2]: Comment by shmerl
by zima on Sun 8th Dec 2013 07:00 UTC in reply to "RE: Comment by shmerl"
zima Member since:
2005-07-06

There would have to be GOOD (comparable-to-Windows quality) Wayland graphics drivers for both AMD and Nvidia (and Intel, though those first two are more important)

Intel is probably more important than those two; Intel graphics are becoming "good enough" for more and more people.

Reply Score: 2

RE: Comment by shmerl
by CapEnt on Wed 4th Dec 2013 20:37 UTC in reply to "Comment by shmerl"
CapEnt Member since:
2005-12-18

Android had to split itself to thrive.

Back in 2009~2010 the Linux kernel, and several other components of the GNU ecosystem, were anything but suitable for mobile (even today it still hasn't sorted out its power-management issues on laptops, just to give an idea), and with the tight product-shipping schedules of the day, there was no time to play the politics required to adapt all the components and push the patches back to the community before using them.

Valve, on the other hand, doesn't need to do any of that. The changes Valve needs to make to Linux are minimal. Its distro will be desktop-oriented, all the components needed for game development are already in place, and developing software on Linux these days is a bliss. The only thing lacking is a decent display server, though even that is debatable. Perhaps better IDEs are needed, but that is a non-issue for many developers, more a matter of taste and development style.

Reply Score: 2

RE[2]: Comment by shmerl
by moondevil on Wed 4th Dec 2013 20:44 UTC in reply to "RE: Comment by shmerl"
moondevil Member since:
2005-07-08

And yet there were already quite a few handset manufacturers using Linux-based OSes before Android was released to the world.

Reply Score: 5

RE[2]: Comment by shmerl
by shmerl on Wed 4th Dec 2013 21:21 UTC in reply to "RE: Comment by shmerl"
shmerl Member since:
2010-06-08

No, it didn't have to split to thrive. Android was created as a closed system; its creators didn't care about the Linux community or any synergy with it. Then Google bought it and "opened" it somewhat, but the split remained.

Edited 2013-12-04 21:24 UTC

Reply Score: 4

RE[3]: Comment by shmerl
by cdude on Thu 5th Dec 2013 12:39 UTC in reply to "RE[2]: Comment by shmerl"
cdude Member since:
2008-09-21

You have similar "splits" all over, from distributions not using the vanilla kernel by default, to desktops, to... everything. This is not a bug but a feature: allow, no, support all kinds of people and groups innovating on top, and if something turns out to be useful, work on getting the concepts upstream, back into mainline.

Android is a prime example. It did modify vanilla; everybody does. It innovated successfully, with new concepts like improved power management. Those concepts (note: not just the patches) made it step by step back into vanilla.

This is why Linux beats every competitor out there: unlimited innovation. Rework the patches, make them even better, less dirty, covering more cases, with a focus on future innovation, and bring them into mainline.

The best requirements-driven innovation possible. And here comes another player, Valve, doing the same. They profit from all of that: the Steambox will draw less power because of work done earlier for Android. Compared with Android, we already know that Valve is innovating, driving new concepts and requirements in. It's going to be good for Valve and for Linux mainline. Win-win, again and again.

Edited 2013-12-05 12:48 UTC

Reply Score: 5

RE[4]: Comment by shmerl
by shmerl on Fri 6th Dec 2013 00:37 UTC in reply to "RE[3]: Comment by shmerl"
shmerl Member since:
2010-06-08

It split at a very deep level: libc. That was bad in the long term. Of course, maybe they intended to be separate from the beginning; then my point is even stronger - Android is too isolationist.

Edited 2013-12-06 00:37 UTC

Reply Score: 2

RE[2]: Comment by shmerl
by Ithamar on Wed 4th Dec 2013 21:35 UTC in reply to "RE: Comment by shmerl"
Ithamar Member since:
2006-03-20

Android had to split itself to thrive.

Back in 2009~2010 the Linux kernel, and several other components of the GNU ecosystem, were anything but suitable for mobile (even today it still hasn't sorted out its power-management issues on laptops, just to give an idea), and with the tight product-shipping schedules of the day, there was no time to play the politics required to adapt all the components and push the patches back to the community before using them.


TomTom had been shipping plenty of mobile devices running Linux, and so had many other vendors, so this is largely overstated.

Valve, on the other hand, doesn't need to do any of that. The changes Valve needs to make to Linux are minimal. Its distro will be desktop-oriented, all the components needed for game development are already in place, and developing software on Linux these days is a bliss. The only thing lacking is a decent display server, though even that is debatable. Perhaps better IDEs are needed, but that is a non-issue for many developers, more a matter of taste and development style.


Agreed with this though ;)

Reply Score: 3

Should be good news for everyone.
by Alfman on Wed 4th Dec 2013 22:10 UTC
Alfman
Member since:
2011-01-28

I would very much like them to succeed; it would give Linux a huge boost on the desktop for gaming, which everyone can benefit from, either directly or indirectly through improved competition.

This debate about the forks is interesting. On the one hand, forking is bad because it can split the community. On the other hand, those doing it are just exercising rights explicitly granted by the source code license. So it feels a bit ironic to point a finger at them while embracing the license that allows them to do it.

In any case, I don't particularly care if "SteamOS" is forked *to the extent that the games can still run on a normal Linux desktop*. I think Valve is genuinely betting on Linux gaming succeeding as a whole. Fragmenting the small Linux gaming market at this point would be completely counterproductive for their Linux strategy, IMHO.

Reply Score: 3

ilovebeer Member since:
2011-08-08

Fragmenting the small Linux gaming market at this point would be completely counterproductive for their Linux strategy, IMHO.

Another way to look at that would be that there isn't much to lose. There are problems with Linux at every turn when it comes to gaming so the question is, do you really want that to be your starting point, or is it better to throw out the crap to redo it properly and move on?

Reply Score: 3

Alfman Member since:
2011-01-28

ilovebeer,

"Another way to look at that would be that there isn't much to lose."

I agree, there isn't much to lose, and there's everything to gain for Valve. Games that are already OpenGL are going to run without too much porting effort; it's almost a gimme.


"There are problems with Linux at every turn when it comes to gaming so the question is, do you really want that to be your starting point, or is it better to throw out the crap to redo it properly and move on?"


I think this is greatly exaggerated. But even assuming there were such huge disparities in usability, none of it really applies to "normal" consumers, who could buy the Steam box as a ready-to-use gaming console. So long as it doesn't use poor hardware, it should run perfectly. Normal consumers won't be the least bit affected by it running Linux under the hood.

And besides, if not Linux, then where else would you start? Valve has already indicated that Windows is a poor choice due to the lockdowns MS is imposing under Metro. It would be foolish to give one of your primary competitors so much control over your product, don't you agree? Lest we forget Microsoft's history of abusive relationships with competitors on its platforms.

Reply Score: 2

Looking forward to better audio
by jackastor on Wed 4th Dec 2013 22:20 UTC
jackastor
Member since:
2009-05-05

Looking forward to better audio support and standards.

Reply Score: 1

Life of Brian
by dennisma on Thu 5th Dec 2013 00:37 UTC
dennisma
Member since:
2013-12-05

Sometimes the bickering on this topic is akin to a scene from the Life of Brian.

I look forward to a Linux-based SteamOS running my favorite games. Enuf said.

Reply Score: 2

RE: Life of Brian
by Digihooman on Fri 6th Dec 2013 22:22 UTC in reply to "Life of Brian"
Digihooman Member since:
2010-05-01

That was a scream of a movie too!

Reply Score: 1

modmans2ndcoming
Member since:
2005-11-09

The only way to get a large number of games built to play on Linux (SteamOS) is to build a good cross-platform competitor to DirectX. It needs to be easy to use and high-performance. Once they get it built, they need Nvidia and AMD to support it in their chips.

Reply Score: 2

oiaohm Member since:
2009-05-30

The only way to get a large number of games built to play on Linux (SteamOS) is to build a good cross-platform competitor to DirectX. It needs to be easy to use and high-performance. Once they get it built, they need Nvidia and AMD to support it in their chips.


OK, what are you smoking? What you are requesting has been done, and has been for years.
http://www.phoronix.com/scan.php?page=news_item&px=MTQzNTA
SDL and OpenGL cover DirectX. Wayland support is planned for SDL 2.0.1. SDL also provides wrappers for all of Linux's different sound and graphics systems.

Valve is funding SDL development, and the Steam Linux Runtime includes SDL. SDL is also used by quite a few major games.
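
For anyone who hasn't used it, the SDL2 path looks roughly like this; a minimal sketch using real SDL2 calls, with error handling trimmed for brevity:

```c
/* Minimal SDL2 + OpenGL setup. The same code builds on Windows,
 * Linux and elsewhere, which is the "covers DirectX" point above.
 * Error handling is trimmed for brevity. */
#include <SDL.h>

int main(int argc, char *argv[])
{
    SDL_Init(SDL_INIT_VIDEO | SDL_INIT_AUDIO);
    SDL_Window *win = SDL_CreateWindow("demo",
            SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
            640, 480, SDL_WINDOW_OPENGL);
    SDL_GLContext ctx = SDL_GL_CreateContext(win);

    /* ... issue OpenGL calls, then SDL_GL_SwapWindow(win) per frame ... */

    SDL_GL_DeleteContext(ctx);
    SDL_DestroyWindow(win);
    SDL_Quit();
    return 0;
}
```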

The problem on Linux has had bugger-all to do with the ABI. OpenGL drivers on Linux not supporting threading is a major performance hit, and that is not an OpenGL issue either; it has been a video card maker issue.

So all Valve needs is for the video card makers to release decent drivers for Linux.

AMD's Mantle is also being talked about as platform-neutral, just not video-card-maker-neutral.

Reality check: Valve does not have to create new high-performance ABIs. All Valve has to do is create a market for AMD and Nvidia to fight over, then let AMD and Nvidia do what they do best and select the winner for inclusion in the Steam Linux Runtime.

AMD, Nvidia and the other parties making graphics cards are the major authors of OpenGL.

A good API/ABI starts with the video card maker, not the other way around.

Reply Score: 4

Brendan Member since:
2005-11-16

Hi,

So all Valve needs is for the video card makers to release decent drivers for Linux.


I wouldn't hold your breath. ATI and NVidia do try to provide drivers; but both the kernel developers and X developers are constantly breaking them (changing APIs). After years of having their work broken by morons (who can't create a standard and stick to it), I can't understand why ATI and NVidia haven't given up on Linux completely.

- Brendan

Reply Score: 0

moondevil Member since:
2005-07-08

Intel is even funnier.

They are supposedly the biggest contributor to Linux GPU drivers and X development.

Yet their OpenGL drivers always lag behind their DirectX ones, and the Linux ones are worse than their Windows ones.

For a long time, their graphics performance analyzers only targeted Windows/DirectX developers, a situation that only changed when they started supporting Android on x86 processors.

Talk about half-hearted contributions.

Reply Score: 4

tylerdurden Member since:
2009-03-17

Out of curiosity, could you point to these numerous and constant changes to the APIs in Linux recently?

Reply Score: 2

oiaohm Member since:
2009-05-30

I wouldn't hold your breath. ATI and NVidia do try to provide drivers; but both the kernel developers and X developers are constantly breaking them (changing APIs). After years of having their work broken by morons (who can't create a standard and stick to it), I can't understand why ATI and NVidia haven't given up on Linux completely.


Brendan, this is a complete lie. Linux kernel breakages with Nvidia and ATI have in all cases been traced back to the drivers depending on behaviour that was not defined in the kernel's stable API. The stable ABI is also exported to user space, and if the functions that make it up are ever broken, they get fixed in a kernel revision. So no, the Linux kernel cannot be on this list.

Nvidia and ATI have got into some trouble for bad coding behaviour. It was never good practice, for example, to just use the Big Kernel Lock instead of creating your own; that is what busted ATI. Nvidia broke by assuming that page sizes would always be 8 KB, when the standard API said no such thing; in fact it said the page size was platform-definable, with a lookup function that told you how big the current page size was.

The kernel side of the Nvidia and ATI drivers does not break that often, and almost all cases have been something that should not have been done in the first place. There are also functions in the Linux kernel marked GPL-only; these are not stable and are only for drivers included as part of the mainline kernel.

Nvidia is getting wiser with age. Recently, needing dma-buf, it made sure the feature was exported to user space with an interface that would remain stable.

As for the X developers: Nvidia designed bypasses around most of the X11 stack instead of fixing it.

The X11 driver API in fact changes more slowly than Microsoft's, Brendan, so I do not see where you get this "constantly changing API" bit from. Look at the time frames of DRI1, DRI2 and DRI3, and note that they overlapped with each other for a very long time.

ABI changes are a lot more common.

Brendan, DRI driver compatibility in X11 is a ten-year thing for each version. DRI1 has only recently started being nuked; DRI1 drivers from 1998 still work right up to the version of X11 from which DRI1 is removed.

Nvidia's issue with X11 has been hooking into functions that are not part of the X11 DRI driver interfaces. Yes, randomly hooking into stuff is a way to get burnt.

Brendan, the reason Nvidia and ATI have not walked away from Linux is that most of the trouble they have had is their own fault, for not working with upstream and not using the upstream-provided interfaces.

This has been the big problem: most of the argument against Linux on drivers is bogus.

Reply Score: 4

twitterfire Member since:
2008-09-11

So AMD and Nvidia have bad coding behaviour and hook into the wrong stuff. Why doesn't that happen on Windows? Is it because Windows has a stable API and a stable ABI, and well-documented ones at that?

Reply Score: 5

oiaohm Member since:
2009-05-30

So AMD and Nvidia have bad coding behaviour and hook into the wrong stuff. Why doesn't that happen on Windows? Is it because Windows has a stable API and a stable ABI, and well-documented ones at that?

This is a lie. The reason Vista had so much trouble with DirectX 9-era drivers not working was in fact that ATI and Nvidia had used non-exported interfaces.

There was a period of very badly behaved video card driver makers.

So yes, it did happen: on Windows, and Linux, and OS X, and Solaris... basically everywhere.

Reply Score: 2

Brendan Member since:
2005-11-16

Hi,

Brendan, this is a complete lie. Linux kernel breakages with Nvidia and ATI have in all cases been traced back to the drivers depending on behaviour that was not defined in the kernel's stable API.


The stable API is for user-space, not for device drivers. There is no stable API that's useful for device drivers on Linux. To work around that both AMD and NVidia use a "shim". I've seen this break before (e.g. the shim relying on a kernel function that either ceased to exist or had its name changed); but what do you expect when there's no stable API for device drivers to begin with?

The kernel side of the Nvidia and ATI drivers does not break that often.


Ah, so you agree it does break.

Note that I didn't blame it all on the kernel alone. Graphics on Linux is a huge mess, with different pieces responsible for different things (kernel, X, Mesa, Gallium, DRI, DRI2, XRender) and with responsibilities that change over time (e.g. the introduction of KMS; the change from "nothing" to TTM to GEM, etc.). To be fair we need to blame the entire huge mess (rather than just the piece/s of the mess that happen to be in the kernel).

Almost all cases have been something that should not have been done in the first place. There are also functions in the Linux kernel marked GPL-only; these are not stable and are only for drivers included as part of the mainline kernel.


Sure - functions marked "GPL only" with no alternative that native/binary drivers can rely on for the same functionality, leaving no choice other than to "do something that should not have been done in the first place".

Brendan, DRI driver compatibility in X11 is a ten-year thing for each version.


Sounds nice in theory. In practice there's a 75% chance that updating X will break your graphics driver or break your GUI or break something else; a 50% chance that you'll spend the entire afternoon searching for solutions all over the web, and a 35% chance that you'll end up downgrading again to fix the problem after wasting an entire day.

For example, I'm using version 12.4 of ATI's drivers (newer versions don't support my card). It works perfectly fine, except that newer versions of X11 don't support the older ATI drivers. This means I haven't been able to update X11 for about 2 years. Now the older versions of X11 have fallen out of Gentoo's package tree and I'm screwed unless I switch to the open source ATI drivers. Of course, I've tried the open source ATI drivers in the past and know they never work - the best I've managed with them is no 3D acceleration and only one monitor (where attempting to use a second monitor causes the system to crash).

Because I can't update X, I don't dare touch KDE either. It's far too likely that the newest versions of KDE rely on some fantastic extension or something that only newer X11 provides, and I'll end up with a broken system where KDE won't like old X, new X won't like old driver, and new driver won't like actual hardware.

Brendan, the reason Nvidia and ATI have not walked away from Linux is that most of the trouble they have had is their own fault, for not working with upstream and not using the upstream-provided interfaces.


Sure, except "working with upstream" typically means "go screw yourself until you're willing to make all your driver's code open source", and still doesn't prevent Xorg from breaking things.

- Brendan

Reply Score: 3

oiaohm Member since:
2009-05-30

Brendan, the introduction of KMS did not prevent non-KMS drivers from working. Most people miss that you can boot the Linux kernel with KMS disabled.

I've seen this break before (e.g. the shim relying on a kernel function that either ceased to exist or had its name changed); but what do you expect when there's no stable API for device drivers to begin with?


Unfortunately the fault for this lies with ATI/AMD and Nvidia. Every formal request by ATI, AMD or Nvidia to move a kernel function onto the formally stable list has been granted; heck, even requests from video card makers no one has heard of. In every case where a function ceased to exist, Nvidia, ATI or AMD had not made the request. Linux kernel developers are not mind readers. Nvidia and ATI both complained that they did not like the overhead cost of providing a formal stable interface, so for a long time they did the wrong thing. In the last 5 years Nvidia has changed: today it will simply refuse to support particular hardware combinations until the functions it needs are moved onto the formal stable list. AMD has the same policy. The result: no more of this problem.

This kernel-space problem is basically a thing of the past.

Sure - functions marked "GPL only" with no alternative that native/binary drivers can rely on for the same functionality, leaving no choice other than to "do something that should not have been done in the first place".

The correct response from a closed-source driver developer is to make a formal request to stabilise an interface. So far such a request has never failed to be granted within 6 months, though sometimes there has been an argument over what should and should not be exposed. Support for these driver-requested interfaces also goes into all currently supported kernels, treated the same as security updates.

So all this interface trouble you are talking about, Brendan, lands cleanly on the heads of the closed-source driver developers. The main catch is that formally wrapping an interface in the Linux kernel for long-term stability adds a small indirection cost to every call, and that is unavoidable: jumping directly into stabilised functions is not allowed. Most people don't know that you can tell the Linux kernel to pretend to be a particular version; allowing that possibility requires a redirection table, and a redirection table is overhead. Even the Windows kernel has a redirection table for long-term driver support. Yes, there is a price for long-term stable interfaces.
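
To illustrate the redirection-table idea (a toy userspace sketch of the concept only; no real kernel names are used here):

```c
/* Toy illustration of a redirection table: callers go through a
 * versioned table of function pointers instead of jumping directly
 * into internals, so the implementation behind the table can change
 * without breaking old callers. One extra pointer dereference per
 * call is the overhead being described above. */
#include <stdio.h>
#include <stddef.h>

struct stable_ops {
    int (*map_pages)(void *addr, size_t len);
};

static int map_pages_v2(void *addr, size_t len)
{
    (void)addr;
    printf("mapping %zu bytes (v2 implementation)\n", len);
    return 0;
}

/* A kernel would populate this at boot for the version the caller
 * asked to see; here it is filled statically so the sketch runs. */
static const struct stable_ops ops_table = { .map_pages = map_pages_v2 };
static const struct stable_ops *ops = &ops_table;

int main(void)
{
    char buf[64];
    return ops->map_pages(buf, sizeof buf);
}
```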

Brendan, you will notice that older Nvidia drivers don't break that often. The old ATI driver, on the other hand, did not use any of the interface specs: no DRI1, no DRI2, just some form of "randomly hook wherever we like into X11".

Brendan, AMD themselves are behind the open source drivers, and those are the ones they officially support.

Sounds nice in theory. In practice there's a 75% chance that updating X will break your graphics driver or break your GUI or break something else;

I have used Nvidia cards for the past 10 years. In the last 5, not once has an X11 server update broken them, mostly because in that time Nvidia has been informing the X.org project where it hooks in. Before that, yes, Nvidia did have issues: they never told the X.org project where they were hooking in. Again, developers are not mind readers; they cannot avoid breaking what you are using if they don't know you are using it. An Nvidia driver update screwing my system over? Yes, I have had that: two Nvidia drivers installed at the same time shooting each other dead. That is not X11, the kernel, or a broken GUI; that is Nvidia being Nvidia and allowing only one copy of its drivers to be installed.

I run KDE. I can tell you that anything past KDE 4.2.0 copes with missing extensions and is not as X11-server-dependent as KDE 4.0.0 was. So your KDE fear is not based in reality.

Sure, except "working with upstream" typically means "go screw yourself until you're willing to make all your driver's code open source", and still doesn't prevent Xorg from breaking things.

This is completely untrue. If it were true, the Nvidia drivers would not be able to work as dependably as they do.

The driver you are having trouble with pre-dates AMD's takeover of ATI. In fact AMD is dropping it because internally it was not legally sound; AMD cannot keep supporting it. Yes, 12.2 and earlier fail legal-department auditing for containing questionably sourced code. When those cards' drivers eventually fail under Windows, they will be dead there as well, because AMD cannot update the highly questionable code without legal risk. This is why AMD had no choice but to go open source for those cards. New Windows drivers for those cards, if they ever happen, will be based on the open source code base.

Also, do you know what was removed in the move from X server 1.12 to 1.13 that broke the ATI drivers? UMS driver support was killed off: the predecessor to DRI1. That's right, ATI had been writing drivers using interfaces older than DRI1, and DRI1 dates to 1998.

Brendan, so how is this the Linux kernel's or X11's fault? ATI was writing drivers in a pre-1998 X11 style; UMS goes back to the 1980s. Your problem is that ATI was writing hopelessly obsolete drivers. Interfaces get 15 years of support, and UMS was well past 15 years when it was killed off. Yes, DRI1 is coming up on end of life; it is now 15 years old.

Brendan, how far do you think you would get if I gave you a Windows NT 4.0 or Windows 98 SE driver and told you to use it with Windows 7 or 8? That is what you have been doing. Does that explain why you have been suffering?

Brendan, this is the pattern: when you dig into the problems, most of the issues land squarely on the closed-source driver maker for doing the wrong things, some of them insanely wrong. ATI was insanely wrong on both the legal issues and the obsolete design.

Brendan, do you know what X11's ABI/API breakage policy is? I guess not.
1) Anything older than 15 years and marked deprecated for 4, so a minimum of 19 years old, can be deleted without question.
2) An API/ABI below the age in rule 1 can be broken to find out whether it is in use, if no one has reported it in use. If a single bug report comes in saying it is in use, the functionality must be restored exactly as it was.
3) No ABI below the age in rule 1 that X.org has been informed is in use can be broken.

Sorry, but if you are in a situation where you cannot simply skip one version of X.org and have your driver work again, you are dealing with a driver so legacy in design it's not funny. The same goes for any program that does not work with X11.

Brendan, this is why your argument does not hold: you are mostly shifting blame onto parties that are not responsible. Closed-source driver makers have a responsibility to do the right things; the open source world is not being a pain in the butt to them.

If the open source developers were being a pain in the butt, drivers would break with every kernel release and every X11 release.

Reply Score: 2

Brendan Member since:
2005-11-16

Hi,

Oiaohm; you're trying your hardest to pretend that the sun shines out of open source developers' butts - carefully choosing facts that suit your argument (and then stretching them as far as you think you can) and ignoring everything else. I don't know if you're stupid or dishonest, but I don't really care enough to find out which.

Oiaohm; my video cards are only about 5 years old (Radeon HD 4770, released in 2008). Initially the best I could get was unstable 2D (screen flickering black when scrolling, mouse pointer turning to trash occasionally, random crashes). Support improved over time and after 2 years video finally worked properly (including 3D acceleration, etc). Then I had about 6 months of drivers that actually worked before Xorg assholes broke it again. For comparison, there's a "Windows Vista" machine sitting next to it that is even older; where updating the video driver is a few mouse clicks with no chance of "update breakage" (and not a single bug or glitch from the start). It doesn't matter who you blame, it's not good enough.

Oiaohm; I don't care if most of the problems were bugs in X11 and not "Xorg policy". I don't care if it was AMD/ATI's fault that smaller OSes like FreeBSD and Solaris weren't able to support DRI1/DRI2/DRI3 quickly enough, which made it impractical for AMD/ATI to move their "intended for many OSes" driver over to something only Linux supported.

Oiaohm; I also don't care if everything has improved a lot recently or if it might work if I upgraded to recent X11 and open source ATI driver today. The fact is that after years of pain there's no way in hell I'm willing to do anything that might upset the "carefully balanced pile of turds" I've got and risk ending up knee deep in shit again.

Oiaohm; don't get me wrong - I'm not saying that all of Linux is bad (it's rock solid for most things), and I'm not considering using any other OS for servers or as a programming environment; however, I'd still rather waste $$$ on a stupid locked down X-Box (that we all know will be useless/obsolete junk in 4 years) than to attempt to get "free" software working for 3D gaming.

Also note, Oiaohm, that I've tried to put your name at least once in every paragraph in the hope that I sound like a patronising jerk. It seems to be fashionable...

- Brendan

Reply Score: 3

oiaohm Member since:
2009-05-30

Brendan, Solaris and FreeBSD both had DRI1 support by the year 2000. So why was ATI still using pre-DRI interfaces in its driver development in 2006? Cost cutting/stupidity.

So no, it was ATI not keeping up with what was current. 2006 was AMD's acquisition of ATI; the Radeon HD 4770's technology was developed around 2006, and there is a bit of a lag between development and production release.

In 2006 AMD also announced that cards in the class of yours would, at some point, have to live with the open source driver only.

Brendan, basically the best thing that could happen was for ATI to be acquired. Fixing up the stack of garbage they left behind is not simple.

I chose a card that was supported, and supported well, by Linux.

You were informed in 2006 that support would end, but you paid no attention.

DRI2 arrived on September 4, 2008, and FreeBSD and Solaris picked it up by 2011. So yes, there is a 2-to-3-year lag.

Brendan, the reality is that you chose a card whose maker was not producing current drivers, and now you are complaining that it no longer works. OK, it does work now, but with open source drivers that are still missing a few features, and the maker is working on restoring them all.

Really, are you kidding me with Windows Vista and "no chance of breakage"? Try running one of the not-properly-supported video cards on it; you know, the ones that force you to use DirectX 9 drivers.

Brendan, there is Linux/Windows-compatible hardware and there is Linux-incompatible hardware. This is not a Linux thing.

Did you build that machine to be a Linux machine? I would say no, because in 2006-2008 building a Linux machine meant an Nvidia video card, not an ATI one, given how poor the ATI drivers were.

My Nvidia card is a GeForce 6, a 2004-2005 card. It is older than yours and has had far fewer issues.

I will get 10 years of operational life out of the GeForce 6, with minimal issues, before I have to replace it. Nvidia promised this.

Reply Score: 2

Alfman Member since:
2011-01-28

Brendan,

"Also note, Oiaohm, that I've tried to put your name at least once in every paragraph in the hope that I sound like a patronising jerk. It seems to be fashionable.."

Haha, this is so true it really made me laugh!


The real irony is that earlier in these comments I was arguing that the over-zealousness of linux community members was an overstated generalization, and yet here we have a poster who makes an incredibly solid case for the argument lucas_maximus was making.

I like Linux and want to promote more widespread adoption. However, even for me it's a real turn-off when someone is as stubborn as an ass and refuses to consider the needs of the community as a whole. There's always room for improvement, and I'm confident that Linux is improving all the time. But the extreme arrogance of some individuals is very discouraging to those who want to join the Linux community and make it better. At least I know they are in the minority, but they are still giving Linux the reputation that lucas_maximus was referring to.

Edited 2013-12-07 06:10 UTC

Reply Score: 3

modmans2ndcoming Member since:
2005-11-09

What are you smoking? I said COMPETES.

OpenGL sucks to work with. Look at the number of games that work with SteamOS: it is far smaller than the number that work with Windows. The reason is DirectX.

Yes, if Mantle becomes popular then Nvidia will release their own version, developers will just focus on those two APIs, and we can expect the Linux drivers to be created by AMD and Nvidia. In that case Valve needs to do nothing.

Reply Score: 2

twitterfire Member since:
2008-09-11

You must be joking. Tell me some reasons OpenGL sucks to work with.

Reply Score: 2

Novan_Leon Member since:
2005-12-07

All the smoking and joking aside, the important thing is that Valve already recognizes the deficiencies of Linux/OpenGL compared to Windows/DirectX and is still willing to throw their (considerable) weight behind the effort to minimize/eliminate these deficiencies. Valve isn't the type of company to do this flippantly. Regardless of the solution, they will almost certainly address these issues while simultaneously taking advantage of Linux/OpenGL's inherent strengths.

Edited 2013-12-05 14:50 UTC

Reply Score: 3

oiaohm Member since:
2009-05-30

OpenGL sucks to work with. Look at the number of games that work with SteamOS: it is far smaller than the number that work with Windows. The reason is DirectX.


What do you think is the base of the Wii and the PlayStations? That's right, OpenGL. And what about all those Android games? OpenGL again. Those platforms don't have DirectX. In fact there are still a lot of games on Steam that do have OpenGL support but have not been ported off Windows yet.

SteamOS and Steam on Linux are a newish platform. There are more games out there with an OpenGL base than there are DirectX games.

OpenGL does compete with DirectX quite successfully.

Sorry to say, you have a completely invalid view of the world. DirectX's dominance on Windows is nothing more than an anomaly.

Reply Score: 3

zima Member since:
2005-07-06

What do you think is the base of the Wii and the PlayStations? That's right, OpenGL.

OpenGL is not used in serious PS3 titles; they all use PSGL. Similar with the Wii - any serious game talks more directly to the hardware.

Reply Score: 2

What kind of distro is SteamOS anyway?
by xfce_fanboy on Thu 5th Dec 2013 19:29 UTC
xfce_fanboy
Member since:
2013-04-09

Has Valve released any details on SteamOS yet? My initial assumption was that it'd be Ubuntu-based, since Ubuntu was the first distro to get the Steam client. But I'm starting to suspect that Valve may be building a distro from the ground up that's optimized for gaming.

As many commenters have noted, things will get very interesting once Ubuntu starts pushing Mir while Red Hat and others want to advance Wayland. Valve will have to make a decision in favor of one or the other.

Reply Score: 2