A considerable number of people assume Wayland isn’t particularly suitable for gaming, usually because you can’t turn off the compositor. This post will challenge that assumption and examine the current state of gaming on Wayland, with a focus on KWin, KDE’s compositor.
A very in-depth look at how Wayland works for gaming – from input lag to rendering – compared to X, including latency benchmarks.
One of the best things Microsoft did was invest early in technical writers who produced fantastic API documentation and books. Since Microsoft sacked them all, turned its “F1” help systems to junk, and let go of its quality-assurance people, things are not what they used to be. Linux documentation is usually terrible, and the material I’ve seen so far on Wayland is mostly an in-crowd with a shared history talking among themselves, which hasn’t been very accessible. This explanation, while not the easiest or best read and not making enough use of diagrams, is the kind of direction I think things should go in.
Better documentation gives people an easier introduction, helps transferable knowledge kick in sooner, and encourages discussion, as well as giving journalists and pundits something to talk about. That’s one of the key advantages of good documentation and books: it provides a better entry point to build something educational on top of, and it adds to positive marketing.
Microsoft, Apple, Google, et al. get loads of free marketing which they don’t always deserve, in part because their documentation makes it easy. Another thing that earns them undeserved free marketing is adding features or tweaking for its own sake, which can lead to hype for the sake of hype and to marketing dictating the direction of travel, which is not always a good thing.
Isn’t trying to get more video performance on the machine that is running the software (game) the whole point of Wayland?
If it couldn’t at least do that better then just bin it now, sunk cost be damned.
The main point of Wayland was never performance; it was getting rid of X.org, which is needless complexity (considering the things it was designed to handle are now handled elsewhere) and which acts as a layer that only causes issues (for example, lack of V-sync).
For example, the proprietary Nvidia driver replaces about one-third of X.org code just to be able to work.
kurkosdr,
I agree, existing OpenGL applications probably won’t see many gains because the OpenGL paths are already optimized. The main incentive is to replace the very old graphics stack, which I’m OK with because X is really bloated and hasn’t aged well. Ultimately the details don’t matter to users as long as it works well. XWayland is good, but some things remain broken, and I’m not convinced they’ll all get fixed because the Wayland devs don’t seem to be all that bothered that they’re broken. I question some of Wayland’s decisions, like screen capturing being compositor-specific, which is weird, but that’s what they’re going with.
We’ll eventually get over the hump where staying with X is more problematic than using Wayland, as new applications start requiring Wayland.
Yep, Wayland is a lot of work to give us something nearly as good. Because it’s .. um .. better?
I’m guessing most of us have been on similar projects.
I am awaiting Wayland with interest, since it’s Desktop Linux’s last attempt at survival. My work laptop, which runs X.org, has V-sync issues when playing full-screen video, plus other problems such as windows opening on the wrong monitor. If Wayland is the clean start we are told it is, it might just allow Desktop Linux to claim status as an option (note: the battle for mainstream relevance ended when Full HD video started demanding protected video paths; now Desktop Linux is fighting to not become solely a subsystem running inside Windows while still keeping some standalone installations on developer laptops).
I don’t see how full X11 compatibility for XWayland is important. Modern DEs have abandoned strict adherence to the X11 spec (it’s why you have to use CDE if you want strict X11 adherence). Unless I am missing something.
kurkosdr,
Yeah, for all of X’s extensions, I don’t believe there’s one for V-sync, which is frustrating. I don’t recall exactly, but I think the only way to do proper V-sync is through a separate ioctl.
http://www.landley.net/kdocs/htmldocs/drm.html#drm-vertical-blank
It’s just a matter of application compatibility. Most applications don’t need much more than a blittable screen buffer to work, and XWayland should be transparent for them. The areas where I’ve gotten bitten are screen-sharing apps, which can be pretty important for telework/virtual conferencing, with some telling users to disable Wayland. Either the capability needs to be added to XWayland, or the applications have to be modified to talk to Wayland’s compositor using new interfaces that weren’t yet stable the last time I tried.
https://pipewire.org/
There are other deliberate restrictions in Wayland, like the inability to programmatically place windows, which won’t affect most apps but can break certain kinds of dialogs/popup windows. I believe applications are going to have to switch APIs rather than being able to count on backwards compatibility.
My work laptop, which runs Windows 10, has issues with windows opening on the wrong screen in multi-monitor mode.
My home office desktop, which runs an old bog standard Mint X.org desktop on multiple monitors, has no such issues.
FullHD and higher videos play fine. I don’t play graphics intensive software on either, so I cannot personally comment on whether X11 has VSync or other stress issues. But I thought it must have those types of issues, otherwise why Wayland? I doubt it is just to break remote apps, which I use (please don’t suggest remote desktops, because that is not what I use or want).
The peak of Linux desktop was about a decade ago. Around the time we started actively destroying it with KDE4, Unity, Gnome3 and, yes, Wayland (although, to be fair, Wayland’s impact was mostly felt in the way resources have been redirected away from useful work).
Sure, in another 10 years we will be back where we started, by which time we will probably all have moved to cloud or mobile apps anyway. First rule of software development: platform adoption is far more important than any technical feature, so *never* throw it away, regardless of how boring or messy it has become.