Linked by Thom Holwerda on Tue 21st Nov 2017 16:03 UTC
Windows

I wiped off my Windows 10 installation today. It wasn't because of the intrusive telemetry or the ads in the start menu but desktop composition. It adds some slight but noticeable latency that makes typing feel uncomfortable. In Windows 7 you can turn it off.

If you're fine with unresponsive UI operations and graphical tearing, then, sure, go back to Windows 7 or earlier and turn off compositing to get a few ms back when typing.

Thread beginning with comment 651313
RE[2]: Linux + xOrg
by Bill Shooter of Bul on Sun 26th Nov 2017 05:31 UTC in reply to "RE: Linux + xOrg"
Bill Shooter of Bul
Member since:
2006-07-14

Crazy... Says more about you than anyone else.

Basically it's a tradeoff. Either you have screen tearing or increased latency. Most people prefer not having screen tearing. This input lag thing is kind of nuts. Maybe you can notice it, but we're starting to get into pseudo-audiophile territory here. If you start telling me it can't be rigorously tested, then I'll have all the proof needed...

Seriously, I'd be interested in a setup that had vsync on and off with an xorg windowing system, running users through various tasks to see which setup they preferred, and whether they could actually detect the latency. I'd completely accept the results of a proper study regardless of my initial assumptions.

Reply Parent Score: 2

RE[3]: Linux + xOrg
by ssokolow on Sun 26th Nov 2017 13:57 in reply to "RE[2]: Linux + xOrg"
ssokolow Member since:
2010-01-21

Thankfully, adaptive sync solutions like FreeSync 2 and G-Sync should eventually reduce the trade-off to irrelevancy.

(I wouldn't consider G-Sync a valid contender in the long run compared to an open standard like the optional adaptive sync portion of the DisplayPort 1.2a spec that FreeSync relies on, but G-Sync apparently doesn't require the proprietary scaler module in laptops since the display is connected more directly to the GPU.)

Edited 2017-11-26 13:58 UTC

Reply Parent Score: 2

RE[3]: Linux + xOrg
by Megol on Sun 26th Nov 2017 19:04 in reply to "RE[2]: Linux + xOrg"
Megol Member since:
2011-04-11

"Crazy... Says more of you than anyone else.

Basically its a tradeoff. Either you have screen tearing or increased latency. Most people prefer not having screen tearing. This input lag thing is kind of nuts. Maybe you can notice it, but we're starting to get into pseudo audiophile territory here. If you start telling me it can't be rigorously tested, then I'll have all the proof needed...
"

Ah, you actually know what you are talking about! I have to apologize, as I thought you were yet another complainer without any clue. Well, except for the "crazy" thing; that still applies.

Today we have double-buffered, synchronized drawing. It doesn't show tearing, but it has a worst-case latency approaching two refresh periods of the screen.

IOW, on a normal 60 Hz system a buffered, synchronized update has a latency of 1/60 to 2/60 seconds. That is ~16.7 to ~33.3 milliseconds.

But that is only the raw latency ignoring any input -> processing -> update overheads.
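The bounds above can be sketched in a few lines (a rough illustration only; it assumes a fixed refresh rate and ignores the input, processing, and panel overheads mentioned):

```python
# Rough sketch of raw display-latency bounds for double-buffered,
# vsync-locked drawing at a given refresh rate (values in ms).
# Best case: the finished frame lands just before the next flip.
# Worst case: it just misses a flip and waits a full extra frame.

def vsync_latency_bounds_ms(refresh_hz: float) -> tuple[float, float]:
    period_ms = 1000.0 / refresh_hz
    best = period_ms
    worst = 2 * period_ms
    return best, worst

best, worst = vsync_latency_bounds_ms(60)
print(f"best ~{best:.1f} ms, worst ~{worst:.1f} ms")  # ~16.7 / ~33.3
```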

This is detectable by a normal human being and irritating for those with better response times, like FPS gamers.

Tearing, while distracting, isn't normally noticeable. The most visible cases involve moving chunks of screen data, like when scrolling or moving windows.

But there are techniques for reducing those cases. Or there were; it's probably harder to avoid when targeting decoupled, streaming-optimized processors (i.e. GPUs).

One easy technique is optimizing for what the user is most likely looking at: the text being entered or the mouse pointer. Prioritizing updates close to where the user is looking makes tearing artifacts less visible.
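A hypothetical sketch of that idea: order dirty screen regions by distance to the point of attention (caret or pointer) so they are repainted first. The names here are illustrative and not taken from any real windowing system:

```python
# Hypothetical sketch: repaint the dirty regions nearest the user's
# point of attention first, so tearing is pushed toward screen areas
# the user is less likely to be watching.

from math import hypot

def prioritize_dirty_rects(rects, focus):
    """rects: list of (x, y, w, h); focus: (fx, fy) attention point."""
    fx, fy = focus

    def distance(rect):
        x, y, w, h = rect
        # distance from the focus point to the rect's center
        return hypot(x + w / 2 - fx, y + h / 2 - fy)

    return sorted(rects, key=distance)

rects = [(0, 0, 100, 20), (400, 300, 50, 50), (390, 290, 10, 10)]
print(prioritize_dirty_rects(rects, (400, 300)))
```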

My main point is this: disabling vsync can result in visible tearing but generally lowers the perceived input-to-output latency. Why would it be crazy if someone prefers the possibility of tearing over a detectable additional latency? It's just a matter of preference.


"Seriously, I'd be interested in a setup that had vsync on and off with an xorg windowing system, running users through various tasks to see which setup they preferred, and whether they could actually detect the latency. I'd completely accept the results of a proper study regardless of my initial assumptions."


That would be interesting.

Reply Parent Score: 2

RE[4]: Linux + xOrg
by zima on Mon 27th Nov 2017 13:55 in reply to "RE[3]: Linux + xOrg"
zima Member since:
2005-07-06

This is obviously detectable by a normal human being and irritating for those that have better response time like FPS gamers.

Or so they claim... but one has to wonder how many are placebophiles (the author of the discussed news article certainly seems to be one). For example, when vsync is disabled, they might be noticing not the decreased latency but the tearing, and "convince" themselves it's better that way because it "should be". So for a proper ABX test, you'd have to introduce "artificial" tearing when vsync is enabled, to get around such an easily noticed visual cue.
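The ABX protocol itself is simple to sketch (a hypothetical trial schedule only; nothing here hooks into a real windowing system):

```python
# Minimal ABX trial schedule for a vsync latency study (hypothetical).
# Each trial randomly assigns the unknown stimulus X to either
# A (vsync on) or B (vsync off); a participant who truly cannot tell
# the conditions apart should score near 0.5 over many trials.

import random

def make_abx_trials(n_trials: int, seed: int = 0):
    rng = random.Random(seed)
    return [rng.choice(["A", "B"]) for _ in range(n_trials)]

def score(trials, answers):
    correct = sum(t == a for t, a in zip(trials, answers))
    return correct / len(trials)

trials = make_abx_trials(20)
# A perfect responder identifies X every time and scores 1.0.
print(score(trials, trials))
```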

Reply Parent Score: 2