Over the years, we’ve seen a good number of interfaces used for computer monitors, TVs, LCD panels and other all-things-display purposes. We’ve lived through VGA and the large variety of analog interfaces that preceded it, then DVI, HDMI, and at some point, we’ve started getting devices with DisplayPort support. So you might think it’s more of the same. However, I’d like to tell you that you probably should pay more attention to DisplayPort – it’s an interface powerful in a way that we haven’t seen before.
↫ Arya Voronova at HackADay
DisplayPort is a better user experience in every way compared to HDMI. I am so, so sad that HDMI has won out in the consumer electronics space, with all of its countless anti-user features as detailed in the linked article. I refuse to use HDMI when DisplayPort is available, so all of my computers’ displays are hooked up over DP. Whenever I did try to use HDMI, I always ran into issues with resolution, refresh rates, improper monitor detection, and god knows what else. Plug in a DP cable, and everything always just works.
Sadly, in consumer electronics, DisplayPort isn’t all that common. Game consoles, Hi-Fi audio, televisions, and so on, all push HDMI hard and often don’t offer a DisplayPort option at all. It takes me back to the early-to-late 2000s, when my entire audio setup was hooked up using optical cables, simply because I was a MiniDisc user and had accepted the gospel of optical cables. Back then, too, I refused to buy or use anything that used unwieldy analog cables. Mind you, this had nothing to do with audio quality – it was a usability thing.
If anyone is aware of home audio devices and televisions that do offer DisplayPort, feel free to jump into the comments.
Except for the part where, when a DisplayPort display goes into power-saving mode or sleep, the specification requires it to send a hardware unplug event to the computer, which tends to result in display configurations and window placement getting messed up. Display sink dongles are about the only workaround, because most DP display makers don’t include an option to disable disconnect notifications.
Well, letting the source device know it shouldn’t bother building and sending video and sound signals anymore sounds like a pretty good idea.
And messing with your display configuration sounds like an OS issue.
What is the OS supposed to do with the windows located on screens that get “disconnected” (i.e., enter sleep mode)?
If it is just a single-display configuration:
Just stop the output but leave the framebuffer untouched.
In a multi-monitor setup:
Show a requester and ask the user what to do (with an option to always behave that way).
Um, when the display wakes up, it reconnects, so the OS should restore the previous window placement.
KDE’s KWin definitely does this, as does macOS. Definitely an OS issue.
I use 2 external screens over DP, and have never had a window unexpectedly move after a display suspended.
Then again, I haven’t used Windows since Win7.
The OS is doing what it should be doing.
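For anyone who wants to see those “unplug on sleep” events for themselves, here is a minimal sketch of what they look like from userspace on Linux. It assumes the pyudev package and a KMS/DRM graphics driver (both assumptions on my part); connector names such as card0-DP-1 vary per machine, and this only observes the events, it doesn’t suppress them:

import glob
import pyudev  # assumption: the pyudev package is installed

def connector_states():
    # Each KMS connector exposes a "status" file: "connected" or "disconnected".
    states = {}
    for path in glob.glob("/sys/class/drm/card*-*/status"):
        name = path.split("/")[-2]
        with open(path) as f:
            states[name] = f.read().strip()
    return states

context = pyudev.Context()
monitor = pyudev.Monitor.from_netlink(context)
monitor.filter_by(subsystem="drm")  # the kernel emits "change" uevents on hotplug

print("initial:", connector_states())
for device in iter(monitor.poll, None):
    # A DisplayPort monitor entering or leaving sleep typically shows up here as a
    # "change" event, with the connector flipping to "disconnected"/"connected".
    # This is the signal the compositor reacts to when it rearranges windows.
    print(device.action, device.sys_name, connector_states())

Whether windows then get shuffled around is entirely up to the compositor, which is the point being made above.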
I am 100% with you on this Thom; I really don’t like the limitations HDMI imposes especially when dealing with modern GPUs and monitors. If there’s a DP option at all I go for it. My monitor at home can do 2560×1440 at 170Hz refresh, but only if I use DP; HDMI on that monitor tops out at 120Hz. I’ve found this is the case on many different monitors, even those made in the past year (and mine is only two years old).
The DisplayPort spec also ensures I can connect any kind of adapter — HDMI, DVI, VGA — and have it work with hardware that doesn’t support DP itself.
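For what it’s worth, if you want to check what your own link actually negotiated, here is a rough sketch using xrandr from Python. It assumes an X11 session with xrandr on the PATH and a connector named DP-1, both of which are assumptions on my part (Wayland compositors expose this differently):

import subprocess

# List the modes and refresh rates the display advertises on this connection;
# the currently active mode is marked with "*" in xrandr's output.
print(subprocess.run(["xrandr", "--query"], capture_output=True, text=True).stdout)

# Explicitly request 2560x1440 at 170 Hz on the (assumed) DP-1 connector.
# If the link can't carry that combination (e.g. an HDMI input capped at 120 Hz),
# xrandr reports an error instead of silently falling back.
subprocess.run(["xrandr", "--output", "DP-1", "--mode", "2560x1440", "--rate", "170"], check=True)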
> My monitor at home can do 2560×1440 at 170Hz refresh, but only if I use DP
You’re lucky that you only have this problem 🙂 I’ve had HDMI monitors be detected as TVs and go YCbCr, with the assorted color loss and distortion.
Right now I have a monitor over HDMI that at least is seen as a monitor… but it goes black for a few seconds every couple of hours. I suppose that’s some idiotic HDCP renegotiation.
torp
Hmm, I notice color differences as well, but honestly I always put it down to differences between the monitors; it didn’t occur to me that the gamut might be wrong.
My biggest problem connecting to a Samsung HDMI TV is that it’s a damn “smart TV” that gets in the way of everything. I can’t simply open HDMI sources; the TV must go through a detection process that is a whole ordeal. Moreover, every time I connect the computer, it has to be redone. It’s maddening how regressive smart TVs are for such basic functionality. Perhaps it was a mistake to wait so long to buy a new TV, as older TVs may have been less smart and therefore better.
Well… I started using DP last year. Since my two displays are old (an old LCD monitor plus a 39-inch LCD TV), I use an HDMI→DVI adapter on one and HDMI on the other. Sometimes, when I have to use another device that only has DP output, I have to use a DP→HDMI adapter. I still haven’t figured out why, but with some devices I get no signal through the adapter. So I’ve been cursing DP outputs ever since. Some work, some don’t. Guess I’ll have to buy another display just for those cases where the DP→HDMI adapter has issues.
In my experience, DisplayPort++ adapters (the technical name for those little “level shifter that tells the GPU to speak VGA, DVI, or HDMI over the DisplayPort connector” dongles) have hit-or-miss compatibility.
Try a different model and you’ll probably get a different result. (I’ve had good results with Benfei DP→HDMI and DP→DVI adapters.)
Also, while I don’t know if it’d be relevant to your case, another possibility that I ran into with an HDMI KVM switch was that I had to use a pass-through HDMI dummy plug on the PC side to grab and persist the EDID from the monitor because the GPU didn’t like something about the way the KVM switch handled EDID.
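If you want to see which EDID the GPU actually ended up with after a flaky adapter or KVM switch, Linux exposes it in sysfs, so here is a minimal sketch along those lines. It assumes the standard DRM sysfs layout (connector names vary per machine, and this obviously doesn’t help on Windows); it only sanity-checks the header, checksum, and manufacturer ID:

import glob

# A valid EDID block is 128 bytes, starts with this fixed header,
# and its bytes sum to 0 modulo 256.
EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

for path in sorted(glob.glob("/sys/class/drm/card*-*/edid")):
    with open(path, "rb") as f:
        data = f.read()
    if len(data) < 128:
        print(f"{path}: no EDID (nothing connected, or the adapter/KVM swallowed it)")
        continue
    header_ok = data[:8] == EDID_HEADER
    checksum_ok = sum(data[:128]) % 256 == 0
    # Manufacturer ID: three 5-bit letters (1 = 'A') packed into bytes 8-9.
    raw = (data[8] << 8) | data[9]
    vendor = "".join(chr(((raw >> shift) & 0x1F) + ord("A") - 1) for shift in (10, 5, 0))
    print(f"{path}: vendor={vendor} header_ok={header_ok} checksum_ok={checksum_ok}")

An empty or truncated file usually means the adapter or switch never passed the monitor’s EDID through, which is exactly the situation the dummy-plug workaround above papers over.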
The thing I find funny about optical vs. electrical S/PDIF is that the signal is identical; you can translate from electrical to optical with an LED. The only real benefit is that the optical version breaks any ground loops, which can cause noise in the analog section of the device.
DisplayPort doesn’t support transporting Dolby Atmos in any way, shape, or form (not even the lossy variant, aka Dolby Digital Atmos), and it also doesn’t support LPCM 5.1.4, which is the minimum viable channel configuration for decoded Dolby Atmos (DisplayPort tops out at 8 LPCM channels).
It also doesn’t support ARC/eARC, which makes soundbars with a solitary digital input impossible.
Before that, it didn’t support CEC (now it kind of does via a tunneling scheme).
There is a reason DisplayPort has lost the consumer electronics industry, and it’s because the DisplayPort bros don’t care the slightest bit about the living room.
Thom, you may want to revise blanket statements such as “DisplayPort is better than HDMI, and I will die on this hill”.
CEC is and always was crap. Not supporting it would actually be a feature.
But yes, everything else on that list is a deal breaker for home theater setups.
Doesn’t matter; TV manufacturers want to put “supports CEC” on the feature list of their TV without having to asterisk it and explain that it doesn’t apply to DisplayPort, or field support calls about it. Same with Dolby Atmos today.
From the perspective of TV manufacturers, a DisplayPort port is a gimped port which they have no legacy reasons to include in their product (like they used to include SCART and RCA ports).
kurkosdr,
I’m with you on this one. HDMI has clear advantages.
But that is not entirely VESA’s fault (part of it is, of course). DisplayPort is a royalty-free standard, and licensors like Dolby will never like that.
Look at Xbox. They have had Dolby Vision support since the Xbox One era (10+ years?). But they are limited in where they can use it. Same with Dolby Atmos.
(Sony only added Dolby Atmos to the PS5 after their own in-house experiment failed.)
Want to use Atmos on my speakers? Sure, LG and Sonos support that.
What about my headphones? That would be $10 extra!
Dolby Vision for gaming? Go ahead.
Dolby Vision for 4K UHD Blu-ray? Sorry, Microsoft has not paid their extra toll, so even though the format is supported, it won’t work.
But the packaging on the disc and my Xbox say it is supported? Don’t care, that was extra!
These are terrible policies held by what is essentially a natural monopoly on the format.
(And before anyone says anything, a proper Dolby Atmos and Dolby Vision setup is miles ahead of the competition.)
They could at minimum support LPCM 5.1.4 and LPCM 7.1.4 audio so that at least decoded Dolby Atmos (and DTS:X) would work for the most common speaker arrangements, but nope, they can’t be arsed; 7.1 or 8.0 is all you can get on DisplayPort. And they also don’t support eARC passthrough.
Also, DisplayPort and HDMI only do passthrough; they don’t have to decode anything. DisplayPort supports the once-royalty-encumbered Dolby Digital/AC3 and DTS.
kurkosdr,
eARC adds unnecessary complexity, as the current DisplayPort design passes simple signals in one direction (computer → monitor)…
… Well, it’s a bit more complicated, as there are things like daisy-chaining, but they generally try to keep it clean.
TVs, however, are connected to sound systems with a separate, dedicated HDMI cable. That would mean one more cable coming out of the PC or the monitor.
Why design a completely separate function when there is no current demand for it? There are exactly zero audio sinks that use DisplayPort, but plenty that use HDMI.
Anyway…
I think that while these two are fighting, we might head in an entirely different direction: USB Type-C.
All I’ll say is “Fuck HDCP”.
FYI, HDCP 1.3 has been an optional part of the DisplayPort specification since DisplayPort 1.1 (ratified back in 2008), and HDCP 2.2 has been an optional part of the spec since DisplayPort 1.3 (ratified back in 2014). High-quality DisplayPort monitors do support HDCP.
You see, HDCP is not an HDMI-specific thing, it’s a Hollywood thing. Hollywood studios require that HD signals must travel through the cable with at least HDCP 1.3 encryption, and that 4K signals must travel through the cable encrypted with at least HDCP 2.2.
No, it doesn’t matter that HDCP 1.3 has been cracked to bits or that HDCP 2.2 can be legally downgraded to HDCP 1.3 with a simple adapter (even in the US); most people don’t have an “HDCP stripper” device, so HDCP still stops common hardware such as HDMI capture cards.
If anything, HDMI is more user-friendly, because the HDMI logo on “sink” devices implies HDCP support, while on DisplayPort you may or may not get HDCP support.
Yup – this I know. So I’ll update my comment to “Fuck HDCP & Fuck Hollywood/MIAA”. These bastards made my life more complex and my devices more useless. That’s not a “tangible user benefit” whatsoever. Frankly, they can go kiss my arse. You’ll prise VGA from my cold, dead hands! (Joking!)
I think you mean “Fuck HDCP & Fuck Hollywood/Music and Film Industry Association of America”. Didn’t you hear that the RIAA and MPAA became the MAFIAA long ago?
At least you get a cable; 8K doesn’t even get a cable. Hollywood 8K content is delivered via TV-embedded apps only. Not that 8K is useful in any way, but it shows that without HDCP, you wouldn’t even get a cable, just TV-embedded apps and TV-embedded Blu-ray players.
Hollywood makes the rules unfortunately.
PS: The “tangible user benefit” referred to the HDMI logo, not HDCP: when you see the HDMI logo on a “sink” device, you know it supports HDCP.
Urrgghh… Enshittified. But then, I’ll probably *never* need 8K content. We have a few 8K TVs at work for testing – fairly big ones (55, 64, etc.) – and so far I really can’t tell the difference. I’ve had a 5K screen since 2014 (the original 5K iMac) and, well, my eyes really can’t see much of a difference.
ppp,
This gets a +1 from me, but… consumers don’t usually get a say. It’s sad but often unavoidable.
Your best bet might be to use a cracked device to strip HDCP from the signal. You end up with more latency and use more electricity, but… it is what it is.
Urgh… the first time I hit this was years ago. I had moved into a flat and only had the cable STB and a monitor (everything else was still in boxes, etc.). I figured I could just plug the monitor into the STB, right? You know what happened next. Bastards.
Well, it is all true in theory. If they want it to be true in practice, they should hurry up and specify ARC and CEC protocols for DisplayPort; having the ability to add them isn’t worth much when they haven’t added them 10+ years later.
At my job, I manage over 100 computers running Windows. Connections to the screens pass through a docking station, and either an HDMI cable or a DisplayPort cable can be used.
I was told to use HDMI instead of DisplayPort, and that is what I did.
Guess what? Many users complained about their screens. I remember only one screen that was actually found defective and returned to the seller under warranty; for the remaining screens, the problem was solved by simply switching to the DisplayPort connection.
I am with Thom. DisplayPort works flawlessly, while HDMI increases my workload.
dariapra,
So HDMI gives you more job security, haha.
I have a real job that allows me to pay my bills and two mortgages. Hence, I have no time to waste on annoyances like screen connections that sometimes work and sometimes don’t.
I guess that somebody like you, who does not know what a real job is (I mean, one of those jobs in which you are expected to deliver, that is, to get things done), will find it difficult to understand my point of view.
dariapra,
Ouch. The HDMI job security comment was a joke, not a personal attack. If you can find time in your busy schedule, you may need to re-calibrate your sarcasm detector 🙂
Maybe the reason DisplayPort works better is precisely because it is not as popular as HDMI?
You know, when you have to support all kinds of devices, mostly more consumer-oriented than computers, where the user will NOT adjust the resolution and frequency at all, things get a bit messier than when you only have to worry about video cards and computer displays.
I always remember that the Mac (and I mean even pre-OS X) was better than the PC because it was such a closed ecosystem that you didn’t have to worry about not-quite-100%-compatible stuff.