You could put it this way – DisplayPort has all the capabilities of interfaces like HDMI, but implemented in a better way, without legacy cruft, and with a number of features that take advantage of DisplayPort’s sturdier architecture. As a result, DisplayPort isn’t just in external monitors, but also in laptop internal displays, USB-C display output, docking stations, and Thunderbolt of all flavors. If you own a display-capable docking station for your laptop, be it a classic-style multi-pin dock or USB-C, DisplayPort is highly likely to be involved, and even your smartphone might just support DisplayPort over USB-C these days.
Back when I bought my current 144Hz 1440p monitor with G-Sync for my gaming PC, DisplayPort was the only way to hook it all up, since G-Sync over HDMI wasn’t yet supported. Ever since, out of a baseless sense of caution, I’ve always preferred DisplayPort for all my PC video connection needs, and as it turns out – yes, DisplayPort is definitely better than HDMI, and this article will tell you why.
Actual vid of what this looks like
v1.0 https://www.youtube.com/watch?v=tUeTkyTmRb4
v2.0 https://www.youtube.com/watch?v=TK9yTiKI2cw
crazy stuff!
This article is pretty one-sided and glosses over many important details:
1. HDMI is guaranteed to support HDCP in all “sink” devices (aka TVs and monitors), with the “HDMI” trademark used to control compliance, while DisplayPort sinks aren’t. This is critical because some set-top boxes will silently downscale to 480p or 720p if they discover your TV or monitor doesn’t support HDCP, and you’ll never know unless you run test patterns. This alone makes DisplayPort problematic in TVs, since it’s hard to know whether you’ve got a crippled HDCP-less port.
2. Converting DisplayPort to HDMI while maintaining HDR requires an active adapter.
3. HDMI has features like ARC and eARC that allow you to pass audio to a Dolby Atmos receiver/home theater system/soundbar; DisplayPort can’t do that at all.
Personally, I want HDMI ports on the TV (since a TV’s capabilities are built around the HDMI standard it supports, so you aren’t gaining anything with DisplayPort), but a DisplayPort on an output device is always welcome (since I consider it a type of multiport) as long as an HDMI port also exists.
There is one exception though: Stereoscopic 3D
HDMI 1.2 supports only up to 1080p60, which means stereoscopic 3D only goes up to 1080p24 (not 1080p30, because of a blanking interval between the left- and right-eye frames that puts 1080p30 just above the data-rate limit), and the HDMI people never got around to updating HDMI’s 3D mode. This means stereoscopic 3D gaming on 3D TVs, where you want 60 Hz, is limited to 720p60. I have heard tales of tricking your GPU into thinking your TV is an Acer Predator monitor that supports an unofficial extension allowing stereoscopic 3D at 1080p60, and it might work, but that’s as far as I got.
I worked at a company that made interactive installations for museums. This often involved screens of various sorts, projectors and so on. We usually had a lot of freedom in what hardware to choose (as long as it was within budget), and we’d consistently choose DisplayPort and avoid HDMI. We always had trouble getting HDMI to work reliably, and we just couldn’t afford to waste our (and our clients’) time trying to get HDMI to work.
HDMI wasn’t designed to be the most reliable thing out there; instead it was designed to be simple to implement and, most importantly, to provide HDCP encryption (so Hollywood would allow players to output HD when playing Hollywood-controlled formats such as Blu-ray). Reliability over long distances wasn’t a concern. If you don’t need to play DRM’ed HD content, use DisplayPort.
torb,
I’m curious: how much of this was caused by DRM?
It sucks that even though DRM gets broken and becomes ineffective against those setting out to defeat it, it still sticks with us in our technology standards 🙁
Even if DRM is broken, it still legally imposes restrictions on mainstream devices due to the DMCA’s anti-circumvention provisions. For example, DVD-Video’s pathetic CSS encryption might be brute-forceable by a Pentium III, but the presence of CSS on most commercial DVDs still forces DVD player manufacturers to “license” CSS and implement all the restrictions that come with the CSS license (region lockout, unskippable ads, CGMS-A and Macrovision on the analog outputs, HDCP on the HDMI output).
Similarly, the HDCP encryption found in HDMI 1.x might have been broken, but to this day all mainstream HDMI capture cards will throw an error when encountering an HDCP signal. It’s the reason even Blu-ray recorders usually have no HDMI inputs. You see, most devices enable HDCP on their HDMI output “just in case” (so there is no flashing if the user switches from non-DRMed to DRMed content), so an HDMI capture card or HDMI input on a Blu-ray recorder is mostly useless without an HDCP stripper purchased from non-mainstream vendors.
So, even if DRM is cracked, Hollywood knows what they are doing when they continue to include it.
Also, HDCP 2.x has not been cracked to my knowledge.
Anyway, HDMI’s reliability problems are partly because of HDCP (which DisplayPort may also optionally have, just not consistently, so HDCP headaches are not exclusive to HDMI); the other big issue is the inherent fragility of HDMI due to the design decisions mentioned in the article. As I’ve said above, reliability was not a major concern during the design of HDMI. One of the concerns was compatibility with DVI-D in order to make implementation easier and allow for passive converters, so HDMI “inherited” the simplicity and fragility of DVI.
HDCP 1.x is so trivial to bypass with cheap adapters that it can be defeated unintentionally. There is no security. But I agree you make a good point with regard to legal matters: the DRM doesn’t have to be effective to invoke the DMCA. I still say it sucks though. Manufacturers are forced to implement ineffective DRM that increases costs, energy consumption, latency, and inconvenience, and still doesn’t provide security against illegal copying. It’s all the cons of DRM with no pros.
I’ve heard some people say that HDCP 2 sources stopped working with their televisions when they upgraded, until they bought an adapter. I’m not sure exactly what those adapters are doing, but they might offer a way to defeat HDCP 2.x. I don’t own a Blu-ray player or high-def TV myself, so I don’t have personal experience with it.
IMHO reliability should be the primary goal, but yeah, I know Hollywood has a different agenda when they weigh in on standards.
HDCP 1.x relies on a handshake (think SSL/TLS handshake), so the proposition that it’s trivial to bypass, or that it can be defeated unintentionally, or that there is no security, is absurd.
Now, there are HDCP strippers out there, and I guess they can qualify as “cheap” (depending on your wallet) and as an “adapter”, but what they do to remove HDCP 1.x is definitely not trivial. They have to present an HDCP 1.x-compliant sink on the input side (using a leaked key; fortunately the HDCP 1.x master key has leaked), receive the encrypted signal, decrypt it, and then regenerate it as a cleartext HDMI signal. There was an HDCP stripper called HDFury that did that, but I can’t make sense of their current website.
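For the curious, HDCP 1.x key agreement is a Blom-style scheme: every device holds a 40-bit KSV and forty 56-bit private keys derived from a secret symmetric master matrix, and each side sums its own keys at the positions of the peer’s KSV bits to arrive at the same shared secret. Below is a minimal toy sketch in Python of that idea, assuming a randomly generated stand-in matrix (none of the real HDCP constants appear here); it only illustrates the mechanism, it is not the actual spec.

import random

MOD = 2 ** 56   # HDCP 1.x device keys are 56-bit values, added modulo 2^56
SIZE = 40       # KSVs are 40 bits long

random.seed(1)

# Stand-in for the secret symmetric master matrix held by the licensing authority.
master = [[0] * SIZE for _ in range(SIZE)]
for i in range(SIZE):
    for j in range(i, SIZE):
        master[i][j] = master[j][i] = random.randrange(MOD)

def make_ksv():
    # A KSV is a 40-bit public value with exactly twenty bits set.
    ones = set(random.sample(range(SIZE), 20))
    return [1 if i in ones else 0 for i in range(SIZE)]

def device_keys(ksv):
    # Device private key i = sum of master-matrix row i over the device's own KSV bits.
    return [sum(master[i][j] for j in range(SIZE) if ksv[j]) % MOD for i in range(SIZE)]

def shared_secret(own_keys, peer_ksv):
    # Km = sum of your own private keys selected by the peer's KSV bits.
    return sum(own_keys[i] for i in range(SIZE) if peer_ksv[i]) % MOD

source_ksv, sink_ksv = make_ksv(), make_ksv()
source_keys, sink_keys = device_keys(source_ksv), device_keys(sink_ksv)

# Because the master matrix is symmetric, both ends derive the same Km without
# ever sending it over the wire; the rest of the handshake just proves they did.
assert shared_secret(source_keys, sink_ksv) == shared_secret(sink_keys, source_ksv)
print("shared Km:", hex(shared_secret(source_keys, sink_ksv)))

With the master matrix public, anyone can mint a KSV and a matching set of device keys and pass authentication, which is exactly what the strippers exploit.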
The only way I can think of to accidentally “defeat” HDCP is by using an HDMI regenerator (or splitter) that is not HDCP-compliant. Your source device (player, set-top box, etc.) will detect the absence of HDCP support in the sink and will silently downscale the content to 480p or 720p so it can output it as HDMI cleartext. But this doesn’t count as “defeating” HDCP.
Again, such “defeating” is most likely done by using a regenerator or splitter that isn’t HDCP 2.x-compliant and presents either a non-HDCP compliant or an HDCP 1.x-compliant sink on the input side, which means your set-top box will downscale to 480p/720p or 1080p respectively (and from HDR to SDR in both cases), but again, this isn’t “defeating” anything.
If you know how, please post instructions, because common knowledge says you can’t.
Let’s not give Hollywood (HDCP) full credit; the electronics industry helped by basing HDMI on top of DVI-D, thus making sure HDMI inherited the fragility of DVI-D.
It’s actually true though. People who buy HDMI splitters and adapters may unintentionally defeat it.
I am admittedly not an expert, but I believe that some of these splitters work by allowing the television set to authenticate while allowing another HDMI device to piggyback on the same signal. Ironically, developing a splitter that has to authenticate two or more devices could actually require a more complex implementation. You asked about HDCP compliance, and they may not be compliant, but anyway my point was that some of these devices may bypass HDCP 1.x as a side effect.
Presenting a non-HDCP-compliant sink so the source device outputs 480p or 720p does NOT defeat HDCP any more than connecting via S-Video does, please try to understand this very simple fact. HDCP exists to encrypt 1080p and higher output.
The correct way to make an HDMI splitter is to present an HDCP-compliant sink on the input, decrypt internally (like a TV would), and then output to the two HDMI outputs as HDCP-encrypted signals, with each output being its own encrypted HDCP link that does its own HDCP handshake. But cheap splitters don’t do that; they just present a non-HDCP-compliant sink on the input, which will result in 480p or 720p output by your source. Again, this isn’t defeating anything; any 1080p signal that ever comes out of your source will still be HDCP-encrypted, as mandated by the HDCP rules.
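To make that data flow concrete, here is a rough Python sketch of the splitter behaviour described above: one authenticated session on the input, and an independent session per output. The HdcpSession class and its XOR “encryption” are purely illustrative stand-ins so the sketch runs; a real device uses the actual HDCP handshake and link cipher.

import os

class HdcpSession:
    # Stands in for one authenticated HDCP link with its own negotiated session key.
    def __init__(self, name):
        self.name = name
        self.key = os.urandom(8)  # placeholder for the negotiated session key

    def encrypt(self, frame):
        # Toy XOR keystream, only so the flow is runnable; not HDCP's real cipher.
        return bytes(b ^ self.key[i % len(self.key)] for i, b in enumerate(frame))

    decrypt = encrypt  # the XOR stand-in is symmetric

class CompliantSplitter:
    # Authenticates upstream once, then re-encrypts separately for each output.
    def __init__(self, n_outputs):
        self.upstream = HdcpSession("source link")
        self.downstream = [HdcpSession("sink link %d" % i) for i in range(n_outputs)]

    def forward(self, encrypted_frame):
        clear = self.upstream.decrypt(encrypted_frame)           # decrypt internally, like a TV would
        return [out.encrypt(clear) for out in self.downstream]   # independent HDCP link per output

splitter = CompliantSplitter(2)
frame_from_source = splitter.upstream.encrypt(b"pixel data")     # what the source would send
print([out.hex() for out in splitter.forward(frame_from_source)])

The cheap splitters complained about above skip the per-output sessions entirely and simply report a non-HDCP sink upstream, which is why the source falls back to 480p/720p.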
Now, there are rumors about HDMI splitters that use something like the old HDFury device internally, but these are just that, rumors.
torb,
HDMI is not good for video distribution.
What you’d need is SDI, which runs over 75-ohm coax with BNC connectors. A pair of these could be a starting point:
https://www.bhphotovideo.com/c/product/1375457-REG/blackmagic_design_convmbhs24k6g_mini_converter_hdmi.html
https://www.bhphotovideo.com/c/product/1375458-REG/blackmagic_design_convmbsh4k6g_mini_converter_sdi.html
Professional equipment like video cameras, switchers and projectors will come with SDI ports.
The cables can easily run 100 meters:
https://www.extron.com/article/hdsdi_ts#:~:text=Cabling%20and%20Loss,distance%20of%20about%20100%20meters.
However, I don’t think you’d want to use native SDI projectors; they are in a completely different budget league:
https://www.bhphotovideo.com/c/product/1647283-REG/panasonic_pt_rz690lbu7_6000_lumen_wuxga_laser.html