We’ve all had a good seven years to figure out why our interconnected devices refused to work properly with the HDMI 2.1 specification. The HDMI Forum announced at CES today that it’s time to start considering new headaches. HDMI 2.2 will require new cables for full compatibility, but it has the same physical connectors. Tiny QR codes are suggested to help with that, however.
The new specification is named HDMI 2.2, but compatible cables will carry an “Ultra96” marker to indicate that they can carry 96Gbps, double the 48Gbps of HDMI 2.1b. The Forum anticipates this will result in higher resolutions and refresh rates and a “next-gen HDMI Fixed Rate Link.” The Forum cited “AR/VR/MR, spatial reality, and light field displays” as benefiting from increased bandwidth, along with medical imaging and machine vision.
↫ Kevin Purdy at Ars Technica
I’m sure this will not pose any problems whatsoever, and that no shady no-name manufacturers will abuse this situation at all. DisplayPort is the better standard and connector anyway.
No, I will not be taking questions.
HDMI, as long as it remains a closed, expensive, licensable spec, needs to die.
DP doesn’t have this problem.
Sadly, HDMI won. People don’t care that it is “closed, expensive, licensable”: all monitors and TVs come with it, cables are cheap, and everybody knows about it.
The vast majority of displays and devices have HDMI only. People that say things like this are really delusional.
Out of sheer curiosity, what was the HDMI Forum supposed to do? You cannot have higher data rates without a more expensive cable, either we’re talking about a cable that has more conductors in it (the USB 3.0 way) or a cable that has higher-quality conductors in it (the HDMI way, since the HDMI connector is pretty packed with pins already). The best the HDMI Forum could do was to trademark a name such as “Ultra96” and enforce it as best they can. Which is what they did.
DisplayPort is irrelevant for TVs because it has no ARC/eARC, which is the only way to connect your device to a Dolby Atmos receiver (unless your device has a second dedicated HDMI port just for audio, which only a handful of Blu-ray players have).
For the nitpickers: Yes, ARC can work with lossless Dolby Atmos if your source has the ability to transcode it to lossy Dolby Digital Plus Atmos (aka E-AC-3 + JOC); you don’t necessarily need eARC.
A problem to solve instead of investing time and resources in a closed format.
What is the “problem to solve”?
– If the “problem to solve” is the lack of ARC/eARC support in DisplayPort, then the VESA dullards could have added it years ago (back in 2009, when HDMI got ARC), but they just don’t care about such “consumer concerns”. This makes HDMI a must-have for setups with a Dolby Atmos receiver.
– If the “problem to solve” is ARC/eARC itself, it’s unfortunately a requirement due to HDCP. You see, the HDCP people charge a royalty per port, and ARC/eARC allows source devices to have only one HDMI output (that outputs both video and Atmos audio) and then lets the TV split the Atmos audio from the video and pass it to the receiver via ARC/eARC. Atmos will never happen over SPDIF or even USB-C because those don’t do HDCP, and Hollywood requires that Atmos only travel over an HDCP link. Even GPUs don’t have a second dedicated HDMI port for audio, precisely because they want to save on royalties; only a handful of expensive Blu-ray players have it.
Really, I don’t see how the problem can be solved, regardless of the definition of the problem.
HDCP is horse shit that makes everything expensive and incompatible – without ANY benefit to anyone, including the content owners. Let’s not pretend it has any merit or value.
I don’t think kurkosdr was defending HDCP, just pointing out that we all have to deal with it. I’m sure they would agree that it has no merit or value. It certainly doesn’t for us mere consumers!
I agree with Thom that DisplayPort is overall the better standard, and if not for media companies’ desire for control over anything media related, it would be the default connector on consumer devices just as it is on PCs and other IT equipment.
HDCP served its intended purpose, which was to make HDMI inputs unattractive in most devices. No manufacturer wants to advertise an HDMI input on their Blu-ray recorder that will fail to work most of the time. Keep in mind that not everyone is tech-savvy enough to find decryption software such as MakeMKV, and then there are things like cable/satellite boxes where no straightforward decryption method exists. So, a lack of HDMI input in common devices goes a long way towards reducing casual recording and copying. This is what Hollywood wanted and they got it thanks to the DMCA (which outlaws “unofficial” implementations of DRM that don’t comply with Hollywood’s demands), end of story.
DisplayPort ports can have HDCP[1], it’s just not mandatory for a TV or monitor to be able to use the DisplayPort name and logo, but it can be present on DisplayPort ports on an optional basis (and most TVs and monitors do indeed have it for compatibility reasons). My personal gripe is that there is no special logo for DisplayPort ports that have HDCP, so there is no guaranteed way to know a given DisplayPort port has HDCP, but again, most have it.
As I’ve said above, the main issue with DisplayPort is the total lack of support for even basic ARC (let alone eARC); it’s a boneheaded omission that gives HDMI a bona fide reason to exist.
[1] https://www.displayport.org/faq/
CaptainN-,
This is the standard refrain for all DRM. People hate it, but publishers keep demanding it and manufacturers keep delivering it. Yes it can interfere with legitimate uses, but it doesn’t matter. Yes, it’s broken so the real pirates can bypass it, but it doesn’t matter. Publishers still lobby collectively to keep everything infected with DRM anyway and they carry a lot of weight in our tech standards. It’s just the way it is and probably the way it will always be.
And what you don’t understand is that DRM is not there to stop piracy (it won’t); it’s there to restrict fair-use rights. For example, DVD’s CSS has been broken to bits (and is easily bruteforce-able anyway because the key is too short), but it has served its intended purpose: stop wide availability of ripping tools. Windows Media Player can rip CDs to MP3s but can’t rip DVDs to DivX AVI or MP4. Same for iTunes. So, when it comes to DVDs, you have to buy the same movie again as streaming for your tablet/smartphone/smart TV. And that’s where the real money from DRM is, not from the impossible task of “combating piracy”.
But good luck explaining all that to your Average Joe, and that’s why the DMCA’s anti-circumvention provisions are unfortunately not going anywhere.
“Out of sheer curiosity, what was the HDMI Forum supposed to do? You cannot have higher data rates without a more expensive cable, either we’re talking about a cable that has more conductors in it (the USB 3.0 way) or a cable that has higher-quality conductors in it (the HDMI way, since the HDMI connector is pretty packed with pins already). ”
No, that’s not true, obviously. The other option would be to develop a better encoding for the data. Think about how upgrading from a 14.4k to a 28.8k modem surprisingly didn’t require you to rewire your phone cord. Of course, if it were easy they would have done it. Changing the cable format is a worse choice for most of the participants, so I assume they didn’t have any easy protocol improvement they could throw at it. Or the tin foil hats are right and the accessory-minded participants won the argument. But I don’t believe in tin foil hats.
All good practical encodings for short-to-medium cables have already been invented. DisplayPort and USB-C have the same problem that HDMI has (you need a better cable for the higher data rates).
The difference is that HDMI has trademarked certifications for cables that can do the higher data rate while DisplayPort and USB-C are a wild west.
Bill Shooter of Bul,
The range of a cable is extremely dependent on signal frequency. You can go higher if the bandwidth isn’t already close to being maxed out, but most modern data cables are already being pushed near the max frequencies they’re engineered for. Look at the chart for “Values of primary parameters for telephone cable”…
https://en.wikipedia.org/wiki/Telegrapher's_equations
Notice that at low frequencies, doubling has very little impact on resistance. This is why modem speeds could be increased without upgrading wires. But as frequencies keep going up, the resistance climbs much more steeply (skin effect). If the run is short enough, then there isn’t much signal degradation and the signal can still overpower the noise. But as the cable gets longer the signal drops and the noise increases. This is where higher quality cables are needed.
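To put rough numbers on that, here’s a back-of-the-envelope sketch in Python; the constant k is made up purely for illustration, and the model ignores the DC-resistance floor, dielectric loss, connectors, and so on, so don’t read real cable ratings into it:

```python
import math

def attenuation_db(freq_hz, length_m, k=2.0e-5):
    """Crude skin-effect model: conductor loss in dB grows roughly with
    sqrt(frequency) and linearly with cable length. k is an illustrative
    constant, not a measured parameter of any real cable."""
    return k * math.sqrt(freq_hz) * length_m

for freq_ghz in (0.5, 3.0, 6.0, 12.0):      # progressively faster signalling
    for length_m in (2, 10, 30):            # desk run vs long living-room run
        loss = attenuation_db(freq_ghz * 1e9, length_m)
        print(f"{freq_ghz:>5} GHz over {length_m:>2} m: ~{loss:5.1f} dB of loss")
```

In this toy model, doubling the frequency “only” multiplies the loss by about 1.4×, but at multi-GHz signalling over a long run the accumulated loss is already enormous. That’s the point: short runs survive, long cheap runs don’t.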
You can go beyond the cable’s specs, but you’ll start to get errors. For example, some of us have tried 10GbE on Cat 5e cables. It actually works OK for short runs in environments with low electrical noise, but it doesn’t take much to start getting data errors. Even these errors might be tolerable depending on what the higher-level layers do. Computer networks are quite tolerant of this, and if you’re not explicitly testing for lost packets you may not even notice. As far as I know HDMI does not do retransmission; errors will turn into audio/video artifacts. If you find someone who’s tested this stuff specifically, it could be an interesting read.
Personally I haven’t tested HDMI cables, but I have tested very long USB cables – around 30m, both passive and amplified. In my testing I found that active cables solve the signal quality problems, but it appears that some hardware/drivers have critical timing requirements. If you try to go too far (around 10m with two repeaters IIRC) data packets can start to drop. For example, hard drives would work absolutely fine at the full length, but my Logitech webcams wouldn’t work reliably. As far as I can tell this was not electrical signal quality but the hardware/driver’s inability to deal with latency.
kurkosdr,
This makes it even more confusing for USB-C / Thunderbolt connector users. Even though both my monitor and my computer supposedly have support, they would not negotiate properly.
Add docking stations to the mix, and it gets even worse. Which ports work? Which don’t? Do we need adapters? And the conversions are sometimes hidden: a Type-C cable can carry HDMI, but most cables are internally Type-C -> DP and then DP -> HDMI, which “erases” capabilities.
Wish they could come up with a better “Lowest common denominator” between these three standards.
This is why most GPUs and laptops have one HDMI port, so you can connect to a TV without all those adapters. This gets you the full HDMI feature-set, including (e)ARC.
The “lowest common denominator” is the intersection between DP and HDMI. Just make sure the docks you buy have the ports you want. And avoid USB-C for video output; it’s usually buggy as hell.
kurkosdr,
Yes, in my previous setup I had to use a type-C port on the dock to connect my display. And I agree, it was really buggy.
We have all this technology, and not being able to solve something as simple as connecting two devices reliably (well, with a dock in between, but still…) is disappointing.
I don’t think that makes DisplayPort irrelevant. I have no idea what ARC/eARC or Dolby Atmos are. I could literally use it with any of my equipment and it would work fine.
You don’t know what Dolby Atmos is? Have you been living under a rock for the past 10 years?
No and I kind of doubt most people do? I’ve never heard of it and I really doubt I own anything that can use it.
Anyone who has been to a cinema during the past 10 years should have heard it. It’s the next step after surround sound: it adds speakers above you in addition to the ones around you that traditional surround sound uses. It also allows some sounds to have their position defined as 3D co-ordinates, and the decoder/receiver “distributes” those sounds to however many speakers your setup has depending on the coordinates of the sound (which allows the format to scale to an arbitrary number of speakers). DTS:X is a competing format from DTS. The generic term is “object-based audio”.
tl;dr: Modern home-cinema systems don’t just have speakers around the viewer but also above the viewer (or have up-firing, highly-directional speakers), courtesy of Atmos and DTS:X. DisplayPort having pretty much zero support for Atmos (or DTS:X) is a major omission that makes DisplayPort irrelevant in the living room.
I have strong doubts that the vast majority of people will notice or care that they “only” have surround sound and not whatever this is. Audiophiles, sure, but most people barely notice more than stereo + subwoofer, and I think it’s pretty silly to say that not having this makes something irrelevant.
Overhead speakers are something you definitely notice. Sound either comes from the ceiling or it doesn’t; it’s not subjective “audiophile” garbage but something noticeable by everyone. When I first heard Atmos at the cinema, I was floored, sound was literally coming from every direction.
kurkosdr,
IMAX has this, and I would think IMAX content is more likely to have overhead channels specifically recorded at the source. It makes me wonder if the average movie records this stuff at all. Otherwise it has to be synthetically blended.
I personally didn’t know Display Port had this difference, and I agree with you it sucks. But regardless of that I don’t think I know anyone with that kind of setup. Even just plain surround sound is kind of rare. I don’t know if this is a safe generalization, but it seems to me that more people used to have surround sound (ie with physical speakers) but they have been mostly replaced with a single sound bar using DSP to create the illusion of depth. IMHO unless you are very close, such soundbars don’t even give proper stereo effects, much less surround sound.
I think that the DSP significantly alters the sound to create “spatial effects” (an almost tinny quality) but in doing so tarnishes audio clarity. Funnily enough, on New Year’s we were watching a show and the owner had to switch audio output modes to make voices clear, since a number of us, myself included, had trouble hearing them even when the volume was up. As popular as they are, sound bars can’t provide the same quality as a surround setup.
IMAX audio never had a consumer variant, so for consumers, Dolby Atmos and DTS:X were the first formats that had channels for overhead speakers. Also, IMAX is unavailable or rare in some regions, so for some of us, Atmos was our first experience with overhead channels even in the cinema.
BTW, movie sound is rarely recorded multichannel on set; sound is collected using an array of mono microphones and then (together with stock sound effects) is mixed into channels (or objects) in post-production, so that’s where Atmos sound is mixed (which was also the case for 5.1 and 7.1 audio). So yes, Atmos and DTS:X movies do have real overhead sound for the overhead speakers.
This makes ARC and eARC more important, since soundbars can’t function as AV receivers (like most home cinema systems can). They only have a single HDMI input, so video passthrough is not possible; you have to use ARC/eARC. You can theoretically use SPDIF/TOSLINK (most TVs still have a SPDIF/TOSLINK output), but then the up-firing speakers of your soundbar won’t get true overhead channel information and will have to synthesize it (basically it’s like feeding a Dolby Digital receiver stereo sound and relying on Pro Logic), since SPDIF/TOSLINK cannot carry Atmos or DTS:X (no, not even the lossy variant of Atmos).
tl;dr: Anyone using a soundbar is using ARC/eARC these days. And while soundbar sound isn’t true surround (much less true Atmos), it’s better than the TV’s speakers, so people buy them in droves. And this highlights the bonehead-ness of DisplayPort for not supporting ARC/eARC in any capacity. But I guess this is a “consumer concern”, DisplayPort is a pure professional standard, untainted by the market realities of the consumer space.
Atmos is now the industry standard format for surround sound. Every AVR sold in the past 5 years (or more) supports it.
Unlike most other audio formats, Atmos doesn’t require the studio to mix the sound for a particular speaker layout. Instead, the sound designer places each sound in a three-dimensional space. The processor, either in a theater or a home theater, then decodes the stream and directs the sounds to the appropriate speakers. So the same “mix” is used for 32 channels in a commercial theater or the more traditional 7–11 speakers (plus subwoofers) in a home theater.
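To make “directs the sounds to the appropriate speakers” a bit more concrete, here is a toy sketch in Python of the general idea. To be clear, this is not Dolby’s actual renderer; the speaker layout and the simple distance-based gain law are invented purely for illustration:

```python
import math

# Hypothetical speaker layout: (x, y, z) positions, z > 0 means overhead.
SPEAKERS = {
    "front_left":  (-1.0,  1.0, 0.0),
    "front_right": ( 1.0,  1.0, 0.0),
    "rear_left":   (-1.0, -1.0, 0.0),
    "rear_right":  ( 1.0, -1.0, 0.0),
    "top_left":    (-1.0,  0.0, 1.5),
    "top_right":   ( 1.0,  0.0, 1.5),
}

def pan_object(position, speakers=SPEAKERS):
    """Toy distance-based panner: each speaker's gain falls off with its
    distance to the object, then the gains are power-normalized. Real
    decoders use more sophisticated panning, but the principle is the same:
    the mix carries coordinates, and the renderer adapts them to whatever
    speakers it finds."""
    weights = {}
    for name, (sx, sy, sz) in speakers.items():
        d = math.dist(position, (sx, sy, sz))
        weights[name] = 1.0 / (d + 0.1)          # closer speaker -> more gain
    norm = math.sqrt(sum(w * w for w in weights.values()))
    return {name: w / norm for name, w in weights.items()}

# The object sweeps overhead from house-left to house-right.
for t in (0.0, 0.5, 1.0):
    pos = (-1.0 + 2.0 * t, 0.0, 1.2)
    gains = pan_object(pos)
    loudest = max(gains, key=gains.get)
    print(f"t={t:.1f}  loudest speaker: {loudest:12s}  gain={gains[loudest]:.2f}")
```

The takeaway is that the same stream can drive a 5.1.2 soundbar or a 7.1.4 receiver: the coordinates in the mix stay the same, and only the renderer’s speaker table changes.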
Also, don’t get so high and mighty about DisplayPort’s superiority. DP still has limitations on cable length, and its own compatibility problems if you use the latest hardware with an older cable… or if your cable is too long for the resolution and frame rate being displayed.
There is literally no benefit of DP over HDMI; they both operate in a very similar manner, although HDMI 2.1 has to rely on a bit of compression and 4:2:2 color at 8K resolutions, while the current version of DP doesn’t have the same limitation… however, HDMI 2.2 will address the current limitations. Sometimes DP will put out new features or support new higher resolutions and frame rates before HDMI, but that doesn’t mean it’s “better”. It is just a different standard. That’s it.
kurkosdr,
I didn’t think it was common. Of course it’s one thing if the movie takes place in virtually created 3d space, but adding depth information in post where none was recorded originally seems like the audio equivalent of fake 3d, the results won’t be as good unless you keep the effect very subtle. I don’t know maybe AI could do a better job generating multichannel information where none was actually recorded.
I have no experience with the ARC/eARC hardware of which you speak, however my point was that if you have a limited living room setup, then I don’t see why that would really benefit over the “5.1” input that it natively supports. It physically can’t create more sounds regardless of how those sounds are represented in the protocol.
I’m not saying that more sophisticated multichannel setups won’t benefit from advanced protocols that support those channels. That makes sense to me, but very few households have a proper surround sound setup these days and I don’t see nearly as much value in having those extra channels if your setup can’t reproduce those channels authentically.
Like I said, I agree displayport shouldn’t be limited and it sucks that it would be. But even so I’m not convinced adding more channels makes a big difference in a setup where most of the sound will be coming from one direction anyway. I’m not against the improvements that you want, but I don’t think most people can experience the benefit given the setup they have anyway.
I do understand that DSP solutions can create the illusion of depth by changing audio characteristics, adding fake reflections to make sound seem like it’s coming from elsewhere, but this makes the audio much less clear. I’m not a fan of this technique for the reason I explained earlier. I’m not against having genuine surround speakers, which I think used to be more common, but as far as I know it’s rare to have such a setup these days and it’s in this context that I question the benefit of more digital channels.
Dolby Atmos is a hybrid format: it has a number of “bed channels” that are traditional channels, plus sound objects that are placed in 3D space using 3D co-ordinates, as you said (and the decoder “distributes” the sound to your speakers according to each object’s 3D co-ordinates, so it can take advantage of an arbitrary number of speakers):
https://professionalsupport.dolby.com/s/article/What-is-a-bed?language=en_US
The soundbar form factor can support a minimum of 2 upfiring speakers, so a well-designed soundbar with highly-directional upfiring speakers (with insulation on all sides and the bottom so sound travels only upward to be reflected by the ceiling) can create at least 2 overhead channels. But here is the problem: traditional 5.1, 6.1 and 7.1 audio as supported by SPDIF/TOSLINK supports zero overhead channels, so you need ARC/eARC to deliver overhead channel information to your soundbar (more accurately, ARC/eARC delivers a Dolby Atmos or DTS:X stream that has the overhead channel information).
What you are vaguely describing is virtualized height, and yes, it sucks. Doesn’t work, avoid like the plague etc etc. Good soundbars have up-firing speakers. Dolby briefly acknowledges that a difference between the two (upfiring vs virtualized height) exists:
https://www.dolby.com/about/support/guide/speaker-setup-guides/
“Check out our soundbar setup guide where you’ll learn the difference between upward-firing soundbars and those with height virtualization”
But if I have one gripe with Dolby, it’s precisely how the Atmos logo is plastered on devices that can’t deliver even a minimum 2.1.2 experience and instead rely on height-virtualization tricks. But you know, you can always avoid those crappy products.
This doesn’t nullify the fact that ARC/eARC is needed if you have a soundbar with upfiring speakers or even a proper 5.1.4 or 7.1.4 setup.
kurkosdr,
I know the traditional formats don’t support overhead, I agree with you already and I’ll keep agreeing with you on that – no need to keep repeating it 🙂
However, I don’t think soundbars are nearly as effective at surround sound even if they advertise it. Maybe you get better results in lab conditions, but it’s unlikely to match the quality of actual speakers. I believe this spatial projection is why the audio was hard for everyone to hear the other day. And our host was accustomed to changing this mode on account of this very problem.
While I don’t disagree with you that overhead channels could be cool provided the content made good use of them, I remain highly skeptical of the notion that a typical home soundbar setup can pull it off. In my experience these are much worse even at basic stereo. Old stereo speakers had more displacement. I concede my experience is limited, and it may be that I’ll only be able to believe it once I hear a system working as well as claimed in person.
I’ll have to check the model next time I see them, but I assumed it was virtualizing depth rather than height. Either way I’m skeptical 🙂 At another relative’s house they have a new setup using a soundbar with a laser projected screen, they paid a lot for it and to be honest I never noticed any depth. The room is fairly large and open so maybe it just doesn’t work there. I’ll have to pay more attention – I haven’t been going to our friends homes with the intention of being an audio critic, haha.
Oh boy if I sent you a picture of our setup you’d have such a laugh – “crappy” doesn’t begin to describe it 🙂
They are not effective at delivering surround sound but are effective at delivering 2.1.2 audio (at least the good ones with upfiring speakers). And you need ARC/eARC to get the .2 part of 2.1.2, that’s my point. So, even in the bottom-tier soundbar scenario, you need ARC/eARC. Which means DisplayPort can’t even cover the bottom-tier soundbar scenario, and that’s why DisplayPort sucks in the living room.
HDMI is more than 20 years old, in most regions there is nothing patent-related they can still enforce, and the connector is unchanged, so I suppose a new standard is the way to make sure the fees keep rolling in. Even if the new claims are as bogus as an ISP that can suddenly pump 2Gbps down the same fibres that could only deliver 200Mbps a few years ago!
I suspect it’s just another case of “Our gold cables are better, if you listen carefully you can hear it!”
Up until fairly recently (2017), new HDMI versions were very welcome, as they introduced things like eARC and the ability to do 4K HDR at up to 144Hz refresh rates. This is especially true for TVs, since TV manufacturers don’t want to waste an entire port on a DisplayPort connector that can’t do Atmos passthrough, so HDMI is your only choice when it comes to TVs (so, progress on the HDMI front is very welcome). This is objectively measurable performance btw, not fake claims like audiofool cables.
So, HDMI is set until at least 2037 as far as patents are concerned.
Now whether HDMI 2.2 gets adopted, that’s dependent on whether 8K or 4K HDR stereoscopic gaming find a market.
kurkosdr,
One advantage of higher-generation cabling is being able to “over-provision” to avoid signal issues.
For connecting a 4K monitor at 60Hz, HDMI 2.0 might be sufficient. However, I would step up to 8K-capable cables that are overkill for the purpose but give me significant headroom for the signal. And if I ever upgrade in the future, the cable will already be there.
(Wish I could just use Type-C, though; we are so close to having a single standard cable for everything.)
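As a rough sanity check of the headroom argument, here’s a quick back-of-the-envelope calculation; the figures are raw pixel rates only (ignoring blanking intervals, FEC, and link-encoding overhead) compared against nominal rather than effective link rates, so real-world requirements are somewhat higher:

```python
# Raw video bandwidth = width x height x refresh rate x bits per pixel.
MODES = {
    "4K60, 10-bit":  (3840, 2160, 60, 30),
    "4K120, 10-bit": (3840, 2160, 120, 30),
    "8K60, 10-bit":  (7680, 4320, 60, 30),
}
LINKS = {  # nominal maximum link rates in Gbps
    "HDMI 2.0": 18,
    "Ultra High Speed (HDMI 2.1)": 48,
    "Ultra96 (HDMI 2.2)": 96,
}

for mode, (w, h, hz, bpp) in MODES.items():
    gbps = w * h * hz * bpp / 1e9
    fits = [name for name, cap in LINKS.items() if cap >= gbps]
    fits = ", ".join(fits) if fits else "none, uncompressed"
    print(f"{mode:14s} ~{gbps:5.1f} Gbps raw -> {fits}")
```

Even with these optimistic numbers, 8K60 at 10 bits per channel already overshoots a 48Gbps link uncompressed, which is roughly the gap Ultra96 targets, and a cable rated well above what your display needs today is exactly the kind of headroom being described.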
It’s also worth mentioning that even if the patents on early HDMI versions have expired, HDCP remains royalty-encumbered, and because of the DMCA (in countries where the DMCA or similar laws apply) you can’t implement it without a licence, so you can only implement cleartext HDMI without paying royalties (either in a source or a receiver). Also, you can’t use the term “HDMI” since it’s trademarked, so you have to either leave the port unlabelled or label it something generic like “Digital AV out”.
Cleartext HDMI is practical on a source device if your device isn’t meant to officially implement any DRM; that’s how some FPGA projects have HDMI output: they only output cleartext HDMI (and they also leave the port unlabelled or have it generically labelled). Cleartext HDMI on receiver devices (for example TVs and monitors) isn’t practical, obviously.
@kurkosdr – Thanks, that explains something I’ve wondered about on new-generation low-cost systems that often have an unlabelled HDMI port; most of what I’m thinking of is FPGA/ASIC based, like low-cost digital oscilloscopes with unlabelled HDMI ports.
I gather, though, that on the trademark issue you can make the cable without the labels but list “compatibility” on the packaging, which won’t hinder the low-cost/grey market at all.