Inter-corporation bullshit screwing over consumers – a tale as old as time.
Major laptop vendors have quietly removed hardware decode support for the H.265/HEVC codec in several business and entry-level models, a decision apparently driven by rising licensing fees. Users working with H.265 content may face reduced performance unless they verify codec support or rely on software workarounds.
↫ Hilbert Hagedoorn at The Guru of 3D
You may want to know how much these licensing fees are, and by how much they’re increasing next year, supposedly forcing these laptop OEMs to remove features to avoid the cost. The HEVC licensing fee is $0.20 per device, and in 2026 it’s increasing to $0.24. Yes, a $0.04 increase per device – four cents – is “forcing” these giant companies to screw over their consumers. Nobody comes out a winner here; everyone loses.
We took a wrong turn, but nobody seems to know when and where.

Such a wonderful world. As if there aren’t enough reasons already to steer away from these proprietary codecs for absolutely everything.
HEVC is not proprietary; it’s open-but-royalty-encumbered. Proprietary formats do not have an openly available spec, and you may not be able to license the patents from the patent holders no matter how much money you are willing to pay. Apple’s proprietary audio formats come to mind. And even in the cases where they do let you license the patents, they may impose weird licensing requirements, like not being allowed to offer encoders for general use, or weird playback restrictions (the MQA format comes to mind; they did this to hide the fact it wasn’t lossless).
In plain English, open-but-royalty-encumbered formats are a PITA, but proprietary formats are much worse.
That said, the fact that the HEVC patent holders were able to split into 8 separate patent licensing groups, with each group charging its own “reasonable” rate (and those “reasonable” rates stacking into a not-so-reasonable total royalty an implementer has to pay), shows how weakly “FRAND” is defined.
Which Apple proprietary format are you talking about? Regarding audio, they use AAC, which is part of the MPEG-4 standard, or ALAC, which was created to fill the gap left by the absence of a lossless codec in that standard. The latter is open-source and royalty-free.
Does it really affect anyone though?
An Intel Core Ultra 5 135U can decode 8K video seamlessly using the CPU only.
Anyone with a dedicated GPU can easily surpass that.
The only niche I can think of is very low-end chips like the N150 in systems like NASes.
But those aren’t what’s used in the devices we are talking about, which are still more than capable of decoding the 1080p and 4K video most office environments would need.
Feels to me more like the greed of the HEVC licensors made the OEMs actually look at this tiny cost they’d been paying out and realise they didn’t need to anymore.
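For anyone who wants to sanity-check that kind of claim on their own machine, a rough decode-only benchmark with ffmpeg looks something like the sketch below (input.hevc is just a placeholder for any local H.265 sample; exact numbers will obviously vary by clip and CPU):

```
# Software-only decode: how fast can the CPU alone chew through the file?
# The "speed=" figure in the progress line shows the headroom over realtime.
ffmpeg -benchmark -threads 0 -i input.hevc -f null -

# Same file forced through VA-API hardware decode, for comparison
# (this will error out or fall back if the hardware path is unavailable).
ffmpeg -benchmark -hwaccel vaapi -i input.hevc -f null -
```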
There are two main reasons I can think of: video editing on battery power, and playback of DRM-protected streams. The former may be niche, but I know that some video editors can make really good use of hardware decoders to speed up scrubbing of a timeline. The latter is frustrating and adds insult to injury, but typically streaming services that use DRM also require hardware decoding for the highest resolutions.
EDIT: And absolutely, it’s greed through and through. I expect this is a driver-level change that only affects Windows. Greed from the patent trolls, greed from the PC manufacturers wanting to differentiate product lines, etc.
The main problem is not video editing or video encoding on battery power, but video playback on battery power (for example, Netflix 4K).
Which ironically used to be one of HEVC’s main strengths against VP9 and AV1: that pretty much all devices that claim to do “4K playback” can decode HEVC in hardware (while VP9 and AV1 decoding has to be done in software on older devices and, as a result, will drain the battery much faster). But now, HEVC will drain the battery on some new devices. Insane.
HEVC’s remaining strength against VP9 and AV1 is that early 4K TVs (late 2010s) can do HEVC playback in hardware, while their weak SoCs can’t do VP9 or AV1 software playback at all (not for 4K content anyway). So, streaming providers that charge extra for 4K HDR streaming (Netflix) will likely continue to use HEVC to not see those customers downgrade to the HD subscription tier.
Ah, I see you’ve mentioned video playback in your post but I skipped it while skim-reading.
Yes, apparently it only affects Windows. It’s done through ACPI tables rather than by driver modification, but Linux probes the hardware directly and ignores the relevant table when detecting features.
https://www.reddit.com/r/sysadmin/comments/1opxue7/comment/nnfyzvi/
andrew_w,
Thank you for posting the link. That Reddit post may not be definitive though, especially the second-hand claims that HEVC is working, because Linux might be using a different accelerated HEVC code path than the one in question.
https://wiki.archlinux.org/title/Hardware_video_acceleration
When running ffmpeg, it outputs a line like this, which doesn’t tell me much about the hardware it chose to use: “Using auto hwaccel type vdpau with new default device.”
Does anyone know of a trick to see what hardware is actually being used when a video is playing? I don’t know what tooling on Linux would tell me that. I can see activity go up in top as well as nvidia-smi during playback, but it doesn’t clearly indicate what is actually decoding the stream.
Note I don’t have one of these laptops, but I’m still curious to find out how video is being played back on my hardware. It’s always been “hands off” and I haven’t paid it much attention.
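For what it’s worth, these are the sort of commands that usually reveal which decode path is actually active on a Linux box (assuming a VA-API/VDPAU-style stack; video.mkv is just a placeholder, and the exact tools depend on the GPU):

```
# Which hardware acceleration methods this ffmpeg build was compiled with
ffmpeg -hwaccels

# What the VA-API / VDPAU drivers actually expose (libva-utils / vdpauinfo packages)
vainfo | grep -i hevc
vdpauinfo | grep -i hevc

# Let mpv pick a decoder and report it; look for a line like
# "Using hardware decoding (vaapi)." in the output
mpv --hwdec=auto --msg-level=vd=v video.mkv

# Watch the dedicated video engines while the clip plays:
sudo intel_gpu_top      # Intel iGPU: the "Video" engine row should light up
nvidia-smi dmon         # NVIDIA: the "dec" column shows decoder utilisation
```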
That raises the question: who is responsible for patent infringement here (since HEVC decoding is enabled without license fees having been paid)? The OS? The laptop vendor? The user?
kurkosdr,
The answer usually is “whoever the lawyers think they can extract money from”
Did Microsoft promise it? Put pressure on Microsoft.
Did Dell advertise the feature? Milk them for money.
The user is probably last in the chain, and only then maybe via a third-party app to unlock the license, plus a strong nudge to “do the right thing”.
Nope, I know precisely when and where: When the ISO bros and gals started accepting patented technologies into ISO standards without precisely defining what “FRAND” is.
But let’s look at the bright side: the open web managed to steer clear of the licensing mess that is HEVC, and has instead chosen VP9 and AV1 as the next-generation codecs. Yes, H.264 is still big on the web, but that’s a legacy of Flash Player 10 (which came with a bundled H.264 decoder, which means video streaming sites have lots of content encoded in H.264), so H.264 in HTML5 was unavoidable. Also, I am fully aware that HEVC exists in DRMed 4K HDR content (for example Netflix), but if you use your PC to view DRMed 4K HDR content, you have already made peace with the fact your system needs to have proprietary plugins such as Widevine L1, so HEVC system codecs are not that big of a deal in the grand scheme of things.
kurkosdr,
I believe this might have been the point all along. At least for some of the members who influenced these decisions.
Similar things happened with “free” OTA TV. The new antenna broadcast specification (ATSC 3.0) snuck draconian, always-online DRM into the standard, while promising a way to allow older TVs to keep functioning. Unfortunately, it kills the entire DVR market, along with computer-based TV watching (like Plex).
They have successfully shut down the “analog hole”, and they are working to make sure the digital one is out of reach for regular people, available only under very limited and restricted circumstances (industry-approved devices, and even those subject to being blacklisted if necessary, or having features removed in updates). User agency removed in favour of increased vendor control.
I’m not expecting it to get any better soon. Hope I’m mistaken.
“You will own nothing, and you will like it” – was that the slogan?
ATSC 3.0 having DRM encryption is a uniquely US issue; free-to-view 4K HDR broadcasts in other countries are cleartext (and yes, you can record them to industry-standard .ts files with a set-top box). Something similar exists in the UK, with HD channels broadcasting their EPG encrypted and demanding that device vendors implement the “Freeview HD” DRM to get access to the EPG decryption keys, but again, this is isolated to one country (and unlike ATSC 3.0 DRM, the audio and video are cleartext and can be recorded manually or by getting EPG data from the internet). You get the point: those cases are the exception, not the rule. Even on the web, most free-to-view content is DRM-free. So, free-to-view content has largely been spared from DRM.
When it comes to pay-to-view content, each new resolution has introduced progressively harder-to-crack DRM, so nothing new here. Buy physical media if you want ownership, while physical media is still being made.
kurkosdr,
Yes, ATSC is a uniquely American thing (maybe Japan uses the same, but I’m not sure).
The reason the draconian ATSC 3.0 DRM was introduced is that we were “outsmarting” them.
(The “retransmission” loopholes – even though the courts sided against those services, like Aereo, which used “one antenna per person” and was still found liable.)
I find all of this frustrating; the public spectrum is assigned to these channels as a public good. They have to be accessible to anyone, but they use these ridiculous loopholes to skirt the meaning of the law while still technically staying within the letter of it.
Yes, this is one place where I agree DRM is a necessary evil. Nobody would offer “pay to view” if everything could be “taped” at will. (That is why I don’t personally make “backups” of things I borrow from a library either.)
Other countries are using ATSC 3.0 too:
https://en.wikipedia.org/wiki/ATSC_3.0#Countries_and_territories_using_ATSC_3.0
But DRM in free-to-view ATSC 3.0 channels is a uniquely US issue.
As you said, the Aereo loophole was declared illegal by the US’s highest court, but anyway, there were efforts to restrict home recording of free-to-view broadcasts in another country before (see the “Freeview HD” situation I described above). Let’s be real here: there is a business case for allowing people to view broadcast content for free, but not record it (since not allowing viewers to own a recorded copy of the content and/or skip ads is financially beneficial to the content cartels); it’s up to regulators to enforce no DRM in free-to-view broadcasts.
Yes, it’s a regulatory failure in the US. In most countries, broadcasts have to use ISO, ITU, and either ETSI or ATSC standards, and DRM is proprietary/closed-spec by nature, which is why free-to-view broadcast content has been spared from DRM in most of the world.
The silver lining is that the US regulator is accepting comments on the issue, so if you live in the US, you can comment:
https://www.youtube.com/watch?v=wEf2Jot7ZQM
(please watch the video, the uploader is on our side and tells you how to do it properly)
And yet, VHS and VCD didn’t have any kind of spec-defined DRM (any DRM on those was an ad-hoc kludge) and were massive successes that generated lots of cash for the content cartels. Also, you can find even UHD content (4K Blu-ray and streaming) available as a 1:1 copy at the usual places, so it’s not like DRM on audiovisual content is even effective against real piracy; DRM is only there to infringe on fair-use rights. So, it’s not a necessary evil, it’s just greed by the content cartels. But again, that fight ended in the HD era, when the DRM for HD content turned out to be more draconian than the one for SD content and people embraced it anyway. Just be glad free-to-view is DRM-free in most countries.
kurkosdr,
Thanks, I did not realize it was more than the USA – but it’s basically a few countries in the Americas… and, for some reason, South Korea?
In any case, thanks for sharing the link to make official comments. This should be heard more; they are trying to steal our public airwaves for completely private use.
sukru,
“You will own nothing, and you will like it” – was that the slogan?
Might as well be. I feel like “ownership” as a concept is slowly dying in general: digital content, phones and other products where owners don’t get the keys to their own devices, computers remotely imposing vendor control over owner rights, IoT devices that do the same. All of this sucks.
Even home ownership itself is in jeopardy. US officials are looking at 50-year mortgages…
https://www.redfin.com/news/50-year-mortgage-explained/
To the extent that this could actually become a reality, it will mean that those who don’t get help via inheritance could end up effectively paying rent to the bank from the moment they move in until they retire, and then needing to reverse-mortgage the house without ever having owned it outright. Home ownership increasingly requires multi-generational wealth. I mean, it is what it is, but it’s a depressingly regressive outlook compared to older generations.
Alfman,
This is because a certain older generation has wasted all its “inheritance” from the “golden generation”, has spent all it has accrued, and now wants to suck the lifeblood out of future generations to ensure a luxurious retirement and the ability to spend its “golden years travelling the world on a cruise”.
I might be harsh, but historically almost all generations ensured their kids were better off. However, in the USA, not only is the welfare of recent generations lower than that of their parents, the average IQ itself is measurably decreasing.
https://www.popularmechanics.com/science/a43469569/american-iq-scores-decline-reverse-flynn-effect/
Basically we are living in the universe where Idiocracy is a documentary.
We know exactly when and where. It’s software patents. Sometimes it’s very simple.
Luckily, 100% of the computers affected by this have hardware decode for VP9 and AV1, which streaming services other than Apple use over H.265 anyway.
While I am sure it is for selfish reasons, we should be applauding that this is happening. Freeing consumers from H.265 is a good thing.
The H.265 consortium raises prices by 20% and we are on their side?
From the article: “While many streaming services already favour other codecs, the change is significant for users working with H.265-encoded material”. Streaming services already offer AV1, and these laptops support AV1 decode in hardware – H.264 as well. H.265 has never been a standard codec in web browsers, while all browsers support AV1 (and H.264, of course). So, why do we need H.265, which embodies the worst forms of corporate greed and corrupt licensing? We should all be using something else.
This is one where I assume that the Free Software Foundation and I are on the same side. With AV2 imminent, we should be telling these encumbered format licensing pools to pound sand. I am happy to have these big companies on our side in that effort.
https://aomedia.org/press%20releases/AOMedia-Announces-Year-End-Launch-of-Next-Generation-Video-Codec-AV2-on-10th-Anniversary/
Where did things go wrong? How about when we started demanding that companies pay these evil consortiums on our behalf instead of supporting the open alternatives that exist?
I worked in video surveillance where H.265 has been the standard encoding format for a long time. So, I get there is a lot of hardware out there producing H.265 content. All the more reason to be pushing for a more open future.
I encode all my video as AV1. For PC playback or editing, I can keep it in that format even on ancient machines. For some devices, like my older TV, it gets transcoded on the fly to H.264 for playback. I actively avoid H.265 and am looking forward to AV2. So, I am quite happy to see H.265 getting dropped by popular laptop makers. They have my support.
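For the curious, that workflow can look roughly like the following with a reasonably recent ffmpeg built with SVT-AV1 (filenames, CRF values, and presets here are purely illustrative, not a recommendation):

```
# Archive copy: AV1 video via SVT-AV1, Opus audio
ffmpeg -i source.mkv -c:v libsvtav1 -preset 6 -crf 30 -c:a libopus -b:a 128k av1/source.mkv

# One-off transcode to H.264 for an older TV that can't decode AV1
# (a media server like Plex does essentially this on the fly)
ffmpeg -i av1/source.mkv -c:v libx264 -preset veryfast -crf 22 -c:a aac -b:a 160k tv-friendly.mp4
```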