Intercorporate bullshit screwing over consumers – a tale as old as time.
Major laptop vendors have quietly removed hardware decode support for the H.265/HEVC codec in several business and entry-level models, a decision apparently driven by rising licensing fees. Users working with H.265 content may face reduced performance unless they verify codec support or rely on software workarounds.
↫ Hilbert Hagedoorn at The Guru of 3D
You might be wondering just how high these licensing fees are, and by how much they’re increasing next year, that laptop OEMs feel compelled to remove features to avoid paying them. The HEVC licensing fee is $0.20 per device, and in 2026 it’s increasing to $0.24. Yes, a $0.04 increase per device is “forcing” these giant companies to screw over their consumers. Nobody comes out a winner here; everyone loses.
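To put that in perspective, here’s the entire calculation. The per-device rates are the ones reported; the shipment volume is a round number made up purely for illustration.

```typescript
// Back-of-the-envelope math on the fee increase described above.
// The per-device rates come from the article; the shipment volume is a
// hypothetical round number, purely for illustration.
const feePerDevice2025 = 0.20; // USD per device today
const feePerDevice2026 = 0.24; // USD per device from 2026
const assumedAnnualUnits = 10_000_000; // made-up OEM laptop volume

const increasePerDevice = feePerDevice2026 - feePerDevice2025;
const extraAnnualCost = Math.round(increasePerDevice * assumedAnnualUnits);

console.log(`Increase per device: $${increasePerDevice.toFixed(2)}`);
console.log(`Extra cost per year at ${assumedAnnualUnits.toLocaleString()} units: $${extraAnnualCost.toLocaleString()}`);
// → $0.04 per device, $400,000 per year on ten million laptops: real money,
//   but a rounding error against the revenue those laptops generate.
```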
We took a wrong turn, but nobody seems to know when and where.

Such a wonderful world. As if there aren’t enough reasons already to steer away from these proprietary codecs for absolutely everything.
HEVC is not proprietary; it’s open-but-royalty-encumbered. Proprietary formats do not have an openly available spec, and you may not be able to license the patents from the patent holders at all, no matter how much money you’re willing to pay. Apple’s proprietary audio formats come to mind. And even in the cases where they do let you license the patents, they may impose weird licensing requirements, like not being allowed to offer encoders for wide use, or weird playback restrictions (the MQA format comes to mind; its restrictions were there to hide the fact that it wasn’t lossless).
In plain English, open-but-royalty-encumbered formats are a PITA, but proprietary formats are much worse.
That said, the fact that the HEVC patent holders were able to split into 8 separate patent licensing groups, with each group charging its own “reasonable” rate (and those “reasonable” rates stacking into a not-so-reasonable total royalty an implementer has to pay), shows how weakly “FRAND” is defined.
Does it really affect anyone, though?
An Intel Core Ultra 5 135U can decode 8K video seamlessly using the CPU alone.
Anyone with a dedicated GPU can easily surpass that.
The only niche I can think of is very low-end chips like the N150 in systems like NAS boxes.
But those aren’t the chips used in the devices we’re talking about, which are still more than capable of decoding the 1080p and 4K video most office environments would need.
Feels to me more like the greed of the HEVC licensors made the OEMs actually look at this tiny cost they’d been paying out and realise they didn’t need to pay it anymore.
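For what it’s worth, if you want to see what your own machine’s media stack offers, ffmpeg will happily tell you. A minimal sketch, assuming ffmpeg is installed and on your PATH; note that a decoder being listed only means your ffmpeg build includes it, not that your particular GPU can actually use it (`ffmpeg -hwaccels` is the complementary check):

```typescript
// Sketch: list the HEVC decoders a local ffmpeg build exposes, and flag
// whether any of them are hardware-assisted. Hardware decoder names vary
// by platform and build (hevc_qsv = Intel Quick Sync, hevc_cuvid = NVIDIA,
// hevc_v4l2m2m = some ARM SoCs, and so on).
import { execSync } from "node:child_process";

const output = execSync("ffmpeg -hide_banner -decoders", { encoding: "utf8" });

// Decoder lines look like " V....D hevc_qsv   HEVC (Intel Quick Sync)".
const hevcDecoders = output
  .split("\n")
  .map((line) => line.trim().split(/\s+/)[1])
  .filter((name) => !!name && name.startsWith("hevc"));

// The plain "hevc" entry is the software decoder; everything else is a
// hardware-assisted variant compiled into this build.
const hardware = hevcDecoders.filter((name) => name !== "hevc");

console.log("HEVC decoders:", hevcDecoders.join(", ") || "(none)");
console.log(hardware.length > 0
  ? `Hardware-assisted variants built in: ${hardware.join(", ")}`
  : "Software decoder only; sustained playback will hit the CPU.");
```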
There are two main use cases I can think of: video editing on battery power, and playback of DRM-protected streams. The former may be niche, but I know that some video editors make really good use of hardware decoders to speed up scrubbing through a timeline. The latter adds insult to injury: streaming services that use DRM typically also require hardware decoding for the highest resolutions.
EDIT: And absolutely, it’s greed through and through. I expect this is a driver-level change that only affects Windows. Greed from the patent trolls, greed from the PC manufacturers wanting to differentiate product lines, etc.
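On the DRM side, you can at least ask the browser whether it would hand you a hardware-backed Widevine pipeline at all. A rough sketch using the Encrypted Media Extensions API; the robustness strings are Widevine-specific, and browsers differ in what they report, so treat the result as a hint rather than gospel:

```typescript
// Browser-side sketch: probe whether Widevine is available with a
// hardware-backed robustness level (roughly "L1") versus software-only
// (roughly "L3").
async function probeWidevine(robustness: string): Promise<boolean> {
  const configs: MediaKeySystemConfiguration[] = [{
    initDataTypes: ["cenc"],
    videoCapabilities: [{
      // Example HEVC Main profile codec string; any codec string the
      // platform recognises would work here.
      contentType: 'video/mp4; codecs="hvc1.1.6.L123.B0"',
      robustness,
    }],
  }];
  try {
    await navigator.requestMediaKeySystemAccess("com.widevine.alpha", configs);
    return true;
  } catch {
    return false;
  }
}

console.log("Hardware-backed Widevine:", await probeWidevine("HW_SECURE_ALL"));
console.log("Software-only Widevine:", await probeWidevine("SW_SECURE_DECODE"));
```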
The main problem is not video editing or video encoding on battery power, but video playback on battery power (for example, Netflix in 4K).
Which, ironically, used to be one of HEVC’s main strengths against VP9 and AV1: pretty much all devices that claim to do “4K playback” can decode HEVC in hardware, while VP9 and AV1 decoding has to be done in software on older devices and will, as a result, drain the battery much faster. But now HEVC will drain the battery on some new devices. Insane.
HEVC’s remaining strength against VP9 and AV1 is that early 4K TVs (late 2010s) can do HEVC playback in hardware, while their weak SoCs can’t do VP9 or AV1 software playback at all (not for 4K content, anyway). So streaming providers that charge extra for 4K HDR streaming (Netflix) will likely keep using HEVC so those customers don’t downgrade to the HD subscription tier.
Ah, I see you’ve mentioned video playback in your post but I skipped it while skim-reading.
Nope, I know precisely when and where: when the ISO bros and gals started accepting patented technologies into ISO standards without ever precisely defining what “FRAND” means.
But let’s look at the bright side: the open web managed to steer clear of the licensing mess that is HEVC, and has instead chosen VP9 and AV1 as its next-generation codecs. Yes, H.264 is still big on the web, but that’s a legacy of Flash Player 10 (which came with a bundled H.264 decoder, which means video streaming sites have lots of content encoded in H.264), so H.264 in HTML5 was unavoidable. Also, I am fully aware that HEVC exists in DRMed 4K HDR content (for example, Netflix), but if you use your PC to view DRMed 4K HDR content, you have already made peace with the fact that your system needs proprietary plugins such as Widevine L1, so HEVC system codecs are not that big a deal in the grand scheme of things.
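If you’re curious where your own browser lands on these codecs, the Media Capabilities API will tell you, including whether decoding is power-efficient (which in practice usually means hardware decode). A small sketch; the codec strings are representative examples, not the only valid ones:

```typescript
// Browser-side sketch using the Media Capabilities API: for each codec,
// report whether 4K decode is supported and whether the browser considers
// it power-efficient (usually a sign of a hardware decode path).
const candidates: Record<string, string> = {
  "H.264": 'video/mp4; codecs="avc1.640028"',
  "HEVC": 'video/mp4; codecs="hvc1.1.6.L123.B0"',
  "VP9": 'video/webm; codecs="vp09.00.50.08"',
  "AV1": 'video/mp4; codecs="av01.0.08M.08"',
};

for (const [name, contentType] of Object.entries(candidates)) {
  const info = await navigator.mediaCapabilities.decodingInfo({
    type: "file",
    video: {
      contentType,
      width: 3840,
      height: 2160,
      bitrate: 20_000_000, // ~20 Mbit/s, a plausible 4K bitrate
      framerate: 30,
    },
  });
  console.log(`${name}: supported=${info.supported} powerEfficient=${info.powerEfficient}`);
}
```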