The requirement for a fast CPU to decode H.264 HD video content will soon be a thing of the past. ATI (slides on Xbit Labs’ site) is preparing a hardware-based decoder enhancement for future boards. With Texas Instruments, Analog Devices and other manufacturers releasing H.264 hardware decoders, it won’t be long before NVidia and other manufacturers catch up; however ATi is not the first manufacturer to offer this specific feature.
It's the same reason the Amiga fell behind: it relied too much on integrated chips. It's also why you are nowhere close to seeing a voxel or ray-tracing video card. Nobody will develop those, because consumers will already have an "integrated", good-enough solution.
Once something cool is made in software as a proof of concept, it will stay that way, never to go mainstream.
Remember when Dragons Lair and Zork III required a special MPEG decoding card? It sounds silly now, right? Well, it's just as silly with that "technology" today.
With everyone wanting computers to go more mobile with longer battery life, why not build more general-purpose computers?
So what's the big deal about HW decoding? HW decoding only matters for standalone players.
Probably. If you watch a lot of H.264 HD video on your Apple or Media Center PC, you would rather have the hardware decode it than depend on software decoders that are CPU hungry.
Many manufacturers currently building OEM and prototype boards are using DSP decoders which allow you to hardware-accelerate a codec (some support hardware-assisted encoding as well as decoding). Your current video card probably already has MPEG acceleration or decode assist acceleration of some sort. This is just the same thing, done smarter and with more flexibility, for next-gen HDTV codecs. Eventually we will see this integrated into a single chipset. It will just take time ;-).
It makes sense… you wouldn't want your CPU to peak above 3% usage when decoding a video; who knows how many fps that might cost you.
I guess it's better than nothing, as long as you don't have to pay for it. For example, I happen to have my ATI-powered WMV HW decoder deactivated, because it has completely messed up the colors of half of my HD videos for 5 driver revisions now.
Most of you are completely missing the point. The point here is that you need a VERY powerful CPU in order to decode H.264 video at full framerate (3.0 GHz P4 being a bare minimum I believe). Most consumer PCs aren’t powerful enough to decode this — and when you bring HTPCs into the question, it becomes obvious that hardware-accelerated H.264 is a good thing.
Many people use low-power CPUs such as the C3 for simple HTPC applications. A C3 can’t decode anything close to H.264 — with this it can, and still have many clock cycles left over.
Quit spreading FUD. ATI hasn’t had driver quality issues for at least two years now, possibly three. I deal with video cards every day, and from what I’ve seen, ATI’s drivers are higher-quality than NVIDIA’s.
Get a clue, then come back.
What happened to Nvidia’s GF6x00 MPEG4 HD hardware acceleration being programmable? Oh well, another “future proof” device letting the consumer down.
After Nvidia messed up the initial implementation of its existing MPEG4 HD hardware acceleration on NV40 cards, I can understand why they might not be so keen to jump in first this round.
> Quit spreading FUD. ATI hasn’t had driver quality issues for at least two years now, possibly three.
I guess you have not tried running Linux with the ATI driver? You may be right about ATI's Windows driver quality, but there is no quality in the Linux drivers.
> I guess you have not tried running Linux with the ATI driver? You may be right about ATI's Windows driver quality, but there is no quality in the Linux drivers.
Looks like ATI's driver development focus is not Linux but indeed _Windows_.
The good thing about this is that you can code your own FREE driver on your FREE time for your FREE os. Great! Get hacking!
But it is a drain.
My 1.25GHz Mac laptop manages it very well, but it's not happy doing other things at the same time (at large frame sizes, anyway).
Also, having custom hardware help with the decoding instead of a general-purpose processor doing the work consumes less power, as it is far more efficient.
D.
They continue to lose my money by not providing quality Linux drivers.
Hey, it’s Linux. Do it yourself, right?
I think ATI drivers suck. And very much so.
/happy user of Matrox cards with best drivers
> Hey, it’s Linux. Do it yourself, right?
Do you want to enter the competition for the most stupid OSNews post ever? Both ATI and nVidia drivers are closed-source and undocumented (and therefore untested, unreliable and often buggy). The reason is probably that in a GPL version we could see the 3D benchmark cheating stuff. The only difference is that nVidia provides binary x86 Linux drivers that (for many people) work, but, just as on Windows, they can freeze the complete system. Next time educate yourself before posting.
They sell hardware.
Now, I’ve heard that Linux users use hardware. More specifically videocards, just like the kind ATI sells.
..Do you see where this is going?
ATI has lost and will continue to lose on average $100/year due to me purchasing the brand with the better drivers.
Now, I'm only one person. Let's multiply that by 3 million and some random factor… In summary, there is money to be made.
If there wasn't, why is NVIDIA bothering to write drivers?
[Since I like NVIDIA’s drivers, my chipset/intaudio/intlan/video are all NVIDIA products.]
Quote from the article:
“however ATi is not the first manufacturer to offer this specific feature.”
Who was the first? Was it PowerVR (MVDA2 or MVED1)?
What other video card (chipsets) will offer this feature?
(I realize ATI & Nvidia are the only video cards that seem to count, but I would still like to know).
> The good thing about this is that you can code your own FREE driver on your FREE time for your FREE os. Great! Get hacking!
What a most brilliant idea! I bet no one has ever thought of that before. Now, why don't you go get ATI to publish the specifications so it's actually possible?
Most VGA cards come with 3D acceleration. For non-gamers, 3D acceleration is about as useful as lipstick on a pig. Yet they buy 3D cards anyway, because 1) they are a similar price to non-3D ones and 2) non-3D cards suck even at 2D display (e.g. no hw overlay). Offloading video decoding from the CPU at least makes the GPU worth the $$.
Linus is currently using a Mac for developing Linux. He switched from x86 to Mac about a year ago. Don’t expect him to write a driver for any x86 video cards anytime soon.
“I think ATI drivers suck. And very much so.”
Absolutely agree. ATI's idiotic "control center" needs video hw accel, or else it crashes. Win2k3 has video hw accel off by default; you need to run the "control center" to enable accel. The result is a wait/crash/restart loop.
I wonder when hardware manufacturers will stop putting f@cking gimmicks in the driver and instead make a stable, minimal, standalone exe driver and control interface.
> Linus is currently using a Mac for developing Linux. He switched from x86 to Mac about a year ago. Don't expect him to write a driver for any x86 video cards anytime soon.
Did he ever code a driver for a video card?
He got a Mac for free, and just because someone has a Mac doesn't mean he instantly throws his x86 away. I didn't.
Well, 1080p H.264 video doesn't play well at all on my 1.42GHz Mac Mini. 720p is almost playable. And on the Apple site it says H.264 HD requires a G5 CPU.
He’s not using a Mac, he’s using a G5 box powered by Linux. I don’t see how anyone could call that a Mac.
> ATI's idiotic "control center" needs video hw accel, or else it crashes. Win2k3 has video hw accel off by default; you need to run the "control center" to enable accel. The result is a wait/crash/restart loop.
I’m not sure about Win2k3 but on XP ATi has very solid drivers. I work for a game distributor and watch the support tickets for customers who call in with problems playing computer games.
The biggest problem card we see is the Intel integrated Extreme garbage, the 810E and 82845G.
Next up would be nVidia GeForce cards. Now granted, most nVidia problems are solved with a driver update, but that's just it: it's a constant job to keep an nVidia card updated so it works with all the new games, and a few standout games actually require older nVidia drivers to operate correctly.
Our least problematic card? ATi. I think I have seen 2 ATi issues in 7 months, one of which was a customer who never installed the driver to begin with.
Oh, and Win2k3 is a server OS, so it's not surprising the hardware accel is turned off.
“Most of you are completely missing the point. The point here is that you need a VERY powerful CPU in order to decode H.264 video at full framerate (3.0 GHz P4 being a bare minimum I believe). Most consumer PCs aren’t powerful enough to decode this — and when you bring HTPCs into the question, it becomes obvious that hardware-accelerated H.264 is a good thing.
Many people use low-power CPUs such as the C3 for simple HTPC applications. A C3 can’t decode anything close to H.264 — with this it can, and still have many clock cycles left over.”
Well, I've been able to watch the HD trailers on Apple's site with my lowly 1GHz G4 iMac running Tiger with no discernible frame skipping once the movie is completely downloaded (playback is choppy while the video is being transferred).
Yes. If someone gave me a dual G5 Mac I would use it too. I don’t know if I would keep OS X though.
The driver by itself is fine, although on its own it doesn't give access to many of the hidden hardware features via the standard device management interface. What's problematic is the "control center" and the various other Explorer add-ons and applets. The "control center" _REQUIRES_ hardware acceleration, and hardware acceleration can only be activated via the "control center", so there is a deadlock here.
Seriously, WTH did they put so much gimmick/eye candy in the "control center" that it needs hw accel? The whole point of having a control center is tweaking hardware features, not looking at the CG girl. Also, having the device tweaker as an Explorer hook is insane; a "control center" crash will bring down Explorer as well. These kinds of applications should always be standalone and minimalistic.
“Simple is beautiful” is always true for device driver development.
My Athlon64 (64-bit Gentoo) has to be overclocked to ~2.5 GHz in order to get smooth playback of the Serenity 1080p trailer. This is WITH frame dropping enabled. The mplayer (CVS version from about 5/25) console output usually reports about 3-5 dropped frames over the course of the 2-minute trailer. CPU-only decoding of HD h264 (1080p especially) takes some VERY serious hardware.
In Windoze 2K3, every multimedia feature is turned off by default in the OS. That's not the freaking driver's fault.
You have absolutely no idea what you're saying, do you? Stop spreading FUD or trolling stupid things without thinking carefully, please?!
If you had some form of hardware acceleration (not necessarily H.264 per se, but hardware video acceleration), then you wouldn't need the CPU running at 2.5 GHz. I can do 1080p just fine on an Athlon XP @ 2 GHz and an X800 XT PE.
@linear: Evidence of this benchmark “cheating” you speak of? Thought so. You have none.
@Dan: A 3 GHz P4 is dog-slow considering its inflated clock speed. I’m sure that a ~1.4 GHz G4 comes very close to a 3 GHz P4, especially if you throw AltiVec into the equation.
@Roscoe: Three million, multiplied by a random factor? A little optimistic, don’t you think? There is no chance that there are three million Linux users out there that care enough about 3D acceleration/know about 3D acceleration and in turn do not buy ATI cards. You’re forgetting that most Linux users are just 14-year-old kiddies who install Mandrake because they hear it’ll make them a l337 hax0r. I don’t estimate the number of Linux users who are concerned about hardware acceleration/drivers for ATI cards to be more than 100,000. The cost of developing, testing, and maintaining drivers for Linux overshadows the money ATI has to gain from doing such a thing.
Whether you like to admit it or not, Linux on the desktop is insignificant when it comes to raw numbers. There is no money in developing drivers for it — and display drivers are quite possibly the most complex kind to develop.
> It's the same reason the Amiga fell behind: it relied too much on integrated chips. It's also why you are nowhere close to seeing a voxel or ray-tracing video card. Nobody will develop those, because consumers will already have an "integrated", good-enough solution.
No, the reason it fell behind was that it didn't develop proper support for new hardware technologies, or for different but already very common ones such as chunky pixel displays, which were very important in games like Wolfenstein 3D and Doom, so it had to rely on heavy software conversion routines to handle this. The result? Up to 8 times as much CPU work for the same game as on even a mediocre PC with a standard VGA graphics card and a CPU on the level of a 68030 at 50 MHz. It took 2-3 years before the Amiga caught up with everyone else who had chunky pixel capabilities, because very smart and fast chunky-to-planar conversion routines (a naive version is sketched below) had to be developed to get proper speeds in Doom-style first-person shooters.
The fact that everyone else had proper hardware support for chunky pixel displays simply left the classic Amiga in the dust.
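For anyone who never had to write one, here is a minimal sketch (in C, my own naive illustration, not any actual Amiga routine) of what the software chunky-to-planar path has to do: scatter every bit of every chunky pixel into eight separate bitplanes. It is exactly this per-pixel bit shuffling that the fast, hand-tuned c2p routines were invented to avoid.

#include <stdint.h>
#include <stddef.h>

/* Naive chunky-to-planar conversion: one 8-bit chunky pixel per byte in,
   eight bitplanes out, each packing one bit per pixel (8 pixels per byte).
   The output planes are assumed to be zeroed beforehand. */
void chunky_to_planar(const uint8_t *chunky,  /* width*height pixels */
                      uint8_t *planes[8],     /* 8 planes of width*height/8 bytes each */
                      size_t width, size_t height)
{
    for (size_t i = 0; i < width * height; i++) {
        uint8_t pixel = chunky[i];
        size_t  byte  = i / 8;
        int     bit   = 7 - (int)(i % 8);     /* leftmost pixel ends up in the MSB */
        for (int p = 0; p < 8; p++)           /* scatter one bit into each plane */
            if (pixel & (1u << p))
                planes[p][byte] |= (uint8_t)(1u << bit);
    }
}

Run something like that over a 320x200 screen 25-30 times a second on a 68030 and it's easy to see where all the CPU time went.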
Had it been about MPEG playback, it would have been the same thing. It’s also going to be the same thing with h.264.
Hardware-accelerated h.264 is not only a good idea, it's essential if you want guaranteed smooth playback of HD content on today's CPUs without consuming most of the CPU power, while keeping power consumption to a minimum on a standard PC. That is of course why almost every modern graphics card out there supports MPEG and MPEG-2 playback in hardware, and why most PCs from the past 4-5 years, whether low- or high-end, can play DVDs smoothly out of the box. It's simply stupid to need a Dual G5 Powermac for something as "trivial" as movie playback.
“not necessarily H.264 per se”. But h264 is the codec in question here. 1080p HD resolutions are easily decodable with MPEG-2 or ASP MPEG-4 (XviD, DivX) codecs, even without acceleration from the video card, on a 1.5+ GHz CPU (Athlon/Apple speeds; ~2.5 GHz in P4 terms). But with h264 (aka MPEG-4 AVC) the better compression rates (~30% better on average) require much more intensive decoding work. Thus, to get internet-streamable HD video you need the higher compression levels provided by h264, and therefore either specialized decoding hardware or a hella fast CPU.
Back in the first days of DVD playback on PCs, even a higher-end Pentium II at 450 MHz could not play a DVD without dropped frames if you didn't have some kind of video acceleration for it (motion compensation in the GeForce 2 and later was the most common). But the situation slowly changed: you can now play HD-resolution MPEG-2 videos (at a data rate far higher than is usable over an internet connection) with minimal acceleration (YUV overlay) from the video card.
In short: specialized hardware is required now for HD-resolution h264 videos, but in a couple of years it will be considered quaint, and there will be a new codec pushing resolutions higher and data rates lower, requiring some insane decoding power.
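Just to put some back-of-the-envelope numbers on why h264 HD decode is so heavy (a rough sketch in C, nothing measured): at 1080p24, every one of roughly 8000 16x16 macroblocks per frame needs prediction (motion compensation or intra), the in-loop deblocking filter and, for Main/High profile streams, CABAC entropy decoding, and that adds up fast.

#include <stdio.h>

int main(void)
{
    const long width = 1920, height = 1080, fps = 24;
    long pixels_per_second = width * height * fps;                   /* ~49.8 million */
    long mb_per_frame = ((width + 15) / 16) * ((height + 15) / 16);  /* 120 * 68 = 8160 */
    long mb_per_second = mb_per_frame * fps;                         /* ~196,000 */

    printf("pixels/s:      %ld\n", pixels_per_second);
    printf("macroblocks/s: %ld\n", mb_per_second);
    return 0;
}

None of that is exotic work on its own, but doing all of it roughly 200,000 times a second in software is exactly the kind of load a dedicated decoder block shrugs off.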
Oh, ok — I see your point. 🙂
I use like… 5% CPU when watching two DVDs (at the same time, one muted), backing up over my network, and checking my e-mail.
Heh.. all in software too 🙂
BeOS.. with some goodies and replacements.. and a few things no one else has 🙂
–The loon
loon, you the man.
BeOS didn't need all the DSPs because the OS was designed properly for media manipulation. Still hanging out for Haiku to be released, because I really miss the days of BeOS and the potential of what personal computing could be.