Linked by Brooss on Tue 15th Mar 2011 23:32 UTC
Benchmarks A comment on the recent article about the Bali release of Google's WebM tools (libvpx) claimed that one of the biggest problems facing the adoption of WebM video was the slow speed of the encoder as compared to x264. This article sets out to benchmark the encoder against x264 to see if this is indeed true and if so, how significant the speed difference really is.
RE: Things we already knew about
by Alfman on Wed 16th Mar 2011 04:33 UTC in reply to "Things we already knew about"

">comparisons in quality are always somewhat subjective"

"No, they're not. Just use a pixel difference utility and do it objectively."

This test is highly appealing due to its simplicity; however, it is a little deceptive if we dig deeper.

Lossy compression algorithms deliberately throw away information we cannot perceive. This is by design, and should not be considered a fault. Creating an objective quality test turns out to be rather difficult.

Using audio as an example, consider two cases:
Audio codec A throws away very low and high frequencies which we cannot hear.
Audio codec B replicates the waveform best on paper, but introduces other audible artifacts.

The diff test would prefer codec B, but the human ear would prefer codec A.
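The two cases above can be sketched numerically. Everything here is a toy assumption of mine (the frequencies, the amplitudes, and the two "codecs"), not a real codec comparison; it only shows how a raw waveform-difference metric can rank the audibly worse option as better.

```python
import math

RATE = 48_000   # assumed sample rate
N = RATE        # one second of audio

def tone(freq, amp):
    """One second of a sine wave at the given frequency and amplitude."""
    return [amp * math.sin(2 * math.pi * freq * n / RATE) for n in range(N)]

def mix(*signals):
    """Sum several equal-length signals sample by sample."""
    return [sum(samples) for samples in zip(*signals)]

def rms_error(a, b):
    """Root-mean-square difference between two signals -- the 'diff test'."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / N)

# A 200 Hz tone plus a 19 kHz component most adults cannot hear.
original = mix(tone(200, 1.0), tone(19_000, 0.5))

codec_a = tone(200, 1.0)                    # drops the inaudible 19 kHz tone
codec_b = mix(original, tone(1_000, 0.05))  # waveform nearly intact, but adds
                                            # a faint, clearly audible whine

print(rms_error(original, codec_a))  # ~0.354: large on paper, inaudible change
print(rms_error(original, codec_b))  # ~0.035: small on paper, audible artifact
```

The diff metric scores codec B an order of magnitude better, even though codec A is the one a listener would prefer.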

The nuances are even more complicated with visual media.

Consider black lines on a white surface. A motion estimation algorithm might place the lines a couple of pixels out of place, but nevertheless produce a great visual image overall. The trivial diff test would fail to recognize the quality a human observer would perceive.
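This failure mode is easy to demonstrate. The toy images below are my own illustration (a 16x16 grid with one vertical black line): a per-pixel metric like mean squared error scores a perfectly sharp line shifted by two pixels as much worse than a line in the right place that has visibly faded to gray.

```python
# Toy example: a pixel-difference metric (MSE) versus perceived quality.

def make_image(line_col, line_value=0, size=16):
    """A size x size white image (255) with one vertical line at line_col."""
    return [[line_value if x == line_col else 255 for x in range(size)]
            for y in range(size)]

def mse(a, b):
    """Mean squared error between two equally sized images."""
    total = sum((pa - pb) ** 2
                for row_a, row_b in zip(a, b)
                for pa, pb in zip(row_a, row_b))
    return total / (len(a) * len(a[0]))

reference = make_image(8)                   # sharp black line at column 8
shifted   = make_image(10)                  # same sharp line, 2 pixels off
washed    = make_image(8, line_value=128)   # right place, but faded to gray

print(mse(reference, shifted))  # 8128.125 -- "much worse" per the metric
print(mse(reference, washed))   # 1024.0   -- "better" per the metric
```

A human observer would see the shifted image as essentially perfect and the washed-out one as clearly degraded; the diff test ranks them the other way around.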

">the gap never exceeds around 50% encoding time"

"Which is huge for a service like Vimeo or Youtube, that get thousands of encoding requests per minute. Time is money, and this would be a huge waste in resources."

I'm not particularly impressed by those numbers either, but we need to recognize that encoding performance may be less crucial than decoding performance and size ratios. The importance of each depends on the use case.

You mentioned mobile applications. Today's devices have more computing power than bandwidth, so it could make sense to sacrifice one to help the other.

The other example you mentioned was YouTube. If one is distributing millions of copies, then encoding speed isn't nearly as important as decoding speed and compression ratio.

Just to emphasize the point: what if we had an algorithm capable of compressing a typical DVD to 50MB with no perceivable quality loss, which took 48 hours to encode but played back on modest hardware? This would be of great interest to YouTube, Netflix, etc. The savings on bandwidth times millions of viewers would offset any inconvenience to the service providers.
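A back-of-the-envelope calculation makes the trade-off concrete. Every number here is an illustrative assumption of mine (the conventional encode size, the view count), not a figure from the article; only the 50MB comes from the hypothetical above.

```python
# Assumed numbers for illustration only.
conventional_mb = 700        # assumed size of a typical streaming encode
hypothetical_mb = 50         # the hypothetical 50 MB encode posited above
views = 1_000_000            # assumed number of streams of this one title

# Bandwidth saved across all views, converted from MB to TB.
saved_tb = (conventional_mb - hypothetical_mb) * views / 1_000_000
print(f"Bandwidth saved per title: {saved_tb:.0f} TB")  # 650 TB
```

Against savings on that scale, even a one-time 48-hour encode per title is noise, which is the point being made about distributors like YouTube or Netflix.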

I'm not familiar enough with these codecs to make any judgements, but I do believe it's likely the battle will be fought on political rather than technical grounds, assuming both codecs are "good enough".
