Linked by lemur2 on Wed 9th Mar 2011 00:18 UTC
Multimedia, AV The WebM Project blog has announced an updated release of the VP8 Codec SDK, codenamed "Bali". The Bali release focused on making the encoder faster while continuing to improve its video quality.
RE[2]: Shouldn't the heading be...
by lemur2 on Wed 9th Mar 2011 12:39 UTC in reply to "RE: Shouldn't the heading be..."

""Google Releases New Version of VP8 Codec", seeing as though they're the only ones who can officially do anything with their "open" project...


I think the point being made here is that no one but Google (and ultimately the acquisition it came from) ever had any say in what VP8 or WebM was going to be. Whereas h.264 was a collaborative project among many companies, VP8 was created by a single company behind closed doors,
"

Up until the point of Google's acquisition, no-one would argue that VP8 was anything but closed and proprietary.

acquired by Google, released as final, and then adopted in an official capacity on the world's largest video site, also owned by Google. Given the circumstances, anyone who doesn't insist on conflating openness with freeness would have to admit that h.264 was, and likely remains, more open.


No way. WebM was opened by Google after its acquisition of On2, and from that point onwards the entire nature of VP8 changed. No longer was it closed, proprietary, and developed by a single entity; from that point on it became open and community-developed.

Initial reports also held that VP8 documentation was very poor (code is not documentation), though that may have changed now. This was (is?) a real problem -- a well-documented spec allows clean implementations to be created. Well-documented, actually-open specs breed superb projects like x264 and LAME, while poorly-documented, purportedly-open specs breed placeholder projects like Gnash and Mono.


No problem with this, and a specification is indeed being written and refined. However, you cannot expect a full-blown, correct specification to spring up out of thin air; it has to be developed, and that takes time.

I must reiterate (ad nauseam) that cost is a separate dimension from openness. They are independent variables that usually, but don't always, correlate. The latter does not depend on the former. (This, of course, does not fit into the populist all-or-nothing, good-vs.-evil model, where "open" has no real definition and is used instead simply to mean "good".)


No argument here ... openness has nothing to do with cost. It is not the fact that h.264 costs something that makes it not open; rather, it is that h.264 is not available for anyone to implement that makes it not open.

Given that the entire WebM product, top to bottom, is, in both patent and copyright, royalty free for all use cases, there's no practical disadvantage to there being only a single implementation, codebase-wise.


Whatever gives you the impression that there is only a single implementation? There are already many: the reference codec from the WebM project, the ffvp8 decoder in ffmpeg (which I believe came from the x264 developers), and a number of hardware implementations.

I think you might be confused by the concept of the "reference codec". It is not the only implementation; rather, it is the "gold standard" implementation. What this means is that if your decoder cannot play a WebM video encoded by the reference implementation, then you are doing it wrong, by definition. Likewise, if a video encoded by your implementation cannot be played by the reference implementation, then you are doing it wrong, by definition.

Get it?
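The conformance idea above can be sketched in a few lines of Python. The two decoder functions here are placeholder stubs standing in for real decoders (for example libvpx and ffvp8); an actual harness would invoke the real tools and compare the raw decoded frames, but the pass/fail logic is the same:

```python
import hashlib


def decode_with_reference(bitstream):
    # Stub standing in for the reference decoder (libvpx).
    # A real harness would run the decoder and return its raw YUV frames;
    # here we just chop the input into fake 4-byte "frames".
    return [bitstream[i:i + 4] for i in range(0, len(bitstream), 4)]


def decode_with_candidate(bitstream):
    # Stub standing in for the implementation under test (e.g. ffvp8).
    # In this sketch it happens to produce identical output.
    return [bitstream[i:i + 4] for i in range(0, len(bitstream), 4)]


def frame_digests(frames):
    # Hash each frame so large frames can be compared cheaply.
    return [hashlib.sha256(f).hexdigest() for f in frames]


def conforms(bitstream):
    # A candidate decoder conforms if every frame it decodes is
    # bit-identical to what the reference decoder produces.
    reference = frame_digests(decode_with_reference(bitstream))
    candidate = frame_digests(decode_with_candidate(bitstream))
    return reference == candidate


print(conforms(b"example VP8 bitstream"))  # True: the stubs agree
```

Since the stub decoders are identical, the check passes; swap a real decoder into `decode_with_candidate` and any frame that differs from the reference output makes `conforms` return `False`.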

But if Google's documentation is just code, then Google's code will be the only code (copyright-wise), and there's no practical advantage to being a supposedly open spec (patent-wise), except for that alluring $0 price tag, because it was (and may still be) impossible to create a copyright-clean reimplementation.


Sorry, but that is just a misunderstanding. Google's code is not the only code; it is merely, for the time being, until the specification documentation is solid and proven, the reference implementation against which other implementations must test themselves. Once another implementation tests correctly against the reference implementation, it can be said to be a WebM implementation.

Here are some links:

http://blog.webmproject.org/2010/10/vp8-documentation-and-test-vect...

http://blog.webmproject.org/2010/08/ffmpeg-vp8-decoder-implementati...

Google's implementation is called libvpx.
The ffmpeg implementation is called ffvp8.
"The ffvp8 implementation decodes even faster than the WebM Project reference implementation (libvpx), and we congratulate the FFmpeg team on their achievement. It illustrates why we open-sourced VP8, and why we believe the pace of innovation in open web video technology will accelerate."

http://www.osnews.com/story/23598/FFMpeg_s_ffvp8_the_Fastest_VP8_
Just three weeks after releasing their binary-compatible VP8 decoder, the FFmpeg team impresses us again, this time with a new benchmark of their own VP8 decoder. The new ffvp8 decoder, written independently using the pre-existing FFmpeg code base, is now the fastest VP8 decoder, with margins of more than 30% over Google's official codec, especially on 64-bit machines.


The in-progress specification:
http://www.webmproject.org/code/specs/

Hope this helps.

Edited 2011-03-09 12:59 UTC
