Linked by Thom Holwerda on Thu 30th Sep 2010 23:04 UTC
A few months ago, Google open sourced the VP8 video codec as part of the WebM video project, to create a truly Free/free unencumbered video format for the web as an answer to the non-Free/free patent-encumbered H.264 format. Today, Google launched a new image format for the web, WebP, which aims to significantly reduce the file size of photos and images on the web.
Privacy invasion included
by jkimball4 on Thu 30th Sep 2010 23:13 UTC
RE: Privacy invasion included
by pabloski on Fri 1st Oct 2010 10:07 UTC in reply to "Privacy invasion included"
pabloski Member since:
2009-09-28

They can do the same now. The image formats used on the internet have open specs, so they can rape your privacy NOW!!!

They don't need a new image format to know what you're doing.

Reply Score: 2

RE: Privacy invasion included
by Laurence on Fri 1st Oct 2010 11:08 UTC in reply to "Privacy invasion included"
Laurence Member since:
2007-03-26

Likely to make it easier for Google to parse the format, identify you and rape your privacy in one solid format.


All that information is easily parsed already from document tags.

Even that aside, advances in image and character recognition mean that the image format is irrelevant when scanning for usable information.

Reply Score: 3

LOL!
by poundsmack on Thu 30th Sep 2010 23:18 UTC
RE: LOL!
by vege on Thu 30th Sep 2010 23:25 UTC in reply to "LOL!"
vege Member since:
2006-04-07

You do realize that the sample image is a lossless PNG shot of the WebP image, so you can check it out quality-wise in your current browser.

Not even Chrome supports WebP rendering yet.

Reply Score: 17

RE[2]: LOL!
by Luminair on Sat 2nd Oct 2010 03:55 UTC in reply to "RE: LOL!"
Luminair Member since:
2007-03-30

Why do people say "you do realize" when they know the person didn't realize? You don't need to be a bitch about it.

Reply Score: 3

RE[3]: LOL!
by vege on Sat 2nd Oct 2010 04:05 UTC in reply to "RE[2]: LOL!"
vege Member since:
2006-04-07

Well, no harm was meant here, excuse me if it seemed so.
(I am not a native English speaker, and as such, I do not have a good sense of wording and phrases.)

If you found it offensive, please just read it as if it said "Note that ..."

Reply Score: 2

RE[4]: LOL!
by Kalessin on Mon 4th Oct 2010 22:18 UTC in reply to "RE[3]: LOL!"
Kalessin Member since:
2007-01-18

Oh, it's perfectly correct and normal English. It's just that when you think about what you're really saying it's a bit odd. It's like when someone says "here goes nothing" before doing something. Quite obviously, they aren't about to do "nothing," but that's what they say. There are plenty of common phrases out there that are pretty silly when you actually think about what they mean literally, and yet they're perfectly common and correct English.

Reply Score: 1

RE: LOL!
by lemur2 on Thu 30th Sep 2010 23:51 UTC in reply to "LOL!"
lemur2 Member since:
2007-02-17

Did anyone else click the "comparison" link and watch as the new format images loaded 2 times slower than the JPEG ones? hahaha. I did this in Opera and got a huge kick out of that. ;)


Your mirth is misplaced.

The pictures you downloaded are lossless PNG format, not WebP. You don't have any software which can display WebP on your machine, so the only way that Google can show you the quality of a WebP image is to take an uncompressed image, encode it in WebP, then re-encode it in a lossless format that you can display.

Edited 2010-09-30 23:52 UTC

Reply Score: 11

RE[2]: LOL!
by poundsmack on Fri 1st Oct 2010 04:13 UTC in reply to "RE: LOL!"
poundsmack Member since:
2005-07-13

Well, that's what I get for not reading it or paying any attention to what is directly in front of me.

Reply Score: 1

RE[3]: LOL!
by Boldie on Fri 1st Oct 2010 14:42 UTC in reply to "RE[2]: LOL!"
Boldie Member since:
2007-03-26

2 years ago I got my KDE-fanboyism whipped out of me after being too quick to draw a conclusion. Not pleasant. The OSnews crowd can be hard but is usually fair. ;)

Reply Score: 2

RE[4]: LOL!
by poundsmack on Fri 1st Oct 2010 20:25 UTC in reply to "RE[3]: LOL!"
poundsmack Member since:
2005-07-13

It's weird being the target of the mob for a change. I got used to being on the side with the torches and pitchforks ;)

Reply Score: 4

RE[2]: LOL!
by David on Fri 1st Oct 2010 16:38 UTC in reply to "RE: LOL!"
David Member since:
1997-10-01

Your mirth is misplaced.


You get my vote for the politest and most civilized smackdown in the history of the internet.

Reply Score: 7

Comment by mercury
by mercury on Fri 1st Oct 2010 00:02 UTC
mercury
Member since:
2009-01-24

Most are pretty similar, but there are a few exceptions that are less than subtle.

The photo of the NFL player (image 2) looks more saturated with the background changing from blue to aqua and the skin tone redder.

Image 7 goes from magenta to blue especially around the shoreline and piers.

Reply Score: 2

RE: Comment by mercury
by umccullough on Fri 1st Oct 2010 00:17 UTC in reply to "Comment by mercury"
umccullough Member since:
2006-01-26

The photo of the NFL player (image 2) looks more saturated with the background changing from blue to aqua and the skin tone redder.


I saw that in the thumbnails also - but when I opened both pictures, made sure my browser wasn't scaling them to the window size, and then flipped back and forth between them, it wasn't as obvious. Therefore, I suspect the scaling mechanisms for JPG and PNG are causing the differences in the thumbnails.

I did notice the words NFL on the mic have a slightly different set of artifacts, and the detail on his head seems sharper in the JPG. It's kinda hard to make an objective comparison anyway when the source picture was already lossy.

Image 7 goes from magenta to blue especially around the shoreline and piers.


Huh? Image 6? Again, the images look a lot different when my browser scales them than they do unscaled at full resolution.

Edited 2010-10-01 00:17 UTC

Reply Score: 3

RE[2]: Comment by mercury
by flanque on Fri 1st Oct 2010 11:49 UTC in reply to "RE: Comment by mercury"
flanque Member since:
2005-12-15

I tend to think there's an improvement in some of these images, but your point makes me wonder.

Reply Score: 2

RE[3]: Comment by mercury
by FunkyELF on Fri 1st Oct 2010 14:32 UTC in reply to "RE[2]: Comment by mercury"
FunkyELF Member since:
2006-07-26

There would be zero improvement.
The JPG was the source.

What they need to do is shoot RAW and export to TIFF or PNG. From the TIFF/PNG convert to both JPG and this new format.

That would be a comparison.

What they're showing here is that you can compress already-compressed files with some further loss (although a seemingly small one).
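Here is a minimal sketch of the kind of comparison being described: encode a single lossless source once as JPEG and once as WebP, then compare the resulting file sizes. It assumes Pillow is installed and that Google's cwebp command-line encoder is on the PATH; the file names and quality setting are placeholders.

```python
# Sketch of the proposed methodology: start from a lossless source
# (PNG/TIFF exported from RAW), encode it once as JPEG and once as WebP,
# and compare the two outputs, rather than re-encoding a JPEG.
import os
import subprocess
from PIL import Image

SOURCE = "source.png"   # hypothetical lossless export (e.g. from a RAW file)
QUALITY = 85            # same nominal quality knob for both encoders

# JPEG via Pillow
Image.open(SOURCE).convert("RGB").save("out.jpg", "JPEG", quality=QUALITY)

# WebP via Google's reference encoder (cwebp must be on the PATH)
subprocess.check_call(["cwebp", "-q", str(QUALITY), SOURCE, "-o", "out.webp"])

for path in ("out.jpg", "out.webp"):
    print(f"{path}: {os.path.getsize(path) / 1024:.1f} KiB")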

Reply Score: 3

RE[4]: Comment by mercury
by dagw on Fri 1st Oct 2010 20:51 UTC in reply to "RE[3]: Comment by mercury"
dagw Member since:
2005-07-06

Realistically though, compressing already processed jpegs is probably the most likely use case for the web. I have 20 thousand or so jpegs in my htdocs folder right now. If I can run a script over them and re-compress them in a new format and get smaller files without too much loss of quality, then I'm totally interested. If on the other hand I have to go back to the unprocessed original and start from scratch to get any benefits, then I kind of lose interest.
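A rough sketch of that batch job, assuming Google's cwebp tool is installed; the document root path and quality setting are placeholders, and as noted above this compounds the loss already present in the JPEGs.

```python
# Walk a document root, re-encode every JPEG as WebP with Google's
# encoder, and report the total size difference.
import os
import subprocess

DOCROOT = "/var/www/htdocs"   # hypothetical document root
old_total = new_total = 0

for dirpath, _dirs, files in os.walk(DOCROOT):
    for name in files:
        if not name.lower().endswith((".jpg", ".jpeg")):
            continue
        src = os.path.join(dirpath, name)
        dst = os.path.splitext(src)[0] + ".webp"
        subprocess.check_call(["cwebp", "-quiet", "-q", "80", src, "-o", dst])
        old_total += os.path.getsize(src)
        new_total += os.path.getsize(dst)

print(f"JPEG total: {old_total / 1e6:.1f} MB -> WebP total: {new_total / 1e6:.1f} MB")
```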

Reply Score: 4

RE[4]: Comment by mercury
by flanque on Sat 2nd Oct 2010 00:36 UTC in reply to "RE[3]: Comment by mercury"
flanque Member since:
2005-12-15

Well they do appear in certain spots to be less noisy.

Reply Score: 2

RE[5]: Comment by mercury
by chiefrain on Sat 2nd Oct 2010 22:58 UTC in reply to "RE[4]: Comment by mercury"
chiefrain Member since:
2010-10-02

You don't get the point: The mission of image compression is to reduce the size of the image with a minimal loss of information.
You may talk about different kinds of information loss, and you may prefer some kinds of loss over other kinds of loss. But you cannot talk about improvement. Any change of the image caused by the compression technique - whether you find it improving or not - is a change, and thus it is bad.
If you want to "improve" an image, you use other techniques (sharpening, blurring, scratch detection, etc.). And then you get a new "original" that you may try to compress with different compression techniques.

Reply Score: 1

RE[6]: Comment by mercury
by flanque on Sun 3rd Oct 2010 05:41 UTC in reply to "RE[5]: Comment by mercury"
flanque Member since:
2005-12-15

Again, some images in some spots look better to me - less noisy.

I don't have to get the point of compression or image manipulation to notice improvements.

Reply Score: 2

RE[2]: Comment by mercury
by Timmmm on Fri 1st Oct 2010 13:23 UTC in reply to "RE: Comment by mercury"
Timmmm Member since:
2006-07-25

By "Wasn't as obvious" you mean "wasn't there"? I think you guys are succumbing to the audiophile effect ("yeah, it definitely has more clarity and depth").

If I flick between them there is zero visible difference. I checked in matlab and the difference is really really really small (actually over half the pixels are identical).
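For anyone without MATLAB, roughly the same check can be done with Pillow and NumPy; the file names below are placeholders for the two gallery images saved locally.

```python
# Load the two comparison images, subtract them, and report how many
# pixels differ and by how much.
import numpy as np
from PIL import Image

a = np.asarray(Image.open("sample_jpeg.png").convert("RGB"), dtype=np.int16)
b = np.asarray(Image.open("sample_webp.png").convert("RGB"), dtype=np.int16)

diff = np.abs(a - b)                              # per-channel absolute difference
identical = np.all(diff == 0, axis=-1).mean()     # fraction of pixels with no change
print(f"identical pixels: {identical:.1%}")
print(f"max difference:   {diff.max()} (on a 0-255 scale)")
print(f"mean difference:  {diff.mean():.3f}")
```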

Reply Score: 2

RE[3]: Comment by mercury
by umccullough on Fri 1st Oct 2010 20:49 UTC in reply to "RE[2]: Comment by mercury"
umccullough Member since:
2006-01-26

By "Wasn't as obvious" you mean "wasn't there"? I think you guys are succumbing to the audiophile effect ("yeah, it definitely has more clarity and depth").

If I flick between them there is zero visible difference. I checked in matlab and the difference is really really really small (actually over half the pixels are identical).


Well, yeah - it wasn't visible enough to make a difference to me. Furthermore, if I saw them side by side, they would look identical. But when you're swapping back and forth between the two images in the exact same window you start to see very minor pixel-level differences. Nothing I would consider a "deal breaker" though. Mainly I was pointing out that the scaled-down thumbnail versions did indeed display more differences than I saw when I viewed the full-size images - and to indicate that I believe the browser scaling for different image formats may be different.

The worst part is that the JPG was used as the source for the WebP - that was a poor choice of source for producing comparison images.

Edited 2010-10-01 20:50 UTC

Reply Score: 2

RE: Comment by mercury
by tyrione on Fri 1st Oct 2010 01:45 UTC in reply to "Comment by mercury"
tyrione Member since:
2005-11-21

You're right. You can see the averaging trying to blend: where the JPEG shows a random scatter pattern, you now see groupings of non-linear shapes to cut down on the file size.

The same with his forehead.

Reply Score: 2

Ah, why 3 clause...
by TheGZeus on Fri 1st Oct 2010 00:53 UTC
TheGZeus
Member since:
2010-05-19

The format will forever be associated with Google, and any derivatives will be able to refer to the WebP format as their origin, so why do that?
Just MIT the thing...

Reply Score: 2

What? No love for JPEG2000?
by tyrione on Fri 1st Oct 2010 01:43 UTC
tyrione
Member since:
2005-11-21

How come Google isn't comparing it against JP2?

Reply Score: 2

RE: What? No love for JPEG2000?
by Valhalla on Fri 1st Oct 2010 02:08 UTC in reply to "What? No love for JPEG2000?"
Valhalla Member since:
2006-01-24

How come Google isn't comparing it against JP2?


Because this format is aimed at the web and thus 'competes' against standard jpg. Do you see many jpeg2000 images on the web? No, because they are computationally very expensive for what is roughly 20% better compression. Personally I think it's great for archiving high-resolution images, not so much for web surfing.

Reply Score: 5

Slambert666 Member since:
2008-10-30

Quoting one of the commentators on the site:

The examples page is lame... you can take the same jpeg images and save them with a higher compression level and get effectively the same reduction in file size.

For example I took "10.jpg", a 1.1 meg file, adjusted the jpeg compression and got it down to 189k with no visible loss of quality. That's an over 80% reduction in file size and I didn't have to change the file format.

The Web doesn't need a new file format, especially one that doesn't really do anything substantively different. WebP is no different than JPEG with a higher compression setting as the default.


So WebP does not even fare well against standard jpeg...

Reply Score: 6

bnolsen Member since:
2006-01-06

Possibly quite correct. The DCT is a pretty damn good image transform. The problem with it is computational cost and the ijg libraries frankly suck.

The only way for something like WebP to get any traction is if it's even simpler and easier to implement and computationally more efficient. Right now the Google pages about WebP seem to be out of commission so I can't look into this part myself.

Reply Score: 2

fithisux Member since:
2006-01-22

There is always the DWT (discrete wavelet transform).

Reply Score: 2

bnolsen Member since:
2006-01-06

jpeg2k is a pretty horrific, over-engineered, difficult spec to code, and it's computationally even more expensive than jpeg. Quality-wise jp2 isn't clearly better than jpeg, just different, with tradeoffs in what artifacts you get.

Reply Score: 2

Valhalla Member since:
2006-01-24


So WebP does not even fare well against standard jpeg...

Well, unless you have decided that this particular random guy on the interwebs is ABSOLUTELY CORRECT in his assessment despite only offering his own subjective perception, that comment really decides nothing. I'm looking forward to a real test by some experts, preferably using non-lossy compressed media to begin with. But even if webp turns out to be a lot more efficient than jpg in terms of size/quality I think it's going to be really hard to make a dent in jpg's dominance on the web. Heck, even gif files are still in heavy use despite png being a superior format, and even in its heyday gif was nowhere near jpg in terms of widespread usage (I am old enough to remember having fuzzy dithered gif porn images in my youth, kids today don't know what we oldtimers had to suffer through ;) ).

Reply Score: 3

Slambert666 Member since:
2008-10-30

Well, unless you have decided that this particular random guy on the interwebs is ABSOLUTELY CORRECT in his assessment despite only offering his own subjective perception, that comment really decides nothing.


You are of course 100% correct in that observation, but please remember that it is Google themselves who started this nonsense by comparing some random samples and making a gallery biased to make it look like WebP is "much" better than jpeg.
If in fact it is not much better but just a little better, would you use it?

Reply Score: 3

Neolander Member since:
2010-03-08

Heck, even gif files are still in heavy use despite png being a superior format

1/There isn't a single supported standard for animated PNG across all browsers.
2/Most PNG encoders bundled in image editors aim for quality and don't support artificially enforcing use of limited color palettes, so in the end you can make GIF much smaller than PNG when it's needed ;)

Edited 2010-10-01 07:30 UTC

Reply Score: 3

Almafeta Member since:
2007-02-22

JPEG is a pretty awesome format, all told. It's just that it's not used appropriately in 95% of all cases, and many of the programs that use it don't expose all the features of the format.

Reply Score: 1

Laurence Member since:
2007-03-26

Quoting one of the commentators on the site: " The examples page is lame... you can take the same jpeg images and save them with a higher compression level and get effectively the same reduction in file size. For example I took "10.jpg", a 1.1 meg file, adjusted the jpeg compression and got it down to 189k with no visible loss of quality. That's an over 80% reduction in file size and I didn't have to change the file format. The Web doesn't need a new file format, especially one that doesn't really do anything substantively different. WebP is no different than JPEG with a higher compression setting as the default.
So WebP does not even fare well against standard jpeg... "

The problem with that post is that a JPEG set to the highest compression level does look very crappy. I don't care what this internet anon stated, it's very noticeable.

So yes, you can compress a JPEG by ~80%, but there's a massive trade-off in image quality. Much like MP3 compression really, and such is life with any lossy compression formula.

Reply Score: 3

RE: What? No love for JPEG2000?
by MechR on Fri 1st Oct 2010 05:49 UTC in reply to "What? No love for JPEG2000?"
MechR Member since:
2006-01-11

How come Google isn't comparing it against JP2?

They did; see here:
http://code.google.com/speed/webp/docs/c_study.html

Reply Score: 2

RE: What? No love for JPEG2000?
by pabloski on Fri 1st Oct 2010 10:09 UTC in reply to "What? No love for JPEG2000?"
pabloski Member since:
2009-09-28

Jpeg 2000? You mean the zombie? ;)

Seriously, someone here is using jpeg2000?

Reply Score: 1

WereCatf Member since:
2006-02-15

Jpeg 2000? You mean the zombie? ;)

Seriously, someone here is using jpeg2000?


Seriously, did you have some point with your comment? Why shouldn't someone use JPEG-2000? It is computationally more expensive, yes, but it also produces slightly less artifacting and smaller files than regular JPEG. If you are going to use a lossy format for archiving purposes you may as well go for JPEG-2000.

Reply Score: 3

bnolsen Member since:
2006-01-06

With disk space so cheap png is the real winner here for typical digital camera image storage. Kind of like flac vs mp3: you want to archive lossless and re-encode as needed for whatever new portable device you have.

Reply Score: 3

troy.unrau Member since:
2007-02-23

I do, but I use it in lossless compression mode. In planetary science, it's a particularly useful format as it allows you to view segments of very large images at different zoom levels simply by evaluating different chunks of the image file. This is a huge advantage of using the DWT, especially when image sizes grow to be very large.

eg: HiRISE images ( http://hirise.lpl.arizona.edu/ ) come in at about 2.5 Gigapixels; jpeg2000 is perfect for viewing pieces of the image at different zoom levels.

That said, implementing jpeg2000 for web photos would be silly, as the web is currently designed. Pretty much all images on the web are viewed at 100% zoom; should that change, jpeg2000 would be useful. As it is, other algorithms are faster than the DWT used.

Reply Score: 2

When will IE adopt it natively?
by Liquidator on Fri 1st Oct 2010 05:16 UTC
Liquidator
Member since:
2007-03-04

I'm afraid IE users will see an "X" instead of images encoded in this format for decades to come...

Reply Score: 6

FunkyELF Member since:
2006-07-26

Javascript?

You'll have to store both formats of the picture but you'll save on bandwidth.

Then you can also put a watermark on the IE version that says "Use a modern browser". Come on, the format is already a day old.... why isn't it supported already?

Reply Score: 3

Laughing
by deathshadow on Fri 1st Oct 2010 07:11 UTC
deathshadow
Member since:
2005-07-12

That it's in a RIFF container, when Chrome won't even play raw unencoded PCM in a .wav file with its AUDIO tag...

Which is just a RIFF container with raw data in it... and here I thought they were into that whole "RIFF comes from the evil empire" mentality on that. Kinda surprised they didn't just gut Matroska for it like they did with WebM.

Lemme guess, right hand knows not the left?

Edited 2010-10-01 07:16 UTC

Reply Score: 3

RE: Laughing
by bnolsen on Sun 3rd Oct 2010 04:43 UTC in reply to "Laughing"
bnolsen Member since:
2006-01-06

Looking at Wikipedia, RIFF was created after AIFF (Apple). The one difference is that RIFF is little-endian and AIFF is big-endian.

Reply Score: 2

Talking about optimisation ...
by matako on Fri 1st Oct 2010 08:03 UTC
matako
Member since:
2009-02-13

It is easy to achieve 20%-40% JPEG size reduction simply by optimising quality and size for a particular page/content.

Reply Score: 1

There are better formats already
by hyper on Fri 1st Oct 2010 08:24 UTC
hyper
Member since:
2005-06-29

Ah... Google "doing what's best for all". Except that we do not need yet another format...

We already have JPEG2000 and JPEG XR: http://en.wikipedia.org/wiki/JPEG_XR

Technically I think JPEG XR > JPEG2000 > JPEG quality-wise, and WebP is probably worse than any of these except maybe JPEG.

I don't care if WebP "is open and free". In that case we have JPEG, which is good enough. Either we all change to something vastly superior to JPEG (i.e. JPEG XR) or we don't change anything at all.

Just my thoughts...

Reply Score: 8

FealDorf Member since:
2008-01-07

Exactly. Also, JPEG XR has other features such as lossy+lossless compression and HDR imaging while being OSS-friendly. WebP's only advantage is that Google owns the IP -- whether they wish to use it or not -- while JPEG-XR's IP is owned by Microsoft.

Reply Score: 2

Valhalla Member since:
2006-01-24

Exactly. Also, JPEG XR has other features such as lossy+lossless compression and HDR imaging while being OSS-friendly.

Microsoft released JPEG-XR under a licence that intentionally prohibits use in copyleft (GPL) licenced works.

Reply Score: 3

FealDorf Member since:
2008-01-07

That's for the library they released, not the specification itself. It does, however, permit BSD-licenced software, and I believe it can also work with the LGPL (I'm not sure of this though).

Reply Score: 1

Comment by hornett
by hornett on Fri 1st Oct 2010 08:43 UTC
hornett
Member since:
2005-09-19

Not sure why there is so much moaning about this.

Take a look at the final pic: fewer artefacts (look around the nose on the boat) and it is 66% smaller! Amazing!

Moaning about this is like moaning that h.264 is better than DIVX.

Reply Score: 3

RE: Comment by hornett
by Kroc on Fri 1st Oct 2010 08:48 UTC in reply to "Comment by hornett"
Kroc Member since:
2005-11-10

Might want to read this: http://x264dev.multimedia.cx/?p=541

Reply Score: 5

RE[2]: Comment by hornett
by Neolander on Fri 1st Oct 2010 10:17 UTC in reply to "RE: Comment by hornett"
Neolander Member since:
2010-03-08

You know, I'd tend to trust devs working on H.264 technology talking about their competitors just as much as Xiph devs talking about H.264.

Just sayin'...

Reply Score: 4

RE[3]: Comment by hornett
by FealDorf on Fri 1st Oct 2010 11:44 UTC in reply to "RE[2]: Comment by hornett"
FealDorf Member since:
2008-01-07

But trusting Google is no better, who'd definitely flaunt the format with exaggerations

Reply Score: 2

RE[4]: Comment by hornett
by Neolander on Fri 1st Oct 2010 12:28 UTC in reply to "RE[3]: Comment by hornett"
Neolander Member since:
2010-03-08

But trusting Google is no better, who'd definitely flaunt the format with exaggerations

Indeed. Testing should be done by a third-party who has *no* business interest in orienting the test results.

Reply Score: 2

RE[3]: Comment by hornett
by Fettarme H-Milch on Fri 1st Oct 2010 14:14 UTC in reply to "RE[2]: Comment by hornett"
Fettarme H-Milch Member since:
2010-02-16

You know, I'd tend to trust devs working on H.264 technology talking about their competitors just as much as Xiph devs talking about H.264.

Just sayin'...

Competitor? The very same guy also wrote a VP8 decoder for the ffmpeg project from scratch.
He's an active contributor to both worlds. I don't see him competing with his own software.

If you don't trust him, feel free to repeat his tests and compare the results.

Reply Score: 6

RE[4]: Comment by hornett
by Valhalla on Fri 1st Oct 2010 20:57 UTC in reply to "RE[3]: Comment by hornett"
Valhalla Member since:
2006-01-24


Competitor? The very same guy also wrote a VP8 decoder for the ffmpeg project from scratch.
He's an active contributor to both worlds. I don't see him competing with his own software.

If you don't trust him, feel free to repeat his tests and compare the results.


He is a competitor. He is pushing x264 as the web standard for video, and he and other x264 devs are trying to get permission to dual-licence x264 so that they can charge money for proprietary projects that want to incorporate x264. The fact that he and some other developers (he wasn't alone) wrote a vp8 decoder for the ffmpeg project does not change this.

No, good old Dark Shikari has a little too much personal interest in x264 vs webm/webp for my taste. Also, why the heck did he use a motion video frame as source? Smells like a tailored test, methinks.

I'll look forward to totally independent comparisons using a wide range of raw images (so that it won't favour either encoder) which can then be compared quality-wise between jpg and webp when encoded to (as near as possible) the same file size.

Like I said earlier though, even if webp would prove to be a better format in terms of size/quality than jpeg I seriously doubt it will gain any traction. Jpeg dominates the web by being supported everywhere and being 'good enough' in terms of quality/size. I believe a new format would have to be so much better (quality/size, no submarine patent threat, permissive licencing) that it's simply a no-brainer to do the switch from jpeg even with the pain of transition, and I really don't see that webp is or ever will be that. Time will tell.

Reply Score: 3

RE[5]: Comment by hornett
by Gusar on Fri 1st Oct 2010 21:35 UTC in reply to "RE[4]: Comment by hornett"
Gusar Member since:
2010-07-16

Dark Shikari doesn't care about "web video" per se, he's just interested in high quality.

And btw, x264 is *already* dual licensed and being sold. Lots of companies want it. And you know why? Because it's simply that good.

As another commenter said, if you don't trust DS's results, do your own test. These conspiracy theories regarding DS here at OSnews are quite silly to me.
(If that last sentence gets this comment downvoted, so be it. It won't change the fact that by constantly accusing DS of bias, you're doing nothing but showing *your* bias.)

Oh, and regarding using a motion picture... here's what Dark himself said (yes, it's advisable to read all the comments to his blog post):
"That video is taken on 65mm film by a camera that costs more than most houses — it is higher quality than almost any image taken by any “photo camera”. I highly doubt your average Creative Commons images even have a quarter the detail that an Arriflex 765 can take."

Edited 2010-10-01 21:39 UTC

Reply Score: 3

RE[6]: Comment by hornett
by Valhalla on Fri 1st Oct 2010 22:29 UTC in reply to "RE[5]: Comment by hornett"
Valhalla Member since:
2006-01-24

Dark Shikari doesn't care about "web video" per se, he's just interested in high quality.
If h264 becomes the 'web standard', he and other x264 devs stand to make more money from licencing x264 than they would otherwise.

so be it. won't chance the fact that by constantly accusing DS of bias, you're doing nothing but actually showing *your* bias)

Again, he has money to make on the success/dominance of h264. I am always VERY sceptical when people with a monetary interest claim their technology is much better than the competition.

Oh, and regarding using a motion picture... here's what Dark himself said (yes, it's advisable to read all the comments to his blog post):
"That video is taken on 65mm film by a camera that costs more than most houses — it is higher quality than almost any image taken by any “photo camera”. I highly doubt your average Creative Commons images even have a quarter the detail that an Arriflex 765 can take."

Well, A) those are his words, and B) it is ONE image, which hardly makes for a serious study by any measurement. Also, I wonder what codec the video was encoded in; if it was h264, then it would seem logical that in re-encoding a still frame from it x264 would be favoured, since it uses the same compression technique as the source (note, I do not know what codec was used for the original video from which the frame was captured, I am just assuming it was h264, I may be dead wrong here). Also, why did he use jpgcrush on the jpg image? It seems to me that he wanted to be able to use a higher quality setting for the jpg while keeping it at the same file size as the webp file. I don't know exactly what jpgcrush does (other than optimizing the file size, obviously), but I guess the added compression time from using it is why it's not normally part of standard jpeg compression, and chances are the same techniques used in jpgcrush could be applied to a webp image to make it smaller while keeping the same quality. Again, this test smells tailored to me, and I will await a serious study using a wide range of non-lossy images.

Reply Score: 3

RE[7]: Comment by hornett
by Gusar on Fri 1st Oct 2010 22:55 UTC in reply to "RE[6]: Comment by hornett"
Gusar Member since:
2010-07-16

He's already swamped with orders for commercial licensing of x264, whether h264 becomes the "web video standard" or not.

For the rest... as I said, make your own test. That's the only way you'll be sure. When WebM was released and DS made his blog post, people here were also accusing him of all sorts of things. But instead of coming up with conspiracy theories, I took my favorite clip (chapter 3 of Serenity) and encoded it with various encoders (x264, libvpx, xvid, theora-ptalarbvorm). My conclusion from that test was that libvpx totally and completely blows. And since it's the same libvpx making these images, I've no doubt in DS's results.
But don't take my word for it, or even DS's. Just *do your own test*.

Also, why go "wondering" about the source video? DS provided all the necessary links in his post. I suppose accusations are easier when you skip the provided info and can depend on "wondering".

Reply Score: 2

RE[7]: Comment by hornett
by Fettarme H-Milch on Sat 2nd Oct 2010 02:39 UTC in reply to "RE[6]: Comment by hornett"
Fettarme H-Milch Member since:
2010-02-16

Well, A) those are his words, and B) it is ONE image, which hardly makes for a serious study by any measurement.

Disprove him or shut up.
Your FUD spreading is lame. Go get a camera that can generate RAW files, go out, shoot photos, and compare WebP vs. JPEG.

Reply Score: 2

RE[3]: Comment by hornett
by vaette on Mon 4th Oct 2010 09:15 UTC in reply to "RE[2]: Comment by hornett"
vaette Member since:
2008-08-09

It is hardly an opinion piece; he does rather in-depth technical commentary and, more importantly, he produces the photos using freely available tools. Completely reproducible, and I must agree that WebP has a hard time matching up to even JPEG in his test:
http://x264.nl/developers/Dark_Shikari/imagecoding/vp8.png
vs.
http://x264.nl/developers/Dark_Shikari/imagecoding/jpeg.png

And, as is quite often noted, JPEG is more or less the worst case compression-wise these days. The introduction of WebP is a lot like trying to introduce a new audio compression method with the argument that it beats MP3, while failing to match Vorbis and AAC. Sure, JPEG is the standard choice on the web, but not because no one else has beaten it on quality.

WebM remains a good thing to have around (though its greatest victory already happened when MPEG-LA loosened the h264 licensing deal for web streaming in direct response). WebP seems rather unnecessary though.

Really heartening in some ways to see his link at the end, where Theora through its years of retuning actually does a much better job than VP8 at this task:
http://x264.nl/developers/Dark_Shikari/imagecoding/theora.png
I am hardly a huge Theora fan, but hats off to the Xiph guys for their hard work.

Edited 2010-10-04 09:17 UTC

Reply Score: 1

RE[2]: Comment by hornett
by hornett on Fri 1st Oct 2010 13:50 UTC in reply to "RE: Comment by hornett"
hornett Member since:
2005-09-19

Quite a damning review!

I take my comments back for the time being (although I suppose there is nothing to say that the compressor cannot be tuned/have more psy optimisation added in order to give better detail).

Edited 2010-10-01 13:53 UTC

Reply Score: 2

RE: Comment by hornett
by chrisfriberg on Fri 1st Oct 2010 13:42 UTC in reply to "Comment by hornett"
chrisfriberg Member since:
2009-04-08

Funny you mention that. How is it possible that the source image has more compression artifacts than their WebP version? I noticed this on image 5. Press Ctrl-+ until the images are enlarged and look on the left at the yellow bricks. On the JPEG version you can't make out the individual bricks, but on the WebP version you can. I noticed this on most images here. Most JPEG edge artifacts are gone in their WebP counterparts. Even if the originals were just scaled down, you should still always see less detail in recompressed images.

Reply Score: 1

RE[2]: Comment by hornett
by Glynser on Fri 1st Oct 2010 15:15 UTC in reply to "RE: Comment by hornett"
Glynser Member since:
2007-11-29

I guess they didn't use the JPEG on the left-hand side to create the WebP on the right; they probably just used the same source picture to create both.

EDIT: Apart from being a logical explanation, this would also be the approach that makes the most sense, as they want to compare the two codecs with each other.

Edited 2010-10-01 15:17 UTC

Reply Score: 1

smitty
Member since:
2005-10-13

To those commenting about how you can get similar compression rates from JPEG:

http://code.google.com/speed/webp/docs/c_study.html

Google took a random sampling of images from the web, then re-encoded all of them as JPEG, JPEG2K, and their new WebP format, all targeting the same PSNR of 40. Google's format was significantly better, especially for the smaller images.

Now, that doesn't mean this format is an automatic win. How important is another 20% compression compared to the problems associated with supporting a new format? How CPU-intensive is the decoding/encoding process? Etc. A third-party test/comparison would also be good, just to double-check that Google didn't forget anything important. One good thing is that most of the format is defined by VP8 already, so browsers will already have most of the code in them; it should just be a matter of hooking it up to the correct places to display the images.

Edit:
Another thing to be aware of: as Dark Shikari has pointed out, VP8 is optimized for PSNR. He shows that that's bad for quality, but it also might have skewed the results of Google's test a bit, since they were targeting a constant PSNR of 40 in all three formats.
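For reference, PSNR is just a logarithmic rescaling of mean squared error, so "targeting PSNR 40" means holding the average pixel error roughly constant across formats. A small sketch of the computation, assuming Pillow and NumPy and placeholder file names:

```python
# Compute PSNR (in dB) between an original and a re-encoded image.
import numpy as np
from PIL import Image

def psnr(original_path: str, encoded_path: str) -> float:
    a = np.asarray(Image.open(original_path).convert("RGB"), dtype=np.float64)
    b = np.asarray(Image.open(encoded_path).convert("RGB"), dtype=np.float64)
    mse = np.mean((a - b) ** 2)
    # PSNR of 40 dB corresponds to an MSE of about 6.5 on a 0-255 scale
    return float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)

print(f"PSNR: {psnr('original.png', 'reencoded.png'):.2f} dB")
```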

Edited 2010-10-01 08:56 UTC

Reply Score: 5

Timmmm Member since:
2006-07-25

Yeah but why on earth did they use JPEGs as the input, and optimise for PSNR?

Also, in their comparison gallery all the images are so lightly compressed that there are no differences at all. Totally useless.

Edit: Actually I checked by subtracting the images. They are exactly identical; they must have messed up somewhere.

Edit 2: Ahem, sorry they aren't exactly identical - the differences are nearly all less than 5 though, which is basically imperceptible. If you plot the difference image it looks pure black.

Edited 2010-10-01 13:24 UTC

Reply Score: 2

sakeniwefu Member since:
2008-02-26

This guy does a better job. He fucked the JPEGs up at first, but if he has got it right this time then it would match Google's graph and WebP would indeed look better for similarly sized files.

http://englishhard.com/2010/10/01/real-world-analysis-of-googles-we...

Reply Score: 2

Neolander Member since:
2010-03-08

Indeed, looks like it's the same thing as with independent H.264 vs VP8 comparisons: JPG has a hideous blocky rendering but keeps more detail, while WebP makes nicer-looking images at the cost of some blurring.

I prefer blur myself. In my opinion, being aesthetically pleasing as a whole is much more important for a picture than some detail in the wild disappearing. Moreover, JPEG/H.264 blocks are very annoying when doing image editing (they make tools based on contour detection fail miserably). And after all, those who really want details will only be satisfied with lossless formats anyway ;)

Edited 2010-10-02 08:09 UTC

Reply Score: 3

Already available on AmigaOS 4
by KalElFr on Sat 2nd Oct 2010 19:34 UTC
KalElFr
Member since:
2010-10-02

Thanks to the Datatypes system of AmigaOS 4, the WebP format is already available and working with older browsers (IBrowse or OWB) and with more recent ones (NetSurf), but unfortunately not with OWB since the latter doesn't make use of datatypes.

Moreover, every paint program or picture viewer can now display WebP images.

See http://www.amigans.net/modules/news/article.php?storyid=1242

Reply Score: 1

mrAmiga500 Member since:
2009-03-20

That's awesome. I don't know any other OS that can add new file support to every application just by putting a datatype file in a folder.

Reply Score: 1

Valhalla Member since:
2006-01-24

That's awesome. I don't know any other OS that can add new file support to every application just by putting a datatype file in a folder.

Well, Haiku sorta has the same ability. By writing a 'translator' for a file format all programs can automagically support that format.

Reply Score: 2

mrAmiga500 Member since:
2009-03-20

Oh yeah, you're right. I added Amiga IFF graphics support to BeOS that way... but besides Amiga and BeOS/Haiku, I don't think any other operating systems have that ability.

Reply Score: 1

And the real answer is...
by bnolsen on Sun 3rd Oct 2010 04:49 UTC
bnolsen
Member since:
2006-01-06

...higher bandwidth.

100MB/s should be common by now in the US. Sadly, most homes can't even get close to 10MB/s speeds, and many people don't even have a tenth of that.

Reply Score: 2