“In Windows 8, we reimagined the browser with IE10. We designed and built IE10 to be the best way to experience the Web on Windows. With the IE10 Release Preview for Windows 7 consumers can now enjoy a fast and fluid Web with the updated IE10 engine on their Windows 7 devices. The release preview of IE10 on Windows 7 is available for download today.”
https://www.scirra.com/blog/103/ie10-review-still-disappointing-for-…
The release of Windows 8 comes with a new Internet Explorer: version 10. A new IE is still a rare event; IE9 was released about 18 months ago, whilst both Chrome and Firefox now release updates every six weeks. We’ve reviewed it from the specific point of view of developing HTML5 games, especially with Construct 2. We’re not taking into account the interface or general browsing experience, just the technology that powers games. We find IE10 is still behind Chrome and Firefox in both performance and features, which means another frustratingly long wait before any hope of further progress from IE.
We are disappointed to see IE10 still does not support the free Vorbis audio codec, either in Ogg or a WebM container. The web desperately needs a single audio format that works everywhere, and it’s ridiculous that we are still having to dual-encode all audio to both AAC (for IE10 and Safari) and Vorbis (for all others). Vorbis is high-quality, robust, proven, free, and straightforward to implement. With such an obviously needed feature, why are Microsoft resisting it?
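The dual-encoding workaround described above is what sites have to automate on the client today: feature-detect with the standard `HTMLMediaElement.canPlayType` probe and serve whichever encoding the browser reports it can play. A minimal sketch; the `pickAudioSource` helper and the file names are made up for illustration:

```javascript
// Pick the first audio source the browser claims it can play.
// canPlayType is the standard HTMLMediaElement probe; it returns
// "probably", "maybe", or "" for a given MIME type.
function pickAudioSource(canPlayType, sources) {
  for (const src of sources) {
    if (canPlayType(src.type) !== "") return src.url;
  }
  return null; // no supported format: fall back to a plugin or polyfill
}

// The same clip dual-encoded, as the article describes:
const sources = [
  { url: "music.ogg", type: 'audio/ogg; codecs="vorbis"' },  // Chrome, Firefox, Opera
  { url: "music.m4a", type: 'audio/mp4; codecs="mp4a.40.2"' } // IE10, Safari
];

// In a real page: pickAudioSource(t => new Audio().canPlayType(t), sources)
```

In a browser you would pass `t => new Audio().canPlayType(t)` as the probe; taking it as a parameter just keeps the logic testable.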
Edited 2012-11-14 22:31 UTC
My hope is that Opus will find broad adoption eventually. It is not just an open codec with a great free implementation (in floating- and fixed-point variants), it is also part of the official IETF WebRTC standard. Microsoft was part of the development team (via Skype) and Opus was shown to be better than Vorbis, MP3 and AAC in listening tests most of the time. It is said to be the best low-bitrate and the best high-bitrate lossy codec all in one (well, it does consist of two sub-codecs).
I hope Opus support will come as an update to IE 10 (and thus also to Windows 7). But if not, I think it should at least be supported by IE 11.
Then those listeners must have been deaf, because right now, Opus is good for nothing except pure spoken word. Try encoding music with it, then try using AAC or HE-AAC (a decent AAC encoder, not FAAC) to encode the same source material. No matter the bitrate (unless you get it too high to matter) any music or other non-voice audio stream encoded via the free Opus encoder sounds like the audio is being filtered through grains of sand. Opus is a superior bitstream to AAC, but at the moment the encoder has zero psychoacoustical models to best offset the artifacts introduced by compression. Opus, at the moment, is much more of a high quality replacement for Speex than a replacement for MP3 and AAC.
If this is so, then for applications at the high-quality end one should probably still choose Vorbis for use over the web. Vorbis is better than MP3 but not quite as good as AAC.
Vorbis is the choice of the Wikimedia Foundation:
https://wikimediafoundation.org/wiki/Home
Because it is a well-performing, open, royalty-free codec, Vorbis is the natural choice for web audio. After many years, IE still does not support audio on the web properly.
Edited 2012-11-16 01:35 UTC
Every web browser has missing features from whatever standards are in place.
I could go on about all the irritating things that Chrome and Firefox don’t implement properly.
What you fail to understand is that it doesn’t matter; web developers will use a polyfill or similar to provide the audio functionality.
1) There is a preview Media Foundation plugin for Internet Explorer: http://www.webmproject.org/ie/
2) There is nothing stopping the other browsers from linking to the APIs (QuickTime on OS X, Media Foundation on Windows and GStreamer on *NIX) to obtain h264/AAC playback functionality.
Actually, there is. Web Standards are supposed to be royalty free.
http://www.w3.org/Consortium/Patent-Policy-20040205/
“The W3C Patent Policy governs the handling of patents in the process of producing Web standards. The goal of this policy is to assure that Recommendations produced under this policy can be implemented on a Royalty-Free (RF) basis.”
Having web standards being royalty free prevents anyone setting up a “toll-gate” for access to non-commercial content on the web. Anyone and everyone should be able to host and access (creative commons or public domain) media content over the web without having to pay any royalties to any third parties. Anyone and everyone should be able to build web infrastructure and clients without having to pay royalties for permission to do so. Having it this way is absolutely central to the original intent and purpose of the web in the first place.
Access to the web is defined as a human right:
http://www.wired.com/threatlevel/2011/06/internet-a-human-right/
http://www.globalpost.com/dispatch/news/politics/diplomacy/120706/u…
This doesn’t mean that commercial content is disallowed, it means only that it must be possible to access public content without having to pay royalties.
H264/AAC are not royalty free, and hence they are unsuitable for use as the ONLY media standards over the web. They CAN be used, but they must not be the ONLY means. This is not a problem for web standards since there are other codecs (namely WebM, Vorbis and Opus) which are as good or better performance-wise, and they are truly royalty free, anyone is allowed to implement them, and hence far more suited to be web standards.
Since WebM, Vorbis and Opus are royalty free, and anyone and everyone has full and irrevocable permission to implement them, what exactly is the excuse of Apple and Microsoft for failing to do so?
Edited 2012-11-15 07:40 UTC
As for Opus, it is actually a better audio codec than MP3, Vorbis or AAC.
http://www.opus-codec.org/comparison/
The Opus audio codec has just about everything covered. Since Opus is a totally open, royalty-free, highly versatile audio codec, which can handle a wide range of audio applications, including Voice over IP, videoconferencing, in-game chat, and even remote live music performances, and it can scale from low bit-rate narrowband speech to very high quality stereo music, why exactly wouldn’t Apple and Microsoft want to provide their customers with the best option?
http://www.opus-codec.org/
WebM does take longer to encode than H264 at the same quality level, but that is its only penalty. If one chooses encoding profiles to yield the same quality level, the WebM codec actually ends up with slightly lower bandwidth (in other words, a slightly smaller filesize) than H264.
Edited 2012-11-15 07:55 UTC
Have a WebM/H264 comparison to verify that?
Here’s a fairly thorough one that puts x264 above WebM’s standard encoder consistently, both in terms of quality and encoding speed (encoding speed seems to be about 3x faster for x264):
http://www.compression.ru/video/codec_comparison/h264_2011/mpeg-4_a…
The comparisons you linked all compare a chosen profile of one codec against another. At some particular profile, one codec will be better than another … but one can simply choose a higher profile for the second codec and it will be the other way around. The second codec at a higher profile will take longer to encode than the first profile chosen, but the resulting video will be better quality for the same filesize.
OK, so one can also choose a higher profile for the first codec, and now once again it will be better, once again at the cost of taking longer to encode. One can get a lot of combinations/comparisons this way, with each codec surpassing the other (in quality per bit) depending on the profile chosen.
There are multiple variables at play. Because the profiles are different for each codec, which profile to compare with which is somewhat an arbitrary choice. If one wants to make a reasonable comparison, one should eliminate at least one of the variables.
OK, so the best way to make a direct comparison between codec performance is what I alluded to in my original post. One chooses profiles for each codec such that they produce the same quality video at the same resolution and bitrate, and THEN one can compare the time to encode and the filesize.
If you do it that way, then WebM takes a lot longer to encode (to the same quality as a given profile of x264), but the resulting file size is a bit smaller.
Having said that, WebM time to encode is getting a lot smaller as the codec matures. The latest release (just over a week ago) is the fifth generation of the libvpx software (codename Eider) and sixth generation of the hardware (the G1 decoder “Fairway” and the H1 encoder “Foxtail”). The first letter of the codenames indicates the generation since the inaugural release.
http://blog.webmproject.org/
This new Nov 2012 release is probably at least two generations better than the version tested in the PDF file you linked. The gap in encoding speed (to the same quality of resulting video, with slightly smaller filesize) has probably narrowed considerably.
So, I repeat, the only penalty for using WebM is that it takes longer to encode. If the profile you are using for WebM gives you lesser quality than the one you have been using for x264, then simply choose a higher profile for WebM. Admittedly this choice will cost you in terms of the time taken to encode, but that time cost is the only penalty. Then again, it must be remembered that video is normally encoded once for many times that it is replayed, so if there must be one area of compromise, then encoding speed is the best area in which to take the penalty.
Edited 2012-11-16 01:08 UTC
Okay, let’s eliminate a variable: look at page 27, which fixes the quality to “high” for all encoders. Let’s assume this is the top for WebM: it uses --good --cpu-used=0, which is listed as an alternative for --best by the WebM encoding guide.
The quality for a given bitrate is always higher for x264. This means for the high-quality settings, for equally sized files, the x264 will be of higher quality (by the Y-SSIM metric they use). You will also spend 1/3 the time waiting for it to encode.
They won’t be equally-sized files; WebM gives slightly smaller filesizes for the same bitrate and resolution. So, in order to match the highest profile of x264, one must increase the bitrate for WebM. If one increases it just enough so that the filesize becomes equal, and the quality is equal, WebM must employ a slightly higher bitrate. In other words, once one exhausts the options of going to a higher profile, one can only increase the quality further by increasing the bitrate. Once again, one can then also increase the bitrate for the other codec as well (but then that increases the filesize), and we get into the exact same revolving door (since now more filesize is available for WebM, allowing it to go to an even higher bitrate), where one codec is better than the other, and all the while the time taken to encode (for both codecs) keeps increasing.
I told you there were multiple independent variables. This is evident from the very graphs you keep referencing. Very plainly, if one runs out of “higher profiles”, the way to further increase the quality is to increase the bitrate. Why is this so hard apparently for you to accept?
With WebM one does NOT have to suffer lower quality per bit (i.e. filesize) if one chooses not to, but the penalty that one must pay, as I have said twice now, is encoding time. There is no question that to get the same quality per bit WebM does take longer to encode. However, as I have said, that is the only penalty, and furthermore, as I have already pointed out, this gap in encoding time is reducing as the WebM software and hardware implementations mature. In any event, because videos are only encoded once per many times they are downloaded or viewed, encoding time is the best area of any in which to compromise.
Edited 2012-11-16 10:03 UTC
Am I misunderstanding something: how do you get different filesizes for the same bitrate? Bitrate is bits/time, for the same clip (time) you should get the same filesizes, modulo some header information for the codec.
What those graphs are describing are how the bitrate corresponds to the quality for a given profile. If you take the graphs as a whole they tell you how your quality will increase as you increase the filesize (bitrate) for a given clip, cheers.
http://en.wikipedia.org/wiki/Variable_bitrate
“Variable bitrate (VBR) is a term used in telecommunications and computing that relates to the bitrate used in sound or video encoding. As opposed to constant bitrate (CBR), VBR files vary the amount of output data per time segment. VBR allows a higher bitrate (and therefore more storage space) to be allocated to the more complex segments of media files while less space is allocated to less complex segments. The average of these rates can be calculated to produce an average bitrate for the file.
MP3, WMA, Vorbis, and AAC audio files can optionally be encoded in VBR. Variable bit rate encoding is also commonly used on MPEG-2 video, MPEG-4 Part 2 video (Xvid, DivX, etc.), MPEG-4 Part 10/H.264 video, Theora, Dirac and other video compression formats.”
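The quoted definition reduces to simple arithmetic: sum the bits spent per segment, divide by the duration. A quick sketch with made-up per-segment figures, just to illustrate how a VBR average and the resulting file size relate:

```javascript
// A VBR encoder spends a different number of bits on each segment.
// Illustrative 1-second segments of a 5-second clip, in kilobits:
const segmentKbits = [300, 1200, 450, 900, 650]; // complex scenes get more bits

const totalKbits = segmentKbits.reduce((a, b) => a + b, 0); // 3500 kbit total
const averageKbps = totalKbits / segmentKbits.length;       // 700 kbps average

// The file size is fixed by that average: bits = average rate x duration.
const fileKB = totalKbits / 8; // 437.5 KB of payload, ignoring container overhead
```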
The better the VBR compression, the higher the average bitrate can be stored within the same filesize.
WebM has better compression for relatively slow-moving video, such as this one:
http://www.youtube.com/watch?v=rLxQiI8c1Bs
If WebM dedicates the same number of bits to such a clip as H264, the still frames captured from the WebM rendered video playback will be sharper and better quality than H264.
In areas and segments of high motion, WebM throws a lot of data away, and when you look at a still frame, whatever part of the picture is moving quickly will be rendered as a blur. H264 will be sharper, but it will exhibit artefacts. This gives WebM a poorer score in objective measures such as PSNR, but since the human eye sees high motion as blur anyway, subjectively it looks fine.
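For reference, PSNR as mentioned here is just a log-scaled mean squared error between the original and the decoded frame, which is why heavy blurring in one region can drag the score down even when the blur is invisible at normal playback speed. A minimal sketch for 8-bit samples (the frames here are toy arrays, not real video data):

```javascript
// PSNR between two frames of 8-bit samples: 10 * log10(MAX^2 / MSE).
function psnr(original, decoded, max = 255) {
  let mse = 0;
  for (let i = 0; i < original.length; i++) {
    const d = original[i] - decoded[i];
    mse += d * d;
  }
  mse /= original.length; // mean squared error over all samples
  if (mse === 0) return Infinity; // identical frames
  return 10 * Math.log10((max * max) / mse);
}
```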
Cheers.
Edited 2012-11-17 07:12 UTC
How can you have a “higher average bitrate” within the same filesize?
It’s like saying you and I both ran a marathon (filesize) in the same amount of time (clip running time), but you went faster.
One average bitrate can’t be higher than the other for a clip of the same length and filesize; they are the same.
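The marathon analogy holds up as plain arithmetic: the average bitrate is derived from file size and duration, not chosen independently of them, so two clips of equal size and length necessarily share the same average rate. A quick check with made-up numbers:

```javascript
// Average bitrate is derived, not chosen: total bits / total seconds.
function averageBitrate(fileSizeBytes, durationSeconds) {
  return (fileSizeBytes * 8) / durationSeconds; // bits per second
}

// Two hypothetical encodes of the same 120-second clip, both 15 MB:
const a = averageBitrate(15 * 1024 * 1024, 120);
const b = averageBitrate(15 * 1024 * 1024, 120);
// Same size, same duration: identical average bitrate,
// whatever the two codecs did internally with VBR allocation.
```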
As to your other point: so you have no benchmarks (even screenshots?) to support your claim that WebM has higher quality/bitrate than x264? Okie.
Edited 2012-11-17 07:34 UTC
With VBR, AFAIK the bitrate refers to the video as rendered, not the video as compressed. Because WebM static compression quality per bit is better than h264, and because most video is lower motion rather than higher motion, WebM can deliver a higher (as rendered) bitrate from the same number of as-compressed bits.
Where WebM suffers in terms of objective measurements is in areas of high motion. Because WebM (deliberately) blurs these high-dynamic areas, so as not to waste too many as-compressed bits, they compare very poorly between the rendered still frames and the original still frames, and cause WebM to score poorly on objective measures such as PSNR, even though to the human viewer’s eye, watching the video at normal speed, the blurring of high-motion areas has little detrimental effect on the perceived quality.
Due mostly to the blurring of high-motion video, a real-life WebM video can easily be objectively measured in terms of Y-SSIM and PSNR as being lower quality than an H264 video, yet still be preferred subjectively by a human viewer watching the video at normal playing speed. In addition, if you take a still of the same frame during a low-motion scene from WebM and H264, the WebM still frame will be distinctly clearer and sharper, but on some other frame during a high-motion scene, the H264 still frame will be far clearer and sharper than the WebM one.
So the perceived quality and the measured quality can be quite different.
I did have some screenshots of this which illustrated the point very well, but I can no longer find them. Sorry about that.
I am not, BTW, claiming that WebM is better than H264. I am merely claiming that it performs differently, and for the purposes of video over the web, just as well as h264.
Edited 2012-11-17 10:24 UTC
That’s not what bitrate refers to.
The output bitrate is clearly constant: the number of bits per frame is just bit-depth*width*height (times fps to get the bitrate). You don’t change the bit-depth and dimensions of your video dynamically, as VBR would imply if it were the “output bitrate” that was being measured.
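That uncompressed figure is easy to compute; taking 24-bit RGB 1080p at 30 fps purely as an illustrative case:

```javascript
// Uncompressed ("output") bitrate: bits per pixel x pixels per frame x frames per second.
function rawBitrate(bitDepth, width, height, fps) {
  return bitDepth * width * height * fps; // bits per second
}

// 24-bit RGB, 1920x1080, 30 fps:
const bps = rawBitrate(24, 1920, 1080, 30);
// ~1.49 Gbit/s, which is why the compressed bitrate is the one anyone quotes.
```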
You wouldn’t want your movie bigger and smaller as the bitrate changed.
WebM does not have higher quality/bit(rate), x264 does, and that’s EXACTLY what those graphs in the study say.
Additionally, the author of x264 has a great breakdown of the WebM/VP8 codec; basically the short version is: it’s alright, but misses some of the psy features (probably patented, blah) that H264 has.
http://x264dev.multimedia.cx/archives/377
This guy knows his stuff; he even implemented a VP8 encoder, so I think his bias, if any, is pretty non-existent.
According to your definition, VBR doesn’t exist, and it is not possible to have a video at a certain output bit-depth*width*height*fps with fewer bits.
Your definition is clearly nonsense. Variable Bitrate means the bitrate varies (even though the output bit-depth*width*height*fps does not).
Now if I only need a certain bitrate to get a certain quality for part of a video, and I actually use more bits, then I am using a higher bitrate, and I get better quality as a result. That is how it works, chum. The very graphs you keep referencing say so.
Edited 2012-11-18 07:42 UTC
No, my definition allows VBR to exist, but it’s clearly not used for the output bitrate, just the compression bitrate. You said that the bitrate refers to the “video as rendered”, my explanation was to inform you as to why that couldn’t be the case.
You’re repeating something correct in the second half of your reply: that x264 has a higher quality/bit/second than WebM. I’m glad you got to the correct conclusion eventually.
BTW, I don’t argue that WebM is better than x264, because it simply isn’t. There is a penalty to pay (somewhere) if you use WebM.
What I do say is that quality is NOT necessarily where one has to pay the penalty. It is always possible instead to pay the penalty in time to encode.
What I also claim is that, given the objective of, and indeed the fundamental RIGHT to, an open web, it is by far preferable to pay the penalty for using WebM (especially if one chooses time to encode as the currency of that penalty) than it is to pay the penalty for using H264/AAC (which is the encumbrances of royalties to be paid, and restrictions to open competition).
Yes and we’d all love to live in a world where unicorns crap jelly beans, all the southern states have gay marriage and I have a boyfriend but unfortunately we live in reality and what we have is the best of the worse. There is a reason why h264/AAC is chosen and all the conspiracy theories in the world won’t change the fact that it is chosen because it is best fit for the mixture of narrow and wide band connections that exist – from broadband mobile through to cable internet, from ADSL2+ to fibre optic.
Secondly, I provided a link to the WebM plugin for Media Foundation, which would simply be an extra download for someone wanting to use said format. When it comes to Apple, why should they step out of what is pretty much an industry standard? Pretty much all the large companies have agreed on it, so why go against the grain? What’s in it for them?
I have the plug-in for IE — which works just fine with IE 10.
But finding a WebM video in the wild is quite the challenge. There doesn’t seem to be anything out there but a transcode for YouTube.
There seems to be no such thing as a hardware product — amateur or pro — that supports WebM natively. While H.264 hardware is available for every video application you could name.
The final problem is HEVC. The next-generation proprietary codec. There are huge potential savings in bandwidth here for all users and video providers.
I beg to differ.
http://blog.webmproject.org/
Under the heading: “Sixth Generation VP8 Hardware Accelerators Released”
I quote: “The VP8 hardware cores have now been licensed to over 80 chip companies, and both the decoder and encoder are in mass production from a number of partners.”
Every single Android device since Android 2.3 Gingerbread supports VP8 & WebM. Every single Android device since Android 4 Ice Cream Sandwich supports WebM decoding in hardware.
Android activations now number over 1.3 million per day. There are now over 500 million Android devices (not all of them are Android 2.3 or better, I grant you, but most of them are).
http://androidandme.com/2012/09/news/android-activations-now-at-1-3…
Every new video uploaded to Youtube is encoded to WebM. Over 99% of videos viewed can be viewed as WebM.
http://www.engadget.com/2011/04/20/youtube-starts-transcoding-all-n…
Google’s entire infrastructure, obviously, supports WebM. If one chooses HTML5/WebM for YouTube, one now effectively suffers no penalty for doing so.
Ubuntu is heading towards 9% of the PC market. Ubuntu supports WebM.
http://www.thevarguy.com/2012/11/06/open-source-ubuntu-os-makes-str…
You are a long, long way behind the times.
Edited 2012-11-16 10:50 UTC
Ha! We might end up with gay marriage at some point. The view of “gays” down here is getting pretty good believe it or not and the older generations are dying off.
I live in Alabama and couldn’t give a shit, so I always vote “yes” to anything relating to it. Because the limitation is so pointless and ridiculous.
It’s not a matter of quality, it’s a matter of inertia.
H.264 is around because the various companies that were involved in the matter decided to support it at a time when the only alternative was Theora (which, indeed, made sense). And AAC went along for the ride because it comes for free with H264 video support.
Changing to any other codec will be painful now, because no one took the time to make a proper codec-agnostic video decoding infrastructure in web browsers and SoCs. Hacking away hard-coded support for one codec is simply faster and cheaper. Thus, I am ready to bet that by the time HEVC is around, even if it is as good as the MPEG-LA claims it to be, it will encounter exactly the same issues as WebM today.
If audio and video quality was truly an issue, everyone would be using Vimeo over Youtube
Edited 2012-11-16 07:14 UTC
http://ontwik.com/html5-2/internet-explorer-the-story-so-far/
If you actually watch this, you will realise why they are always behind. Microsoft’s enterprise customers like sticking to a version number so they can test their apps against it, and they like the stability.
What these devs don’t tell you, or don’t know, is that they release a platform preview every 6-8 weeks so they can test more features and possibly get more features in before the cut-off.
While it isn’t ideal, Microsoft have their reasons for their release schedule.
Also while IE9’s and IE10’s JS engines aren’t as fast as the competition … they are a hell of a lot better than IE7 & 8.
Edited 2012-11-15 08:24 UTC
This is akin to saying that the current worst-of-the-bunch is a lot better than the previous worst-of-the-bunch. It holds for not only the JS engines, but also for the rest of the browser.
It isn’t much to crow about, is it? With the first release of IE6, Microsoft had the best browser client of any vendor. Microsoft had clearly lost that lead by October 2006 with the release of Firefox 2.
http://en.wikipedia.org/wiki/History_of_Firefox#Version_2
They have been a long way behind other leading browsers ever since.
This is clearly reflected in the market share trends:
http://gs.statcounter.com/?PHPSESSID=j2juf5bil673j4vrso39eijui6#bro…
IE10 doesn’t appear to be the version of IE that can arrest this persistent slide, especially if it is the worst-performing modern browser, provides the fewest features, and misses out on basic things like open codec support that could easily have been provided at next-to-zero cost.
Edited 2012-11-15 09:22 UTC
While you may scoff, we now have all major browsers with decent JS engines, that is a massive win for all web developers.
Who cares, that is now ancient history.
Not anymore.
IE 9 is a pretty fast browser and a good default browser for the world’s best-selling desktop operating system, Windows 7.
Nobody really uses browsers because they do better in the Sunspider benchmark.
http://news.softpedia.com/news/IE9-Usage-Share-Bigger-than-Those-of…
Err … no.
There are reasons why Microsoft moves at a slower rate with IE as I have already pointed out in my previous comment.
TBH, open codec support is pretty minor, with the majority of the web still using Flash Video.
Major HTML5 features, and the fact that it can now do strict mode for ECMAScript, are far more important.
Being a Web-developer, these are the features we want … I have no interest in some idealistic fight over a video codec.
I have written my recommendations in my blog post here
http://luke-robbins.co.uk/video-on-the-web/
And nothing has changed.
If Web Developers want IE to have better standards support, they should use the platform previews and give feedback to Microsoft (via bug reports).
If you want to troll, actually try to do better than quoting hipster web developers … who will hate on IE no matter what.
Edited 2012-11-15 11:20 UTC
Bug reports and other requests for standards support are ignored for years by IE developers. There have been incessant requests for support for SVG and PNG that were ignored by the IE team for over a decade, and bug reports which alluded to the lack of support for standards were simply marked as “will not fix” or “will not implement”. Vorbis format was standardised in 2003 and in 2006 it was proposed as a web standard for audio, yet it is still not implemented in IE to this very day. IE is no less than six years behind the times.
Instead, Microsoft pushed proprietary methods such as WMV and WMA, ActiveX and Silverlight, in a transparent attempt to make it all-but-mandatory to use IE (and hence Windows) as the only way to access the “full, rich-content” web. “Rich content” is Microsoft’s own term, BTW. Such attempts should rightly be vehemently opposed by all fair-minded people.
The very saddest thing in all this is that you apparently think I am trolling in this. Let me assure you I am not. Microsoft’s attempts to make the web a walled garden for access only by using Microsoft products as clients is amongst the very worst of their monopolistic anti-trust behaviour, and it has been so for decades. Even a simple-minded fool knows this to be so.
Edited 2012-11-15 11:59 UTC
Maybe a decade ago, but if you actually watched the video I have linked, they are actually trying to get developers to contribute.
Also they have good reasons for not implementing some standards. Unlike other browsers’ development teams (cough cough Chrome), they don’t implement features badly just for the sake of it; they learned that lesson after IE6 (IE6 worked with the draft standard at the time, and the standard was changed significantly afterwards).
No it isn’t; I could probably point to a lot of other features that have just turned up in browsers in the last few years (font-face has been present since IE 4.0, I believe).
Picking one possible web standard and saying IE is behind is ridiculous.
Also IE8 was the first browser to implement CSS 2.1 and XHTML 1.1 correctly.
In the past they have yes, pushed stuff like this. I agree it is shitty. However that is simply not the case anymore.
If you actually watch the video they are actually listening to developers and know they simply cannot do that anymore.
There is a massive push in the .NET community to support web-standards as well, coming from Microsoft.
I actually work in this industry and use Microsoft products to make web applications that conform to standards (I actually really care about web standards). Before .NET 3.0 it was very difficult to do things that would conform to standards … now it is easy.
It doesn’t matter what I say on the subject, you will doggedly keep the same opinion no matter what.
So you are either trolling, or you are deliberately being ignorant. I don’t care which.
I actually work in this industry. I have seen a massive change in the way Microsoft has been pushing its ASP.NET web stack since 2008. A lot of it is inspired by the Ruby and Python communities.
Let’s not forget they made the whole MVC stack Open Source, and they even now have a package manager that makes it easy to use Open Source projects in your web application.
All you are doing is simply repeating criticisms that are almost a decade old and simply aren’t true anymore.
Edited 2012-11-15 12:53 UTC
“Bug reports and other requests for standards support are ignored for years by IE developers.”
Anyone that works in software gets used to this, and it isn’t MS-specific. There just aren’t enough hours or brainwaves in the day to fix everything. You triage, and sometimes you ignore. Sad but true.
Several years ago I encountered a rendering bug in FF, and when I reported it I was stunned to find it was already almost ten years old (probably better than 15 years now), persisting from early Netscape days. It had been actively reported and updated the whole time. Some helpful users had even submitted patches. I bet it is still there.
Edited 2012-11-15 21:28 UTC
I hope MS has made it easier to add a trusted page in IE 10. IE’s security zones are actually decent if you bother to use them, but the process of adding a trusted page is a little click-intensive; users could do with a “Trust this page” menu entry somewhere.
Also it would be cool if IE could be set to auto-refresh trusted pages as they were added…
Even if they didn’t add this stuff though, it’s kind of amazing how far IE has come since the days of Windows XP.
“We designed and built IE10 to be the best way to experience the Web on Windows”
It sounds so cheesy …
Seriously, this is just downright pathetic. I mean… come on. Microsoft has 100% control over both Windows and IE. Research, development, design decisions, testing, advertising, marketing, sales, licensing, source code, and the product in general. Windows 8 came out with IE10, what, over half a fucking month ago? And still, they’re dicking around trying to get IE10 running on Windows 7? WTF?!?
Meanwhile, Windows is regarded as being (and I won’t deny, it is) the operating system with the absolute highest level of binary backwards compatibility not just from release to release (ie. Win98 to WinXP, Win7 to Win8), but also from many versions back (Win9x-XP, XP-7, hell… even Win9x to Win7). Honestly, in that aspect, nothing comes close; Mac OS X is an absolute joke, and Linux/BSD are almost fully based on open source software so binary compatibility may not be great, but because the source code is available and thanks to software repositories it doesn’t really need to be.
And here we are, looking at Microsoft struggling to fucking backport the current latest version of one single god damn application to their previous (as in, still commercially supported) version of Windows. And yet… they own the source code to both the OS and the web browser. Hello?!? Microsoft, is anyone there?!
Am I the only one who smells something extremely fishy with this? Something as in, “we want you to upgrade to the latest version of Windows to get the most recent Internet Explorer, so… uh, yeah! That means we can just sit on our asses at home all day, kind of like what we did with IE6 as we held the web back for years and let it sit to rot!”
Shit, even Firefox (Iceweasel) 10.0.10 ESR is in Debian Backports for Squeeze, an OS released in February of last year, and Debian doesn’t even develop the browser! Come on Microsoft, get off your asses, there are people out there paying some damn good money for your crap. Quit using the development builds of your browser to watch porn and just get it finalized and release something for once.