Mozilla has released Firefox 7. Unlike Firefox 5 and Firefox 6, which were relatively minor upgrades to the browser, Firefox 7 includes a number of significant improvements, the most important of which is probably the drastically reduced memory usage.
…yet another f***ing inflated version number.
And Chrome inflating itself to version 14 (and updating itself far more often) is surely “twice as worse” than Firefox is? It is interesting to note how Mozilla is beginning to “hide” the version numbers of its products – the download buttons no longer reveal the version and if they could remove it from the “About…” box, I bet they would!
However, although they’ve been copying some of Chrome’s features recently (and not all the copied Chrome ideas have been good ones, mind you), Mozilla has yet to copy the idea of “silent updating” that Chrome uses. This is why you see people whine incessantly about new Firefox releases (normally every 6-8 weeks unless there’s an emergency fix) and no-one complains about the many more updates Chrome gets (every 2 weeks typically).
Also don’t forget that IE on Windows often gets a monthly update as part of “Patch Tuesday” – again, more often than Firefox, but no-one complains about that either (mainly because MS don’t bother changing the major version number and it’s “bundled” with a bunch of other updates, which are often all automatically applied).
So, yes, Firefox increases its major version number at about the same rate as Chrome, but as with Chrome, just learn to ignore the version number. I think Mozilla should have used a year.release version scheme (e.g. 2011.01 for the first release of the year, and so on – that gives them 99 releases a year [which could include alphas and betas in between stable releases], which should be more than enough). That way, no-one would moan about the major version changing, because that would only happen once every 12 months!
Bingo.
Version numbers are retarded anyway when it comes to browsers.
Version numbers are retarded when it comes to consumer software that doesn’t have any platform characteristics.
Browsers have extensions, but maybe the solution is to either promise backward compatibility or use a distinct version number for the “extension platform”, visible only to developers and power users.
Anyway, I still see the need for version numbers when it comes to libc and linux images.
They plan to. But of course the version will not be gone completely. It will still be shown in about:support (Help > Troubleshooting Information in the menu).
But that means that they would go from 7 to 2011. Inflation of almost 30,000%!
Thanks for the advice.
Who cares. Memory reduction is very useful. Thanks Mozilla.
I have an 8 GB, Win7 64-bit, quad-core Phenom box here, and I installed Firefox 7.0 on it. As a Google Chrome user, I tested Firefox right away with my usual browsing habits: opening tabs, typing a search query. Then it suddenly became very slow; I couldn’t even click the tabs and had to wait 10 to 15 seconds. I have never experienced this in Google Chrome, unless there are too many tabs open.
Forget about the memory reduction; it isn’t useful if it compromises stability. Stability is more important to users. Now I’m back at Google Chrome. Wasted time is a turn-off.
I have no idea what causes it, but I get a similar experience: whenever I fire up Firefox for the first time, it’s usable for a few seconds, then it completely locks up for about 30 seconds, during which it won’t respond to anything. After those 30 seconds, though, it doesn’t do it again.
It’s a nuisance, but since I only fire up the browser about once a day it’s not that big a deal for me. I can imagine it being more of a problem for people who need to restart their browser/PC often, though.
Try increasing the parameter browser.sessionstore.interval under about:config.
http://www.techmynd.com/firefox-freezes-for-a-moment-after-every-10…
On some systems the default value for this parameter (10 seconds) causes some problems. The value is stored in milliseconds, so perhaps try a value of 60000 (60 seconds). This might solve the issue for you.
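If you prefer to keep the setting in a file rather than flipping it in about:config, the same pref can go into user.js in your Firefox profile directory (the file name and user_pref syntax are standard Firefox; the value is just the 60-second suggestion above):

```javascript
// user.js in the Firefox profile directory
// Save the session every 60 seconds instead of the default 10 seconds.
user_pref("browser.sessionstore.interval", 60000);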
You didn’t read what I said completely: it doesn’t freeze periodically; it freezes only when I first launch it, and stays frozen for up to 30 seconds, after which it works fine until I restart it.
WereCatf,
This happens every time you start FF (and not just once per day/session)?
Is there any disk/network/cpu activity happening in the meantime?
If CPU activity is normal, then it implies Firefox is waiting on I/O for some reason; you could try “strace firefox” to see exactly what it’s trying to do. While it’s hung, check whether moving the mouse causes spurious events. If yes, then FF is receiving input but is not acting on it.
Given your problem, you will either see FF blocked on one single call for >20s, or see it looping over a sequence of calls for >20s.
For me, while it’s running normally, I see this over and over again.
gettimeofday({1317197393, 220526}, NULL) = 0
read(4, 0xb7384058, 4096) = -1 EAGAIN (Resource temporarily unavailable)
gettimeofday({1317197393, 220613}, NULL) = 0
poll([{fd=5, events=POLLIN}, {fd=4, events=POLLIN}, {fd=9, events=POLLIN|POLLPRI}, {fd=19, events=POLLIN}, {fd=20, events=POLLIN}], 5, 0) = 0 (Timeout)
The “read(4…)” is FF checking the X-server for input. It returns EAGAIN because this is a non-blocking socket with no input.
The “poll(…)” is FF checking if any of its descriptors have input (I have no idea what these descriptors are, other than 4).
Dev Commentary: The read call is totally redundant with the poll call and should only be triggered by the poll. Furthermore, I’m not keen on how FF is constantly timing out while it’s not doing anything useful; ideally it would schedule a background timer instead of maintaining its own clock, as it appears to be doing with “gettimeofday”.
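For anyone unfamiliar with those two syscalls, here’s a minimal Python sketch (assuming a Linux-ish system) that reproduces both results from the trace: the EAGAIN from a non-blocking read with no data, and the zero-timeout poll returning nothing.

```python
import errno
import os
import select

# A non-blocking pipe read end stands in for Firefox's X-server socket (fd 4).
r, w = os.pipe()
os.set_blocking(r, False)

# read() on an empty non-blocking fd fails immediately with EAGAIN,
# matching the "read(4, ...) = -1 EAGAIN" line in the trace above.
try:
    os.read(r, 4096)
    got_eagain = False
except BlockingIOError as e:
    got_eagain = (e.errno == errno.EAGAIN)
print(got_eagain)  # True

# poll() with a 0 ms timeout returns an empty list when nothing is
# readable, matching the "poll(...) = 0 (Timeout)" line.
p = select.poll()
p.register(r, select.POLLIN)
events = p.poll(0)
print(events)  # []

os.close(r)
os.close(w)
```

A poll-driven loop would only call read() on an fd after poll() reported it readable, which is why the unconditional read in the trace looks redundant.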
It occurred to me you may not be using linux…oh well.
Did you try it?
As the name browser.sessionstore.interval implies, this parameter is the interval at which Firefox tries to save the current session, for the purposes of a possible future session restore.
Session restore itself only occurs when you start the browser. If, on starting, Firefox (on your machine) is not fast enough to restore the ‘as saved’ session within the browser.sessionstore.interval period, then Firefox will be trying to save and restore a session at the same time. Only on startup.
PS: Another thing you can easily do to reduce the startup time on slow systems, and thereby avoid similar freezes, is to clear the browser cache and then drastically reduce its size. On my under-powered netbook, I cleared the browser cache and reduced it from the default of 1024 MB to just 100 MB. This resulted in a great improvement in startup time.
I changed the value to 30 seconds and it seems to be working.
That is some serious brainfart; even a moderately experienced coder should be totally ashamed of himself. It’s not hard to enable session saving AFTER a session has been restored.
Who cares? It’s just the version number, a completely arbitrary indicator that “here’s a new version, some things have changed”.
No, there was a time you could look at the version number and tell if there were some major changes or not. You know, things like major compatibility and UI changes. Firefox’s version number has only recently become “completely arbitrary”… since, what, Firefox 4, 5? Never mind the whole incompatible extensions mess that is brought to extreme levels with these rapid version number inflations.
V1.5 was more like a V1.1 – V3 was more like a V4 in comparison with V2 – V3.5 was meant to be V3.1 but became too big – V3.6 was more like a service pack to V3.5 so maybe make V3.5 a V4 and V3.6 a V4.1 – V5 should be V4.1 and V6 should be V4.2 but V7 should be V5.
Should we really use version numbers as an indication of how many changes were made? Do you think 3.6 broke less extensions than 5?
Firefox version number has nothing to do with compatibility changes.
Mozilla has been desperately asking authors of extensions to avoid inspecting the version number to try to infer anything about compatibility.
In practice, it does for users: extensions stop working or worse, they are deactivated by FF without the warning we were given with FF 4. I’ve seen this with the Java Console, the .NET thingy and other plugins. Now, because we have no obvious way to get rid of those system-wide extensions, they sit there in the extensions list.
Extensions that stop working are the most annoying thing about FF.
Mozilla should have come up with a better way for extension developers to do what they think they’re doing by testing version numbers. Yeah, I know it’s very easy to say in retrospect. If the stubborn devs keep doing what they are now told not to do, it’s because they’ve been given the means to do it in the first place.
Run Firefox as an administrator.
http://www.pcworld.com/article/213114/remove_old_versions_of_the_ja…
https://addons.mozilla.org/en-US/firefox/addon/add-on-compatibility-…
https://addons.mozilla.org/en-US/firefox/compatibility/reporter
Many thanks.
No, it’s because they’re either too lazy or not good enough.
That’s what release notes are for. It’s not like the version number will tell you exactly what changes there are and what problems you can expect.
I guess the extension authors have to write better code then.
Yes, but it’s nice when the version number gives you a little hint as to what to expect.
UZ64 said: …yet another f***ing inflated version number.
Well thanks for yet another really useful and insightful comment. You must put a great deal of time and effort into your analysis.
Regards,
Machiavelli
I logged out of Google.com in Firefox and Chrome.
Closed all windows and unpinned all tabs, went to google.com and closed the browsers.
Fired them back up and Chrome is using 48M while Firefox is using 85M.
The real benchmark is what the memory footprint will look like after real use and leaving it up for a while.
It’s the features added by extensions that keep me faithful to Firefox. I don’t have the solution, but it’s a real problem. With every upgrade, one or more addons that I use are disabled, which makes the new, improved version of Firefox feel like a step backward.
I’ve never had any problem, except for the Australian dictionary; I simply used the GB dictionary instead. In addition, for one update, one of my addons wasn’t available for a week or so.
The significant majority of the addons hosted on addons.mozilla.org should work after an update, since Mozilla automatically verifies them for compatibility.
Just let them update.
It’s nice that it Works For You, but that’s no answer to the problem of what to do when it doesn’t work.
My experience is like the grandparent: Every FF update breaks at least one extension that I use daily. It’s a good thing that I can control when I update so that I can delay upgrading until all my extensions have been updated. Oh, wait! Apparently FF updates never break anything, so Mozilla is going to make it all transparent, automatic and impossible to opt out of. Well that’s good news; it’s not as if I ever have to worry about testing my web sites on older versions of the engine… oh, wait.
You get the idea.
Addons are actually compatible 99% of the time. Addons hosted at Mozilla’s site, addons.mozilla.org, are automatically tested and updated by Mozilla. One would be quite unlucky to have one of the 1% of addons (from addons.mozilla.org) that was not automatically compatible after an update.
So the “problem” is this … the majority of addons, especially with Windows users, don’t actually come from addons.mozilla.org. Up to 75% of addons in use are outside of Mozilla’s direct influence; Mozilla can’t do anything about them. Currently Firefox takes the approach of extreme caution and disables these “external to Mozilla” addons; it doesn’t just assume they will still work with a new version of the browser unless the addons specifically indicate that they will. People have to update these “external to Mozilla” addons separately.
Instead of doing that, people just jump on Internet forums and complain that “Firefox update broke my addon”. Sigh!
OK, so what to do about it. There are basically three options:
1. First and foremost, go to the source (external to addons.mozilla.org) of the addon which has been disabled after a Firefox update, and see if there is a corresponding update for the addon. If there is, download it and install it (by opening the .xpi file in Firefox). Sweet.
2. If the source of the addon hasn’t updated it, it is still quite likely that the old version of the addon will still work, and that Firefox has been overly cautious in marking it as disabled. To override this, install an addon from addons.mozilla.org called the “Addon Compatibility Reporter”. Within this addon is the ability to enable older addons that Firefox has disabled after an update. So use the Addon Compatibility Reporter to mark the old disabled addon as enabled once again, and try it. In the vast majority of cases (perhaps up to 99% of cases), the old addon will still work with the new Firefox version.
3. If the old addon doesn’t work, and the original source of the old addon won’t be updating it, then have a look around. There are literally many thousands of addons available for Firefox. It is quite likely that with a bit of effort you can find a suitable replacement.
Hope this helps.
Mozilla can do a lot about them; it doesn’t have to control them in order not to break them.
Which is a problem with, and a bug in, Firefox.
No kidding. Of course, the third party addons are not necessarily updated as soon as the new release of Firefox is out… which means that people who use them definitely don’t want automatic updates. But, it’s OK! 99% of a subset of 25% of addons won’t break, so automatic updates are OK.
In fact, Firefox did break my addon. The facts are these:
1) Firefox refuses to run addons which don’t specifically advertise themselves as compatible by explicit version range.
2) Mozilla does not provide a mechanism for addon developers to specify or check for features on which they rely.
3) Mozilla does not provide any guarantees of API stability between major versions.
4) All versions are now major versions
Why is there no message like this after an upgrade?
Firefox doesn’t do this; that’s Firefox’s fault, not the extension author’s.
Why doesn’t Firefox provide an “or newer” version-check option? “My extension works with 4.0 or newer” – meaning newer Firefox versions assume it’s compatible. Yes, in theory it could break on some unknown future version, which is why it would be nice if you could somehow limit how long “or newer” is assumed to hold (say, by claiming “4.x through 5.x” and then having Mozilla only bump the major version number once a year), but apparently that’s just not cool enough. An extension could break in the very next release even though it claims “or newer” – it’s impossible to know for sure, but the consequences of this are far less horrendous than *definitely* breaking it by disabling it.
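For context, the compatibility range an extension declares lives in its install.rdf manifest. A hypothetical manifest claiming the kind of open-ended range described above might look like the sketch below – the em:maxVersion wildcard (“7.*”) and the Firefox application id are real Firefox conventions, but the extension id and version here are made up:

```xml
<?xml version="1.0"?>
<RDF xmlns="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
     xmlns:em="http://www.mozilla.org/2004/em-rdf#">
  <Description about="urn:mozilla:install-manifest">
    <em:id>myextension@example.com</em:id>
    <em:version>1.0</em:version>
    <em:targetApplication>
      <Description>
        <!-- Firefox's application GUID -->
        <em:id>{ec8030f7-c20a-464f-9b0e-13a3a9e97384}</em:id>
        <em:minVersion>4.0</em:minVersion>
        <!-- "7.*" covers all 7.x point releases; Firefox 8 would
             still disable the addon unless this line is bumped. -->
        <em:maxVersion>7.*</em:maxVersion>
      </Description>
    </em:targetApplication>
  </Description>
</RDF>
```

It is exactly this maxVersion line that authors have to keep bumping every six weeks, which is the mechanical root of the complaints in this thread.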
Firefox could provide a simple mechanism to force-enable extensions which were disabled due to failing the version compatibility check. No such mechanism is provided by default (ridiculous, convoluted steps are required to make this work). A crash handler which knows about force-enabled extensions and knows to re-disable those first when a crash occurs would also be nice.
Stop treating users like children. Most third-party addons that I use provide a repository, just like Mozilla does. I already check those which don’t. The problem isn’t “the user is too dumb to check for updates and blames Firefox for it.” The problem is that the addon doesn’t have a newer version, and the user can’t opt out of upgrades, and the user doesn’t necessarily know how to extract an XPI and bump the claimed compatibility number himself, and the user doesn’t have any reasonable GUI way to do this.
A thing for which I blame Firefox (and Mozilla), hence “Firefox broke my addon.”
What earthly reason is there for not providing this addon by default, or (since it’s overkill for most users) some subset of its features? See my various rants above.
If this is the case why was the decision ever made to default to assuming that it won’t work? Why wasn’t that decision reversed at least at the time of the release of Firefox 4.0? It could have been changed, it wasn’t, and I blame Mozilla.
There are thousands covering common uses but very few covering niches. Actual cases where the addons aren’t compatible and aren’t being updated are so rare that I don’t personally care, but it’s a case that justifies not auto-updating Firefox for some users.
Telling me things I already know and trotting out tired excuses are not things that are likely to be of any help (and weren’t).
sorpigal,
I totally feel your frustration. Ever since FF3, this has been an ongoing issue – I lost the use of several of my favorite developer addons. Of course at that time, there may have been a genuine incompatibility, rather than an incompatibility caused by mismatched version numbers.
Some might blame the authors for failing to update version numbers in app metadata for each new FF release, but like you, I feel this was a fundamental software engineering problem which FF introduced itself.
Just imagine if other architectures chose to use version numbers for forward compatibility testing instead of checking for compatible interfaces: Java apps which run only on specific JVM versions. Windows apps which can’t run on newer Windows. Linux binaries which break with every new kernel release.
Hopefully FF10 will finally fix this, but it shouldn’t have taken so long.
“What earthly reason is there for not providing this addon by default, or (since it’s overkill for most users) some subset of its features? See my various rants above.”
As far as I can tell, “Addon Compatibility Reporter” does not enable disabled addons; it merely enables you to report information back to Mozilla. However, I discovered that “Nightly Tester Tools” does enable a forced compatibility override. If you try it, let us know if it works on versions prior to FF10.
Got fed up with Firefox because of this, and recently switched browsers to Lunascape. It’s switchable on the fly between 3 engines: Firefox’s Gecko, Chrome’s WebKit, and IE’s Trident. It also uses Firefox extensions with no problems, and doesn’t break compatibility every week. Also very configurable.
Too bad there isn’t a Linux version…
Yeah, I’m with you on that one. Using Midori in Linux at the moment.
Firefox’s frequent release schedule is great for bloggers. If you can’t think of anything interesting to write about there is always a new firefox release to highlight!
I don’t think so. On Windows 7 32-bit, with only one tab open (this very one I’m using to write this), Firefox is taking 140 MB of memory. This is with add-ons disabled. It’s ridiculous and, while Firefox will probably remain my primary browser until I can get my add-ons on other browsers (Chrome is the closest), Chrome’s memory usage with add-ons enabled generally holds at about 30 MB with one tab open. Mozilla’s got a long way to go before they can start claiming they’ve drastically reduced memory consumption. Linux or Mac users, what have your experiences been with FF7 in this area?
187MB, with only OSNews open. Linux 2.6.32.34 x86.
Here, Firefox 4 uses 111 MB on Linux 2.6.35.14 x64, with two tabs open and the French and GB dictionaries as the only extensions. I do not count Flash’s memory usage, which is conveniently attributed to other processes.
I couldn’t update to 5 and 6 because they consider the latest stable Linux Flash Player (10.3.183.10) to be outdated and automatically disable it, without a clear way of getting it back. Guess it will be the same with 7…
Also, I wonder if those talking about Chrome’s memory usage take into account all processes.
http://venturebeat.com/2011/09/27/firefox-memory/
Open a few tabs and leave them open, and see what happens.
http://lifehacker.com/5844150/browser-speed-tests-firefox-7-chrome-…
I typically have many, many tabs open. Firefox 7 (although I am on 8 now) uses drastically less memory than Chrome for the same number of tabs.
The use case of “one tab open” is really not interesting to me. Is it really to others? I guess I do not really know.
Important fix for FF 7: http://bzratfink.wordpress.com/2011/09/27/getting-the-http-back-in-…
http:// being gone is really no big deal. Many people don’t know what it means, and it cleans up the interface for them.
Also. https://, ftp://, and whatever else still show.
If it wasn’t for all the CA and SSL/TLS protocol problems I would say: everything should be on https://
Lennie,
“If it wasn’t for all the CA and SSL/TLS protocol problems I would say: everything should be on https://“
Yea, the web should probably default to low-grade encryption, which is extremely fast but deters casual snooping, censorship firewalls, and Phorm-like ISP monitoring.
However, there is one major technical obstacle with SSL that would make this impossible today. HTTPS is not compatible with shared hosting. There are proposed solutions, but they are hacks which leak information and are not supported by today’s browsers.
Standard SSL is IP/port based and is not aware of the underlying HTTP protocol, which leads to a chicken/egg problem: HTTPS needs to transfer the certificate before knowing which domain the client is trying to reach. Therefore, as is, all HTTPS websites would need dedicated IP addresses.
This is a stupid limitation; however, I suspect it’s due to the fact that SSL was invented a year or so before HTTP/1.1, when all websites needed a dedicated IP address anyway.
The internet is shrouded in legacy designs which dictate how things must be engineered today to work around them. I do wonder if we’ll ever get the opportunity to make a clean break?
Low-grade encryption? I don’t think I’ve ever seen anyone think that is a good idea. Most people think it just gives a false sense of security.
Could you explain what is wrong with Server Name Indication (SNI) for HTTPS? Do you think it is wrong to send the website name in the clear?
If 1 IP-address == 1 HTTPS website then the website people are visiting is easily identified anyway.
It is supported by all clients except IE and Safari on Windows XP; also, the Android developers messed up and didn’t add it to Android 2.
Everything else (clients and servers) has already supported it for a couple of years. And yes, it will take another few years before Windows XP is dead, but by that time Android 2 will also be dead.
IPv6 could be widely deployed by then, and IPv6 addresses for servers shouldn’t be so hard to find. 😉
Lennie,
“Low-grade encryption? I don’t think I’ve ever seen anyone think that is a good idea. Most people think it just gives a false sense of security.”
Well, I’m just thinking that a simple encryption would prevent most instances of trivial HTTP data snooping/injection/etc. (I’m thinking Phorm or coffee shop/hotel script kiddies). It would be intended to replace clear-text HTTP, not to replace HTTPS. However, you are right that there would be a risk of it being misused.
Unfortunately, CPU overhead is often cited as a reason we cannot use HTTPS for everything. CPUs are a lot faster than they used to be, but now we’re downloading hundreds of megabytes in a few minutes, with many more users per host. Strong encryption is expensive, and for the bulk of HTTP traffic (like OSNews), it’s overkill.
“Could you explain what is wrong with Server Name Indication (SNI) for HTTPS?”
I had read that browser support for it is sporadic. That no IE versions on XP support it is fairly major. Also, I don’t think most web providers support it either (even if apache does).
“Do you think it is wrong to send the website name in the clear?”
I have a tough time making up my mind on this one. Yea, it is a shortcoming since it enables things like censorship through deep packet inspection. But then the only way to hide the destination is for the certificate to be transferred out of band (DNS comes up again). I think ideally, we shouldn’t leak anything other than the ip/port.
(Just thinking off the cuff, but an alternative idea would be to request a generic certificate up front, and then switch HTTPS certificates mid-stream secretly.)
“Could be IPv6 is widely deployed by then and IPv6-addresses for servers shouldn’t be so hard to find. ;-)”
I think the IPv6 transition will be extremely painful for those of us who still enjoy peer to peer connectivity.
Encryption without knowing who you are talking to is pretty much useless. It is like a self-signed certificate on a random website signed by someone you don’t know.
This is because a man-in-the-middle attack (which can easily be done on most LANs or WLANs) is possible: he/she can present you with another (even similar-looking) self-signed certificate. It can even be generated on the fly.
Setting up a new HTTPS connection is the slow part. CPU definitely hasn’t been a problem in a long time. On a busy server it is more likely that there isn’t enough ‘random data’ to be able to generate new sessions. So large websites, or websites under a (D)DoS attack, might need special hardware for that.
Currently we leak the IP/port, and it points directly to that one site too. SNI is no less secure in that sense; more secure would have been nice, but that is not backward-compatible – SNI is.
Lennie,
“Encryption without knowing who you are talking to is pretty much useless. It is like a self-signed certificate on a random website signed by someone you don’t know.”
One nitpick with your statement about anonymous encryption being useless: Unencrypted IP traffic can be altered by a remote attacker who, while blind to your outgoing traffic, can anticipate your requests and spoof the server’s response before it responds. Anyone with sufficiently low latency and egress filters disabled can pull it off, but only on insecure channels.
Anyways, this wasn’t my point. You had said all web traffic should be encrypted, and I was just replying that low-grade encryption would be sufficient for most of the traffic that is sent in the clear today anyway. Military-grade encryption is simply overkill. (Did you disagree with this assessment?)
“Setting up a new HTTPS connection is the slow part. CPU definitely hasn’t been a problem in a long time. On a busy server it is more likely there isn’t enough ‘random data’ to able to generate new sessions…”
But that’s only the case for dedicated hosting. Many/most websites run on shared/virtual hosts, which are notoriously oversubscribed to deliberately saturate the CPU. Under a strong encryption-everywhere scheme, we’re likely going to need more horsepower.
“…But it is not backwardcompatible, SNI is.”
If we isolate all the suboptimal software engineering solutions we see today, there’s a good chance backwards compatibility is the reason behind them. I sometimes daydream about what we might change if we could start from scratch – what a hippy I am.
Let’s start at the beginning. With every change people want to make in software, you have to ask: what are you trying to solve?
Do you want this ‘low-grade encryption’ because you are trying to solve the problem that connecting to an HTTPS site is slower than connecting to an HTTP site?
I think your proposal of ‘low grade encryption’ does not solve that. Current CPUs don’t have any problems handling encryption.
The killer for connection speed is the back-and-forth (and waiting) in the SSL/TLS protocol, which is needed to authenticate ‘who you are talking to’.
Why is waiting a problem? Because of latency. It takes time for a packet to travel across the network from, for example, Europe to the US and back. It does not matter how much bandwidth you have. The best we currently have is fibre: http://en.wikipedia.org/wiki/Latency_%28engineering%29#Fibr…
A similar problem exists in TCP which is used by HTTP and HTTPS.
Here is a good explanation of that problem:
http://mike.bailey.net.au/2010/07/tcp-and-the-lower-bound-of-web-pe…
That is why SPDY works so well, because it multiplexes the different requests over the same TCP-connection.
Here you can see how many messages it takes – with each message, the other side is waiting for an answer:
http://en.wikipedia.org/wiki/Transport_Layer_Security#Simple_TLS_ha…
And there really isn’t an easy way to defeat that: the algorithms involved are many, many of them require it, and we need to be able to replace them if it turns out some algorithm isn’t secure anymore.
The best connection-speed improvement so far is the False Start that I mentioned before.
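To put the round trips above in concrete terms, here is a back-of-envelope sketch. The 100 ms transatlantic RTT is an illustrative number, not a measurement; the round-trip counts are the classic figures for TCP plus a full (pre-False-Start) TLS handshake.

```python
# Back-of-envelope: time before the first byte of the HTTP response
# arrives, ignoring server processing time.
rtt_ms = 100

tcp_rtts = 1    # SYN / SYN-ACK (the final ACK piggybacks on the first data)
tls_rtts = 2    # full TLS handshake: two request/reply exchanges
http_rtts = 1   # the GET request and its response

http_first_byte = (tcp_rtts + http_rtts) * rtt_ms
https_first_byte = (tcp_rtts + tls_rtts + http_rtts) * rtt_ms

print(http_first_byte)   # 200
print(https_first_byte)  # 400
```

Doubling the time-to-first-byte regardless of bandwidth is exactly why the handshake chatter, not CPU cost, dominates HTTPS connection setup.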
If you want to get rid of some of the authentication – the ‘who am I talking to’ – you are talking about something like:
http://en.wikipedia.org/wiki/Opportunistic_encryption
But most of those protocols don’t speed up anything either, because: 1. they try to stay compatible with existing technology; 2. they still want to “negotiate” a shared secret, otherwise it isn’t secure. And this means you need a back-and-forth (and thus waiting for a reply).
As I’m not a cryptographer, there is really one currently used algorithm which I think is the easiest to understand:
http://en.wikipedia.org/wiki/Diffie_Hellman
As you can see from the diagram alone, it depends on getting a reply before the other party can send a new message.
But if that were the only problem, life would be too easy.
If these kinds of issues interest you, there is more! 😉 The problem is, you are not the only person on a single network, are you? So other infrastructure is needed: buffers. The problems start again if people add too much of it:
http://www.youtube.com/watch?v=qbIozKVz73g
If you want fast encryption everywhere, the Chrome developers seem to be very much interested in that.
They created SPDY, which is faster than regular HTTP for 98%+ (or something) of the webpages: http://www.chromium.org/spdy
The problem is, it needs a server change. And currently only one browser supports it. Some might still consider it beta.
It seems Amazon might be using SPDY for Amazon Silk: http://aws.amazon.com/amazonsilk-jobs/
Google also tried many ways to make SSL/TLS faster, but so far the only one they found that was compatible and helped enough is FalseStart:
http://blog.chromium.org/2011/05/ssl-falsestart-performance-results…
Another way to speed up delivery of secure content is a Microsoft research project:
http://research.microsoft.com/apps/pubs/default.aspx?id=148963
Which allows for mixing of HTTPS and HTTP, where HTTP is used to download images/JavaScript/stylesheets from a Content Delivery Network (CDN), but those parts are signed with the same certificate as is used by the website which uses HTTPS.
The Microsoft research paper uses the mechanism from Mozilla (https://wiki.mozilla.org/Security/CSP) to signal which resources are allowed to be loaded over HTTP and should be signed by the main site.
I’ve been using Nightly for some time now, keeping the browser open for days, and memory usage stays low. Very stable, and the only thing I really miss is silent updates.
Almost all extensions I use (around 10) are compatible.
Good job Mozilla.
I’m with a lot of users who think these inflated version numbers are absurd. In a few years, we’ll be testing Firefox 400 and wondering, WTF!?!? I admire other FOSS projects like Inkscape, whose community is conservative with version numbers because they realize that the code simply isn’t mature enough to warrant boundless point releases. It’s a mistake to believe that increasing a version number for the sake of version numbers is wise. All it accomplishes is leading users down a disappointing path where they quickly lose faith in the project when expectations don’t coincide with reality.
Chrome has a far higher version number than Firefox, and a more rapid release cycle.
Mozilla OTOH has proposed: Extended Support Release.
http://news.cnet.com/8301-30685_3-20109245-264/mozilla-proposes-not…
This proposal addresses your faux-issue whilst still allowing a rapid development cycle.
Enjoy.
A long-term, sorry, extended support release should have at least a two-year lifespan.
Ok, I know that 42 is the answer to life etc but obviously someone at Mozilla thinks that life is too short. And, by a long way.
Here is the Extended Support Release (ESR) proposal wiki:
https://wiki.mozilla.org/Enterprise/Firefox/ExtendedSupport:Proposal
Please contribute your undoubtedly invaluable ideas on how Firefox can be more responsive to the changing web than IE, yet less volatile than Chrome, in order to satisfy everybody.
Just quietly, it looks to me as though the wiki, so far, has a far better proposal than you do.
Firefox 7 apparently uses Direct2D on Vista and Win7. Anybody do any comparisons with Firefox 6?
Graph here:
http://www.ghacks.net/2011/08/16/firefox-7-may-ship-with-new-graphi…
Discussion here:
http://www.basschouten.com/blog1.php
Earlier technical discussion here:
http://blog.mozilla.com/joe/2011/04/26/introducing-the-azure-projec…
Wow, such a shame that Linux users can’t get these perf gains…
For Firefox 7, the new Azure graphics layer works only for Canvas elements, and only on Windows 7 and Vista. Mozilla have written the Direct2D backend for Azure first, and they are still working on the OpenGL and Direct3D backends. This is perfectly reasonable, given the number of Windows users of Firefox relative to the number of Linux users.
This is somewhat similar to the situation that occurred with hardware acceleration, wherein the Windows version had hardware acceleration in Firefox 4 but this did not appear in the Linux version until Firefox 6.
Hence the beauty of the Firefox rapid release cycle: one can expect the Linux version to gain Azure graphics improvements via an OpenGL backend in a few months’ time.
Thanks for your concern, but it really wasn’t necessary.
Firefox devs made the right choice. Vista/Windows 7 is by far their largest segment of market share, and Direct2D is ready now, whereas an implementation built on OpenGL/Linux will be an ongoing work in progress….
Not to worry, I can wait 12 weeks for it to appear in Linux versions of Firefox. With a bit of luck I might have to wait only 6 weeks.
Alas, the perf won’t be as good.
Yeah, shame we can’t show a fishbowl with 1500 fish in it smoothly….
It seems to be mostly useful for really intense HTML5 canvas games, of which there are… not many. By the time there are, the OpenGL support will probably be done.
I really don’t use FF much anymore, but I did a test (12 tabs) and the new version dropped from 350MB to 250MB memory usage. Not too shabby!
I’d also like to note that my Chrome is using 150MB, but also has a 36MB render process for each tab I have open (18, none with Flash) and six worker processes using 20MB each. Total: 918MB!
This is not a scientific comparison by any means, since the tabs are not the same in each browser, but they are all just non-Flash web pages. Even with all that memory being used, Chrome’s UI still responds faster than Firefox’s.
I’m running OS X 10.6
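As a quick sanity check of the Chrome total quoted above (all figures taken from the comment, not measured independently):

```python
# Per-process memory figures as reported in the comment above.
main_process_mb = 150   # Chrome's main browser process
render_mb = 18 * 36     # 18 tabs, ~36 MB render process each
workers_mb = 6 * 20     # 6 worker processes, ~20 MB each

total_mb = main_process_mb + render_mb + workers_mb
print(total_mb)  # 918, matching the "Total: 918MB!" figure
```

So the arithmetic in the comment does check out, for whatever an eyeballed process-list tally is worth.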
Yeah, that’s because Chrome uses separate processes for everything: more memory, but more responsive.
Firefox still uses threads (except for plugins): less memory, but harder to get as responsive. That being said, it’s rare that the UI slows down.
http://lifehacker.com/5844150/browser-speed-tests-firefox-7-chrome-…
“Obviously, there’s a lot more to browser choice than speed—variety of extensions, customizability, and so on—but when it comes to performance, here’s how our favorite browsers stack up. We gave each contender a point value for its placing in each of the above tests, then tallied up the totals and divided them by the total number of points each could have received.
Opera 11.51: 82%
Firefox 7: 73%
Internet Explorer 9: 47.5%
Chrome 14: 43%
Opera’s still the speed champion, as usual. The other browsers have surprisingly switched places, though: Firefox, so often looked down upon for its sluggish speeds, has jumped up to #2 with version 7, and Chrome has slowly worked its way down to last place. Granted, Chrome’s inability to complete the CSS test probably pushed it over the edge from 3rd place into 4th. Still, we have seen Chrome slowly gain bloat with each new version, and boy did it feel slow when we ran these tests. Hopefully they can take a hint from Firefox and make a comeback soon.”
Actually Mozilla developers are working on responsiveness:
https://blog.mozilla.com/tglek/2011/05/13/firefox-telemetry/
http://blog.mozilla.com/ted/2011/06/27/measuring-ui-responsiveness/
http://autonome.wordpress.com/2011/02/21/tracking-firefox-ui-respon…
You too can help make Firefox faster:
https://blog.mozilla.com/tglek/2011/09/20/firefox-7-telemetry-faster…
I’ve got 16 tabs open ATM and I’m getting 370MB, and that’s with a few addons, too. That’s a lot less than it previously was; I think it used to hover at around 600-700 megabytes.
So, memory-usage-wise this update at least was good.
And it’s a big improvement over 6.
I’m getting 15% less memory at launch, and 30% less after some time goes by.
More importantly, it stays much more responsive over time. It’s still not quite at Chrome-like levels of responsiveness, but it’s getting much closer.
Oh, and to all the people complaining about version numbers: just go away and use another browser if you hate it so much. The whole point Mozilla is making is that you shouldn’t care what the version is, just like with Chrome. They are working on making it update silently; that just hasn’t shipped yet.
I, for one, am very, very happy that these improvements that have come out over the last few months are available. Otherwise, we would all still be stuck on Firefox 4 for the next 12 months, and in comparison with the current version that would SUCK.
Today I accepted the upgrade to Thunderbird 7, and nothing visible has changed.
However when it comes to Firefox I am still using 3.6.20.
I hate the frequent, meaningless interface changes and my add-ons being broken.
You can set the interface back to the way it was for 3.6.x; it isn’t even difficult and takes only a few clicks.
You can use the addon compatibility reporter to enable addons which have not updated their compatibility information. Most of them will work. Most of them will have a new version available anyway, you just have to let them update.
You are missing out on a great deal by not changing to Firefox 7. Firefox 7 beats the socks off Firefox 3.6.20.
With v6, almost ALL the Ajax in the web site of the company where I work (which uses jQuery 1.3.2/1.4.3 for script compatibility) stopped working, or worked the first time and then stopped. At first it seemed to be Firebug, because things appeared to work fine without it, so I upgraded it to a beta version, but things were still not working. Solution: I had to downgrade to v3.6 in order to continue developing. A colleague of mine is still using v4, and I see that sometimes some styles break on a WordPress theme (sidebar titles get bigger and misplaced).
I stopped using Firefox for development entirely. Using Chrome at the moment and obviously checking that stuff works in IE, Firefox, Safari as I go along … But none of them are that good IMO.
Chrome is the best … but I am worried I will end up “coding for webkit” rather than coding cross browser.
They should just add “1.” at the beginning of the version number, then everyone worried about version creep can settle down. Of course that means they should also deploy the update as an update and not a full version.
I dislike the removal of “http://”. Now when I leave, come back, look at the address bar and see a bare hostname, I have to think: hmmm, what am I looking at? Let’s see, this is Firefox; oh right, it only hides parts of the URL for http traffic, and I don’t see http, so it must be http (not https or something else). It’s like only hiding certain file extensions. I beg for consistent behavior.
Starting with Firefox 8.
The extensions will start being treated as compatible by default, with certain one blacklisted if they are known to cause problems.
FF8 will bring a way to manage the addons installed outside the browser as well, like those Java addons.
Eventually the goal is to get all addons using the new JetPack API which will remain more stable, and allow for restartless installations.
smitty,
“Starting with Firefox 8. The extensions will start being treated as compatible by default, with certain one blacklisted if they are known to cause problems.”
I’m running FF 9 Nightly alpha builds, and almost every single update breaks some addons. Just 5 minutes ago I was prompted to update, but warned that it would break some of my addons.
http://www.screenshots.cc/photos/medium/50180-a1w8g.jpg
http://www.screenshots.cc/photos/medium/50179-nhsgj.jpg
If what you say is true, I’m curious why they haven’t applied this fix to FF9 Alpha yet. Mozilla hosing addons all the time sucks, both as a user and as a developer.
Here is a rational discussion of this:
http://blog.fligtar.com/2011/09/26/add-on-compatibility-progress-pl…
Here is the money quote:
“Compatibility-breaking changes in Firefox have been minimal in the first four 6-week development cycles, with the exception of two larger changes that were reverted because of their larger impact. Jorge has posted detailed guides to changes that affect add-on developers for each of those releases (5 & 6, 7, 8), and Firefox developers are often coming to us to discuss the impact of their changes on add-ons.
We’ve automatically bumped thousands of add-ons for each Aurora version and emailed developers with the results of our compatibility scanning. When Firefox 6 launched, 97% of add-ons compatible with Firefox 5 were still compatible with 6. And we’re on track to launch Firefox 7 tomorrow with 99% compatibility from 6.”
99% compatibility. Thousands of add-ons automatically bumped.
I think that says it all with respect to this issue.
It seems there is some kind of Internet meme agenda to try to make a mountain out of this Firefox extension compatibility molehill.
To be fair, most of the issues with incompatibilities are with addons not hosted by Mozilla. Those 97-99% rates go way down in the real world.
Of course, extensions like the ones Java, Microsoft, and others add usually aren’t very useful anyway.
You have no support for this claim.
To be fair, “Compatibility-breaking changes in Firefox have been minimal in the first four 6-week development cycles”.
To be fair, 99% of a large sample set of typical add-ons are found to be compatible (and marked as such via a simple automated process).
To be fair, all that is normally required to make an add-on which is not hosted by Mozilla compatible is to mark it as such, by updating the compatibility data contained within the add-on.
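For the curious, that compatibility data lives in the add-on’s install.rdf manifest, and bumping compatibility is usually just a matter of raising em:maxVersion. A hypothetical fragment (not from any real add-on) might look like this:

```xml
<!-- Fragment of an add-on's install.rdf (hypothetical example) -->
<em:targetApplication>
  <Description>
    <!-- Firefox's application ID -->
    <em:id>{ec8030f7-c20a-464f-9b0e-13a3a9e97384}</em:id>
    <em:minVersion>4.0</em:minVersion>
    <!-- Raising this is typically all that "marking compatible" means -->
    <em:maxVersion>7.*</em:maxVersion>
  </Description>
</em:targetApplication>
```

This is why the automatic “bumping” Mozilla does for AMO-hosted add-ons is such a cheap operation: for most add-ons no code changes are involved at all.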
You put my words in quotes like you were taking them exactly, but that’s not what I said at all.
I said, they were going to start addressing the addon issue starting with FF8.
Then, separately, I said there was a plan to make addons compatible by default. Although I’ll admit I worded it a bit unclearly. I don’t know when that is supposed to take effect. Perhaps FF10, which is due to go into Aurora any time now.
The automatic compatibility also won’t apply to any binary extensions, but I think the vast majority would be included.
I’m sorry, I was too lazy there. I wasn’t commenting on your words, I was making a comment on the claim that “almost every single update breaks some addons”.
This wasn’t your claim.
While it may or may not be true, the way it was worded gives an utterly false impression of the scope of the problem. To get a real handle on the scope of the problem, a more representative statement is that “interface changes are minimal, and 99% of add-ons are compatible without requiring any change”.
That was my point. I should have removed the words quoted which came from you, because indeed your words had nothing to do with it.
I was actually replying to Alfman and not you – unless you are the same person?
I think we actually agree on most things, as I was getting my info from that same link you posted earlier.
What I was specifically referring to was this:
So while the CURRENT system of auto-updating all Mozilla hosted addons may be hitting a 97% success rate, lots of Google Toolbar extensions are not included in that. It’s true that they generally just have to bump the version number to fix it – but that is something these 3rd parties are not doing in a timely fashion. Therefore, in the future the system will change to always assume the addon will work which should fix this problem.
Exactly so.
The only thing I would add to your erudite summary is that, since 99% of the 25% of extensions which are active on AMO are compatible between Firefox versions, this is easily a large enough sample to infer that roughly 99% of the remaining 75% of extensions which are NOT hosted on AMO should likewise be compatible.
I wholeheartedly agree that essentially “they generally just have to bump the version number to fix it”.
Where we perhaps may differ is that I would contend that this is not an issue for Firefox or Mozilla, it is rather an issue for users and providers of extensions for Firefox outside of Mozilla. Users should in the first instance get the addon compatibility reporter and mark their recalcitrant non-AMO extension as compatible. In 99% of cases that alone will do the trick.
Providers of extensions who want to make a good impression with their users should take that teeny, tiny bit of effort required and mark their extension as compatible prior to a new version of Firefox becoming available. I would be surprised if most reputable providers of extensions did not in fact do this.
Finally, perhaps what would be the most help of all would be for users to understand that if they get an extension outside of AMO, then it won’t update by itself. The user will have to get a new copy of the extension from wherever they got it in the first place.
smitty,
“So while the CURRENT system of auto-updating all Mozilla hosted addons may be hitting a 97% success rate, lots of Google Toolbar extensions are not included in that.”
I think that success rate is only true of release versions. With the alpha versions of FF, I know firsthand that the problem of disabled plugins has been much worse. However, as you indicate, this is because addon developers aren’t keeping the version numbers in sync with Mozilla’s alpha builds, not because a genuine incompatibility exists.
“Therefore, in the future the system will change to always assume the addon will work which should fix this problem.”
I hope so, plugins should have never been tethered to FF version numbers in the first place. As long as the API is compatible, the plugin should load.
Addon developers who host with Mozilla no longer even need to update the version numbers. Mozilla is doing that automatically, then testing, and reverting back on the couple of percent that are broken.
I’m sure they don’t do that for the Alphas, though, so you aren’t seeing the benefit.
I’m sure that this is a source of confusion for people.
Almost all of the addons hosted at addons.mozilla.org (AMO) are automatically compatible. Users who get their addons from there will not see a problem. The addons will automatically update on the first run of an updated Firefox.
Users who accept or install “addons from all over” will see at least some of their addons disabled after an update of Firefox. Many will simply assume that the Firefox update has “broken” their addons, without realising that if they update Firefox they have to update their non-AMO addons also. Manually.
Mozilla has no control over “addons from all over”. Mozilla can’t check an addon for compatibility if it is not known to Mozilla. Firefox has to assume that an addon which says it is compatible only with an earlier version of Firefox is not compatible with the newer version, even though it very probably is compatible.
lemur2,
“User who accept or install ‘addons from all over’ will see at least some of their addons disabled after an update of Firefox. Many will simply assume that the Firefox update has ‘broken’ their addons, without realising that if they update Firefox they have to update their non-AMO addons also. Manually.”
Well here’s the thing, all the addons I’ve ever installed were managed automatically by mozilla. The FF updates do regularly disable them. This may have something to do with the fact that I’m using an alpha build. I want to be absolutely clear that I am not using manually installed plugins through a third party, these are managed by mozilla themselves.
I switched to the alpha releases about 5 months ago to get the fixed memory leaks and never turned back, but this situation with plugins being disabled is more than a little annoying.
This is the situation according to Mozilla:
http://blog.fligtar.com/2011/09/26/add-on-compatibility-progress-pl…
I’ll quote some key points, and then comment.
“We’ve automatically bumped thousands of add-ons for each Aurora version and emailed developers with the results of our compatibility scanning. When Firefox 6 launched, 97% of add-ons compatible with Firefox 5 were still compatible with 6. And we’re on track to launch Firefox 7 tomorrow with 99% compatibility from 6.”
They bump (AMO hosted, non-binary) addons for Aurora versions.
“Our compatibility plan has two notable shortcomings: it doesn’t work for add-ons with binary components and it doesn’t work for add-ons not hosted on AMO.”
They don’t bump binaries. They can only automatically bump addons hosted on AMO.
“it was quite eye-opening when I learned that only 25% of the 600 million add-ons in use every day in Firefox 4 and later are active on AMO. That means at least 75% of add-ons aren’t getting the benefits of the automatic compatibility system we created”
The majority of addons in actual use aren’t ones that are hosted on AMO.
“More and more software tries to plop a toolbar into Firefox when you install it, often without asking you. Java Console alone has more than 100 million installations among Firefox users on Windows, and it doesn’t even do anything. While some of these add-ons are keeping up with our release compatibility, many use their own update mechanisms instead of the built-in update service that works with Firefox’s compatibility checking, so in order to get a compatible add-on, you must update the third party software separately.”
Addons which don’t use AMO aren’t updated automatically. They must be updated separately.
“We’ve been working on a plan for fixing add-on compatibility that also takes non-hosted add-ons into account. Instead of working around Firefox’s assumption that add-ons are incompatible between versions, we’re going to teach Firefox to be a bit more trusting of add-on compatibility with new versions.”
In the future Firefox isn’t going to be as cautious, and it won’t any longer just assume binary or non-AMO addons are incompatible.
So …
If you are using AMO-hosted addons, and these are being marked as incompatible (even for Aurora), could it be because they are binary addons?
“In the meantime, if you find that your add-ons aren’t compatible, install the Add-on Compatibility Reporter and try ‘em out — there’s a good chance they work fine.”
If so, why don’t you try the Add-on Compatibility Reporter?
See, I think they are bumping the versions during Aurora development. And he is tracking Aurora, which means that when the channel has him upgrade to the new version they haven’t been bumped yet. Then over the next 6 weeks the addons are bumped and tested. If he moved to the Beta channel he’d get the benefit of having everything bumped before updating.
lemur2,
I’m now using FF10.
I installed addon compatibility reporter, however it just seems to allow me to submit reports of whether the addon works or does not work. I do not see a way to actually enable any of my disabled plugins to test if they work or not. What am I missing?
Ah wait! I installed “Nightly Tester Tools”, which has an option to “Force Addon Compatibility”, and all of a sudden all of my plugins minus one are running again!!! How I’ve missed you, greasemonkey!
Wow, hopefully FF10 doesn’t disappoint, cause this makes me a happy camper. To anyone else following this path, upgrading to FF10 did NOT enable the addon compatibility feature by default (10.0a1 2011-09-28), I had to manually enable it using the Nightly Tester Tools AddOn.
Ah damn, I found a glitch, I cannot install any of the addons which mozilla hasn’t marked as being compatible under FF10, even though they otherwise work through the forced compatibility mode. So guys, keep your old versions around to download new plugins.
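For what it’s worth, the “Force Addon Compatibility” option in Nightly Tester Tools is, as far as I know, just flipping an about:config preference that you can also set by hand. The exact preference name is version-specific, so the version suffix below is only an example:

```js
// In about:config, or in user.js in your profile directory:
user_pref("extensions.checkCompatibility.10.0a", false);   // version-specific form
user_pref("extensions.checkCompatibility.nightly", false); // covers nightly builds
```

Setting it by hand saves installing an extra addon just to un-break the others, though you lose the per-addon reporting that the Compatibility Reporter provides.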
According to https://wiki.mozilla.org/Features/Add-ons/Add-ons_Default_to_Compati…, automatic add-on compatibility is now “ready for resourcing and implementation”, which implies that implementation has not begun yet…
smitty,
“You put my words in quotes like you were taking them exactly, but that’s not what i said at all.”
I did quote you exactly, I added the quote to provide context. If I misinterpreted your post, then I apologize.
“I said, they were going to start addressing the addon issue starting with FF8. Then, separately, I said there was a plan to make addons compatible by default. Although I’ll admit I worded it a bit unclearly. I don’t know when that is supposed to take effect. Perhaps FF10, which is due to go into Aurora any time now.”
I have no idea what you are saying here; what does it mean to start addressing issues in FF8, but not have it take effect until FF10? I honestly don’t know what you mean.
Ok, sorry, let me explain it a little better. I went back and realized my post was garbled a lot more than I had originally thought.
My thoughts started in the title:
And then continued on the next line:
That was one sentence.
Then, in a new paragraph, I said:
That’s not happening in FF8. Probably not in 10 either, but maybe 11. However, it is the current plan for dealing with extensions in the medium term.
Then I made a new paragraph, indicating a new thought:
Mentioning what it would start doing in FF8. This was just to indicate that movement was starting on the whole extensions issue starting in FF8, and would continue in the future in other ways.
And then I finished with another new paragraph mentioning some of the long-term plans that would happen in later releases.
Here’s what the current release plan says.
In Firefox 8, they begin warning the user on update if an add-on is incompatible, allowing them to cancel the update.
Starting from Firefox 10, add-ons will begin to default to “compatible”.
See : https://wiki.mozilla.org/Features/Release_Tracking
Neolander,
“In firefox 8, they begin warning the user on update if an add-on is incompatible, allowing them to cancel the update.”
This is how FF9 works too.
“Starting from firefox 10, add-ons will begin to default to ‘compatible’.”
I’ll probably cross my fingers and make the jump.
Here is a direct link to the new compatibility feature:
https://wiki.mozilla.org/Features/Add-ons/Add-ons_Default_to_Compati…
Firefox 10 indeed : https://wiki.mozilla.org/Features/Add-ons/Add-ons_Default_to_Compati…
(By the way, I strongly recommend having a regular look at https://wiki.mozilla.org/Features/Release_Tracking; it’s nice of Mozilla to present the planned evolution of each release so far in advance.)
I wish Firefox could be built on Windows with a mingw/mingw-i686-w64 compilation (preferably with gtk+3 and gail) to move away from the Windows SDK, and without that precompiled NSPR lib (it should also be compilable from source). CMake or configure could be an option.
That would also be a cool announcement for me. I do not have VS and I do not intend to use it.
The new release is still a memory hog, but less so than the previous 6.x series. It is an improvement.
http://lifehacker.com/5844150/browser-speed-tests-firefox-7-chrome-…
Memory Use (No Extensions)—Winners: Opera and Firefox!
http://cache.gawkerassets.com/assets/images/17/2011/09/memorynoex.p…
Memory Use (Five Extensions)—Winners: Opera and Firefox!
http://cache.gawkerassets.com/assets/images/17/2011/09/memoryfiveex…
(Note that IE9 doesn’t have extensions)
If the new release is a memory hog, then all current popular browsers except possibly Opera are even more of a memory hog.
All the Firefox vs. Chrome dick-measuring is pointless; both of them are amateur-hour toys, given the lack of 64-bit support on Windows. For all of the knee-jerk, anti-MS whining about how far behind IE is, there have been 64-bit versions of IE available since at least IE7. C’mon guys, at least try to keep up LOL!
Until Google or Mozilla manage to reach at least the level that IE was at 3-4 years ago, I’ll stick with the vastly superior IE10, thankyouverymuch.
I think that Chrome 32 bit is faster than any IE version, be it 32 or 64 bits. It has sandbox, lots of addons including adblock, so for many people Chrome is the best browser.
Your information is out of date. Chrome is no longer faster than IE9. Firefox 7 is faster than both Chrome and IE9, and it has vastly more addons.
I actually run Firefox 7 64-bit version under Linux, so I’m not exactly sure what the issue is with the lack of a 64-bit version of Firefox for Windows. Perhaps it is the same issue that makes 64-bit IE9 have abysmal performance.
Since when do people still use Windows, much less run 64bit Windows?
Wow. I’d heard that there are people who spend so much time on slashdot that they start believing their little echo-chamber actually reflects the real world… but this is the first time I’ve actually encountered one.
I have to thank you for the good laugh, though. The size of Windows’ userbase mocked by an advocate of Linux, when even Vista (the least-successful Windows release ever) has almost 6 times as many users? LOL!
IE9’s 64-bit support sucks – they didn’t integrate their new JavaScript engine with it, which means you get performance comparable to Firefox from 5 years ago.
Has IE10 fixed that? I know Mozilla setup some Win64 test machines somewhat recently, so that support may be coming soon.
Firefox on Android uses the most memory and is the slowest Android browser. I hope they can fix that.
It looks like Firefox now passes the Acid3 test. Not that it matters much for real use; I can’t remember any site that looked bad in previous versions either, but it’s still nice that it now passes.
Actually, it was the Acid3 test that changed. The part of the Acid3 test that Firefox used to fail was SVG fonts. SVG fonts are basically broken. Chrome, Safari and Opera paid lip service to them anyway, and implemented just enough of SVG fonts to pass the Acid3 check. Firefox and IE9 didn’t do that.
What Microsoft and Mozilla actually did was come up with a better solution. They collaborated! The result was WOFF.
http://en.wikipedia.org/wiki/Web_Open_Font_Format
“The format has received the backing of many of the main font foundries and has been supported by Firefox since version 3.6, by Google Chrome since version 6.0, and by Opera in version 11.10. Microsoft added full support for WOFF in the third platform preview of Internet Explorer 9, and Presto supports WOFF as of version 2.7.81. WebKit development builds support WOFF, and support for it has been announced on Mac OS X Lion’s Safari.”
Given the widespread support for WOFF and the fact that it is now in the process of becoming a W3C recommendation in its own right, SVG Fonts are effectively redundant. No longer required.
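For reference, using a WOFF font from CSS is just a standard @font-face rule; the font name and file path below are made up for illustration:

```css
/* Hypothetical example: serve a WOFF file and reference it by family name */
@font-face {
  font-family: "ExampleSans";
  src: url("fonts/examplesans.woff") format("woff");
}

body {
  font-family: "ExampleSans", sans-serif; /* falls back if WOFF unsupported */
}
```

This is the same authoring model as any other downloadable font, which is part of why WOFF caught on where SVG fonts did not.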
Since they are basically irrelevant, the people who wrote the Acid3 test removed the checks for SVG fonts.
http://en.wikipedia.org/wiki/Acid3
“According to Mozilla employee Robert O’Callahan, Firefox does not support SVG fonts because Mozilla considers WOFF a superior alternative to SVG fonts. Another Mozilla engineer Boris Zbarsky claims that the subset of the specification implemented in Webkit and Opera gives no benefits to web authors or users over WOFF, and asserts that implementing SVG Fonts fully in a web browser is hard because it was “not designed with integration with HTML in mind”.
“On September 17th, 2011, Ian Hickson announced an update to the Acid3 test. He claims that removing the parts of the test that check the implementation of features likely to be removed or heavily modified in future specifications will allow those specifications to change in the way they should, without regard to what Acid3 tests. As a result, the latest versions of Firefox and Internet Explorer achieved a score of 100/100 on Acid3.”