Linked by David Adams on Wed 3rd Aug 2011 16:50 UTC, submitted by _xmv
The Firefox team has been listening to recent memory complaints, and as a side effect the MemShrink improvements have tested the browser's scalability to the extreme. The results are striking: with a test script opening 150 tabs, a Firefox nightly takes 6 min 14 s on the test system, uses 2 GB of memory, and stays responsive. On the same test, Chrome takes 28 min 55 s and is unusable during loading. An optimized version of the script, written as an attempt to work around Chrome's limitations, improved its loading time to 27 min 58 s while using 5 GB of memory.
RE[6]: Comment by Praxis
by smitty on Thu 4th Aug 2011 00:22 UTC in reply to "RE[5]: Comment by Praxis"

How about this: start with the low-hanging fruit. When someone closes a tab, reclaim the memory then, and you might actually find that many of the complaints regarding Firefox evaporate. The problem has always been that memory is not reclaimed after a tab or window has been closed, but the Firefox developers keep denying that this is the real problem in favour of spending time on trivialities.

Btw, I once again expect Firefox 7 to be a giant disappointment on Mac OS X, just as previous versions have been. Once again, unless you're a Windows user, you're pretty much shit out of luck if you expect something half decent on your platform of choice.

That's exactly what the previously linked blog post was discussing. The issue was that memory became highly fragmented: even when a tab was closed and its memory freed, much of that memory stayed allocated, because one or two objects still in use by the browser UI lived in the same chunks.

FF7 addresses that issue by moving all the browser UI allocations into their own chunks, separate from page content, and the developers saw massive improvements.
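To see why separating UI allocations from page content helps, here's a toy model of the fragmentation problem. It's purely illustrative (chunk sizes, counts, and the allocation pattern are made up, not Firefox's real allocator): a chunk can only be returned to the OS when *everything* in it has been freed, so a few long-lived UI objects scattered among page data pin lots of chunks.

```javascript
// Toy model of heap fragmentation: a "chunk" holds up to CHUNK_SIZE
// allocations and can only be reclaimed when every allocation in it
// has been freed. All numbers here are illustrative assumptions.
const CHUNK_SIZE = 4;

function simulate(separateArenas) {
  const chunks = [];                             // all chunks, every arena
  const current = { mixed: null, ui: null, page: null };

  function alloc(tag, arena) {
    if (!current[arena] || current[arena].length === CHUNK_SIZE) {
      current[arena] = [];                       // start a fresh chunk
      chunks.push(current[arena]);
    }
    current[arena].push(tag);
  }

  // Interleave a few long-lived UI allocations with short-lived page
  // data -- the pattern that pins chunks when they share one arena.
  for (let i = 0; i < 40; i++) {
    const isUI = i % 8 === 0;
    const arena = separateArenas ? (isUI ? "ui" : "page") : "mixed";
    alloc(isUI ? "ui" : "page", arena);
  }

  // "Close the tab": free every page allocation.
  for (const chunk of chunks) {
    for (let i = chunk.length - 1; i >= 0; i--) {
      if (chunk[i] === "page") chunk.splice(i, 1);
    }
  }

  // A chunk is reclaimed only if nothing in it is still live;
  // return how many chunks stay pinned in memory.
  return chunks.filter((c) => c.length > 0).length;
}

console.log(simulate(false)); // shared arena: stray UI objects pin many chunks
console.log(simulate(true));  // separate UI arena: page chunks fully reclaimed
```

With a shared arena, each chunk that happens to contain one UI object survives the tab close; with separate arenas, only the small number of UI-only chunks remain live.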

Of course, that wasn't the only problem, and they are still working through lots of new ones. But perhaps before you talk about how they should start going after low-hanging fruit, you should do the research to find out that's exactly what they are doing.

Also, as far as image decoding goes: I think the new plan is to keep images decoded for 10 seconds, after which the cache gets flushed. Images have to be decoded for things like scrolling, so even a fast decode becomes slow if you have to do it 100 times a second.
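That decode-and-keep-for-a-while idea can be sketched roughly like this. This is not Firefox's actual implementation; the class, the injectable clock, and the 10-second figure are assumptions for illustration of a time-based eviction cache:

```javascript
// Sketch of a decoded-image cache: pay the expensive decode once,
// keep the bitmap around for a grace period, then flush it and
// re-decode on next use. Names and numbers are assumptions.
const TTL_MS = 10_000; // keep decoded images for ~10 seconds

class DecodedImageCache {
  constructor(now = Date.now) {
    this.now = now;            // injectable clock, handy for testing
    this.entries = new Map();  // url -> { bitmap, lastUsed }
    this.decodes = 0;          // how many times we paid the decode cost
  }

  get(url) {
    let entry = this.entries.get(url);
    if (!entry) {
      this.decodes++;          // expensive step: decompress JPEG/PNG
      entry = { bitmap: `decoded(${url})`, lastUsed: this.now() };
      this.entries.set(url, entry);
    }
    entry.lastUsed = this.now();
    return entry.bitmap;
  }

  // Called periodically: drop bitmaps idle for longer than TTL_MS.
  evict() {
    const cutoff = this.now() - TTL_MS;
    for (const [url, entry] of this.entries) {
      if (entry.lastUsed < cutoff) this.entries.delete(url);
    }
  }
}

// During scrolling the same image is requested many times; the cache
// turns 100 lookups into a single decode.
let t = 0;
const cache = new DecodedImageCache(() => t);
for (let i = 0; i < 100; i++) cache.get("photo.jpg"); // 1 decode, 99 hits
t += 11_000;             // 11 s of idle time passes
cache.evict();           // the stale bitmap is flushed
cache.get("photo.jpg");  // decoded again on next use
console.log(cache.decodes); // 2
```

The point of the grace period is exactly the scrolling case above: repeated lookups within the window hit the cached bitmap, while idle images don't keep gigabytes of decoded pixels alive.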

Edited 2011-08-04 00:25 UTC

Reply Parent Score: 4