Linked by Thom Holwerda on Tue 9th Mar 2010 23:40 UTC, submitted by poundsmack
Mozilla & Gecko clones "Mozilla's high-performance TraceMonkey JavaScript engine, which was first introduced in 2008, has lost a lot of its luster as competing browser vendors have stepped up their game to deliver superior performance. Firefox now lags behind Safari, Chrome, and Opera in common JavaScript benchmarks. In an effort to bring Firefox back to the front of the pack, Mozilla is building a new JavaScript engine called JaegerMonkey."
Misleading title
by Fergy on Wed 10th Mar 2010 10:20 UTC
Fergy
Member since:
2006-04-10

The Ars article has a misleading title, which is mentioned in their comments, but OSnews just copies it?
As far as I understand the articles, JaegerMonkey will borrow WebKit's interpreter. Right now, when TraceMonkey doesn't work, it falls back to the Firefox 3.0 interpreter. In the future it will fall back to the WebKit interpreter, which means it will 'slow down' to WebKit speeds ;)

My title would be: Mozilla borrows from WebKit to improve their JS Engine

Edited 2010-03-10 10:23 UTC

Reply Score: 1

RE: Misleading title
by lemur2 on Wed 10th Mar 2010 10:36 UTC in reply to "Misleading title"
lemur2 Member since:
2007-02-17

The Ars article has a misleading title, which is mentioned in their comments, but OSnews just copies it?
As far as I understand the articles, JaegerMonkey will borrow WebKit's interpreter. Right now, when TraceMonkey doesn't work, it falls back to the Firefox 3.0 interpreter. In the future it will fall back to the WebKit interpreter, which means it will 'slow down' to WebKit speeds ;)


WebKit doesn't use an interpreter at all; it uses a compiler. A "just in time" (JIT) compiler, to be accurate.

Because it involves an extra "compile step" (beyond the syntax-parsing step), a compiler is actually slower than an interpreter for code that runs through just once. OTOH, because it emits native code, a JIT compiler is much faster than an interpreter for any code that loops.
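A quick illustration of that trade-off (toy JavaScript, nothing engine-specific): a hot loop executes the same few operations thousands of times, so a one-off compile cost is quickly repaid, while run-once setup code never repays it.

```javascript
// Hot-loop code: the loop body is executed n times, so compiling it to
// native code once pays off on every subsequent iteration.
function sumSquares(n) {
  var total = 0;
  for (var i = 0; i < n; i++) {
    total += i * i; // same operations re-executed each trip through the loop
  }
  return total;
}

// Run-once glue like this single call gains nothing from compilation and
// only pays the extra compile step.
var answer = sumSquares(1000);
```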

Mozilla's current JavaScript accelerator, called TraceMonkey, is an interpreter with optimization via code tracing.

http://blog.mozilla.com/dmandelin/2010/02/26/starting-jagermonkey/
"About 2 months ago, we started work on JägerMonkey, a new “baseline” method JIT compiler for SpiderMonkey (and Firefox). The reason we’re doing this is that TraceMonkey is very fast for code that traces well, but for code that doesn’t trace, we’re stuck with the interpreter, which is not fast."

The concept for JägerMonkey is to use Apple's Nitro Assembler to generate efficient native code (i.e. use the same JIT compiler method as webkit), but also implement the benefits of Tracemonkey where it applies.

"The JägerMonkey method JIT will provide a much better performance baseline, and tracing will continue to speed us up on code where it applies."

The idea is to get the best of both approaches. In other words, this is fine example of the open-source "meritocracy" approach at work.

If it works, of course. If Mozilla runs into snags, it might not work ... in which case the "meritocracy" approach would be to just drop it. In open-source development, there is no point in hanging on to something just because it is your product. A "NIH" attitude doesn't really apply, or at least it shouldn't.

If Mozilla can pull this off, it has the potential to put Firefox back in the lead in the browser speed stakes.

Edited 2010-03-10 10:40 UTC

Reply Score: 4

RE[2]: Misleading title
by strcpy on Wed 10th Mar 2010 10:40 UTC in reply to "RE: Misleading title"
strcpy Member since:
2009-05-20

A "NIH" attitude doesn't really apply, or at least it shouldn't.


Hahaha. The best joke for a long time.

Reply Score: 3

RE[3]: Misleading title
by lemur2 on Wed 10th Mar 2010 10:53 UTC in reply to "RE[2]: Misleading title"
lemur2 Member since:
2007-02-17

"A "NIH" attitude doesn't really apply, or at least it shouldn't.


Hahaha. The best joke for a long time.
"

The joke is on you. Apple's Nitro Assembler, a part of WebKit and clearly not invented at Mozilla, is going to be used as the JIT compiler for Mozilla's JägerMonkey, which in turn will be used in Firefox.

It turns out that the rendering part of WebKit itself (but not Apple's Nitro Assembler) was based on KHTML, which was written by the KDE project, and is therefore a component of Apple's WebKit that was not invented there. It further happens that WebKit is the rendering core of Google Chrome, which is yet another example of not following a NIH approach.

This is all a very good set of examples of open source development not adopting a "NIH" (Not Invented Here) attitude.

This sort of practice of using and re-using the demonstrated-best methods virtually defines the whole "meritocracy" approach.

Edited 2010-03-10 11:00 UTC

Reply Score: 2

RE[4]: Misleading title
by strcpy on Wed 10th Mar 2010 10:57 UTC in reply to "RE[3]: Misleading title"
strcpy Member since:
2009-05-20


This is all a very good example of open source development not following a "NIH" (Not Invented Here) attitude.


You know what, your precious Linux and GNU have always been about NIH. And they remain so.

The same goes for the majority of open-source software. Heck, even I've written many things simply because of NIH. There is nothing that wrong with it, IMO, but it is ridiculous to deny its existence. In a way, that's what FOSS is all about.

Edited 2010-03-10 10:59 UTC

Reply Score: 2

RE[5]: Misleading title
by lemur2 on Wed 10th Mar 2010 11:16 UTC in reply to "RE[4]: Misleading title"
lemur2 Member since:
2007-02-17

"
This is all a very good example of open source development not following a "NIH" (Not Invented Here) attitude.


You know what, your precious Linux and GNU have always been about NIH. And they remain so.

The same goes for the majority of open-source software. Heck, even I've written many things simply because of NIH. There is nothing that wrong with it, IMO, but it is ridiculous to deny its existence. In a way, that's what FOSS is all about.
"

Au contraire, it is proprietary code that fits the "NIH" mindset to a tee.

Now, it is quite true that there is some component of the NIH attitude in open-source code, but it is exceptionally easy to show that it is by no means pervasive in open source.

GNU itself is a "work-alike" that implements almost an entire OS after the design template of POSIX. The missing bit of GNU is the kernel, called Linux, which borrows RCU, SMP and other methods from IBM's mainframe and OS/2 inventory (donated by IBM to Linux), and the likes of X, OpenGL and various filesystems from all over. The executable format is ELF, a standard defined by a UNIX consortium. GCC implements standard languages such as C and C++. Beyond that, contributed languages include Java, Python, Ruby, Perl, Haskell and a long list of others, all NIH. There is no "Linux-only language" anything like .NET.

Application file formats are either industry standards (such as ODF from OASIS), W3C standards such as SVG and HTML, or outside-contributed formats such as Ogg, MKV, PNG, JPG et al.

Virtually all of GNU/Linux is NIH. It is donated from all over. One of the few bits I can identify as unique to Linux would be the ALSA sound drivers.

Proprietary code, OTOH, all but defines "NIH".

Edited 2010-03-10 11:29 UTC

Reply Score: 0

RE[6]: Misleading title
by strcpy on Wed 10th Mar 2010 11:22 UTC in reply to "RE[5]: Misleading title"
strcpy Member since:
2009-05-20

lemur2, please don't be so naive. How many sound stacks? How many wireless stacks? How many window managers? How many distributions? All because of NIH.

Reply Score: 4

RE[7]: Misleading title
by lemur2 on Wed 10th Mar 2010 11:38 UTC in reply to "RE[6]: Misleading title"
lemur2 Member since:
2007-02-17

lemur2, please don't be so naive. How many sound stacks? How many wireless stacks? How many window managers? How many distributions? All because of NIH.


All clearly NOT NIH (apart from ALSA, which is the only thing with "Linux" in the name).

BTW - a window manager is a small part of a wider program known as a desktop. Most Linux desktops these days can run X or GTK or Qt applications, even down to system-tray applets, via a common desktop API known as the LSB ... the Linux Standard Base. Likewise, there are different wireless manager programs that provide a user interface to wpa_supplicant and the Linux kernel's wireless card drivers ... different GUIs for the one wireless stack.

Distributions are merely aggregates of choices of desktop and application programs chosen to work well together as an integrated whole.

You have confused application choice with "NIH". Choice is not "NIH" ... in fact, a lack of choice of desktop for a base OS would be an example of NIH.

It would appear that you are very easily confused.

Edited 2010-03-10 11:42 UTC

Reply Score: 0

RE[5]: Misleading title
by jaklumen on Wed 10th Mar 2010 11:17 UTC in reply to "RE[4]: Misleading title"
jaklumen Member since:
2010-02-09

"You know what, your precious Linux and GNU have always been about NIH. And remain to be about it."

That's funny, because a self-described "typical UNIX Silicon Valley professional" told me GNU/Linux was doomed because it was trying too hard to be like Windows.

And ow, man. Diplomacy. Do you speak it?

(I see lemur2's comment was posted as I was writing this-- a much more elegant response than mine, IMHO)

Edited 2010-03-10 11:21 UTC

Reply Score: 2

RE[6]: Misleading title
by Johann Chua on Wed 10th Mar 2010 18:37 UTC in reply to "RE[5]: Misleading title"
Johann Chua Member since:
2005-07-22

Forget it, he admits he's a Linux troll in his profile.

Reply Score: 3

RE[7]: Misleading title
by jaklumen on Wed 10th Mar 2010 23:34 UTC in reply to "RE[6]: Misleading title"
jaklumen Member since:
2010-02-09

Well then I offer him this: I've seen better. Yeah, really. Better trolling than what he's offered so far, IMHO.

Reply Score: 1

RE[5]: Misleading title
by Bill Shooter of Bul on Thu 11th Mar 2010 05:33 UTC in reply to "RE[4]: Misleading title"
Bill Shooter of Bul Member since:
2006-07-14

Eh, tough to tell. I could cite examples of code reuse; you could cite examples of duplicated functionality. There are many ways to do things, but many of them involve the same programs, the same code. Many different media players, but they all end up using the same codec source code to decode Ogg.

Could we move on to a more important debate? Like what end of an egg to open?

Reply Score: 2

RE[4]: Misleading title
by jaklumen on Wed 10th Mar 2010 11:14 UTC in reply to "RE[3]: Misleading title"
jaklumen Member since:
2010-02-09

"It turns out that the rendering part of webkit itself (but not Apple's Nitro Assembler) was based on KHTML, which was written by the KDE project, and is therefore a NIH component of Apple's webkit. It further happens that webkit is the rendering core of Google Chrome, which is yet another example of not following a NIH approach."

Thank you for pointing that out. It bothered me that the Ars Technica article simply said "Apple's WebKit project" when I knew things hadn't originated with Safari, but with Konqueror.

Right now I am using the Linux (Debian) beta of Chrome for faster JS despite current limitations (errors with CSS rendering on refreshes, I believe). But I switch back to Firefox for a blogging site that is heavily based on a WYSIWYG editor. I have to use a script that enables me to edit the HTML of the entry manually to get formatting right sometimes, but for me and any of my contacts that use Chrome or Safari, formatting is even worse.

My preference was strongly with Firefox before and I welcome anything that improves JS performance.

Reply Score: 2

RE[4]: Misleading title
by google_ninja on Wed 10th Mar 2010 17:53 UTC in reply to "RE[3]: Misleading title"
google_ninja Member since:
2006-02-05

I would argue that since WebKit is written better, performs better, and supports more than Gecko, it would be in the product's best interest to just drop Gecko in favor of WebKit.

Reply Score: 4

RE[2]: Misleading title
by voidlogic on Wed 10th Mar 2010 14:32 UTC in reply to "RE: Misleading title"
voidlogic Member since:
2005-09-03

>>Webkit doesn't use an interpreter at all, it uses a compiler. A "just in time" compiler, to be accurate.

Perhaps you do not realize this, but most JIT compilers augment an interpreter. Take a class written in Java or C#, for example: if a method gets called once or a few times, it is not compiled to native code; it is interpreted. Another method called a few hundred times will be compiled to native code.

Why, you ask? Take any large managed-code application; let's use NetBeans. Instruct the VM for the language (in this case the JVM) to compile *every* method before use. You will find that for the first 10-60 minutes (depending on your machine) the software is unusably slow.

I don't know the details of the WebKit JIT, but it probably does the same thing. Then again, because the volume of code on a website is so small compared to an application, perhaps I am mistaken and it does not.
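A toy sketch of that hot-method heuristic (the threshold, the names, and the "compile" step are all invented for illustration; no real VM is this simple): interpret until a call counter crosses a threshold, then switch to a "compiled" fast path.

```javascript
var HOT_THRESHOLD = 100; // hypothetical tier-up point, not any real engine's value

// Wrap an "interpreted" version of a function; once it proves hot,
// pay a one-time "compilation" cost and use the fast version thereafter.
function tiered(interpretFn, compileFn) {
  var calls = 0;
  var compiled = null;
  return function (x) {
    if (compiled) return compiled(x);  // already tiered up: fast path
    calls++;
    if (calls >= HOT_THRESHOLD) {
      compiled = compileFn();          // one-time "compile" step
      return compiled(x);
    }
    return interpretFn(x);             // cold code stays interpreted
  };
}

var addOne = tiered(
  function (x) { return x + 1; },                          // "interpreted" version
  function () { return function (x) { return x + 1; }; }   // "compiled" version
);
```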

Reply Score: 2

RE[2]: Misleading title
by google_ninja on Wed 10th Mar 2010 17:56 UTC in reply to "RE: Misleading title"
google_ninja Member since:
2006-02-05

WebKit doesn't use an interpreter at all; it uses a compiler. A "just in time" (JIT) compiler, to be accurate.

Because it involves an extra "compile step" (beyond the syntax-parsing step), a compiler is actually slower than an interpreter for code that runs through just once. OTOH, because it emits native code, a JIT compiler is much faster than an interpreter for any code that loops.


This is not totally accurate. They are using a "tracing" JIT, which does some code analysis and will compile the bits that make sense. What you are talking about is a static JIT, which is how Google's V8 engine works.
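A rough way to picture a trace (toy code under assumed behaviour, not V8 or Nitro internals): the compiled trace is straight-line code specialized by what the engine observed, guarded by a check that bails back to the generic path when the assumption breaks.

```javascript
// The "trace": straight-line code specialized for numbers, with a guard.
function tracedDouble(x) {
  if (typeof x !== "number") return null; // guard failed: bail out of the trace
  return x * 2;                           // specialized fast path
}

// The generic slow path the engine falls back to.
function genericDouble(x) {
  return Number(x) * 2;
}

// Dispatcher: try the trace first, fall back when the guard trips.
function run(x) {
  var r = tracedDouble(x);
  return r === null ? genericDouble(x) : r;
}
```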

Reply Score: 2

The other new direction for Mozilla
by lemur2 on Wed 10th Mar 2010 12:41 UTC
lemur2
Member since:
2007-02-17

Mozilla previews new feature to guard against Flash crashes

http://arstechnica.com/open-source/news/2010/03/mozilla-previews-ne...

It's part of a broader Mozilla project called Electrolysis that seeks to eventually bring full support for multiprocess browsing to Firefox. Electrolysis will make it possible for a browser crash to be isolated to a tab or group of tabs rather than affecting the entire browser. Similar functionality is already available in Internet Explorer and Chrome. Although Mozilla has already taken major steps towards implementing holistic multiprocess browsing, the plugin isolation is the only part that will land in the next release.


Although this feature is similar to functionality already implemented in IE and in Chrome, it turns out that Electrolysis is multi-process (process per tab?) rather than multi-threaded.

This approach, code-named Electrolysis, should be a benefit to Linux, because AFAIK in Linux the overhead associated with a separate process is much less than the overhead for a separate thread.

Electrolysis is therefore a bit less anti-NIH than JaegerMonkey is.

Edited 2010-03-10 12:43 UTC

Reply Score: 0

_xmv Member since:
2008-12-09

I wish they would keep multiprocess only for plugins, where it belongs, and keep multithreading and optimizing for the rest, as it should be.

Chrome's memory footprint with a few tabs is spectacular, the wrong kind of spectacular.

Reply Score: 2

voidlogic Member since:
2005-09-03

>>This approach, code-named Electrolysis, should be a benefit to Linux, because AFAIK in Linux the overhead associated with a separate process is much less than the overhead for a separate thread.

This is incorrect: kernel-level threads almost always have lower overhead than processes. Threads share the same memory space, so when a context switch between threads occurs, the TLB is not flushed. Also, communication between processes must use an IPC mechanism provided by the OS, making it much slower than the synchronized data structures threads use to communicate. And for operating systems which do not use COW (copy-on-write) pages, process creation (forking) is extremely expensive in comparison to thread creation.
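That difference can be sketched as a toy model (assumed Node.js environment; this models the extra copying only, not real kernel costs): threads communicate by writing into memory both sides already share, while processes must serialize a message, copy it through a kernel IPC channel, and deserialize it on the other side.

```javascript
// Thread-style: both threads can map the same SharedArrayBuffer, so a
// plain write IS the communication; nothing is copied.
var shared = new Int32Array(new SharedArrayBuffer(4));

function threadStyleSend(value) {
  shared[0] = value;   // the other thread sees this directly
  return shared[0];
}

// Process-style: separate address spaces, so the message must be
// serialized, copied through a kernel channel, and deserialized.
// JSON stands in here for whatever wire format the IPC mechanism uses.
function processStyleSend(value) {
  var wire = JSON.stringify({ value: value }); // serialize for the "pipe"
  return JSON.parse(wire).value;               // copy, then deserialize
}
```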

Reply Score: 5