Linked by chandler on Sun 16th May 2010 19:28 UTC
Google is set to announce Android 2.2 at the Google I/O event this week, and one of the highly anticipated features will provide a big boost for performance and battery life. Originally the Dalvik virtual machine was implemented as an interpreter, but a JIT compiler will now be used. Benchmarks already show roughly a 6x improvement in numeric performance with the new JIT. While this will make Snapdragon-powered phones like the Nexus One seem even more responsive, it will have the biggest impact on lower-end phones using ARM11-based chipsets. It remains to be seen how many existing models will receive upgrades to 2.2.
Comment by cerbie
by cerbie on Sun 16th May 2010 21:32 UTC
cerbie Member since: 2006-01-02

Future phones and apps are where it'll really matter. So far, Dalvik has been an interesting way to handle Java, and some types of apps will run about as well interpreted as compiled; but interpreted code in general pays overhead on every instruction it has to decode and dispatch. When code paths rarely repeat, or the code is mostly just making API calls, JIT compilation and caching overhead can actually make things slower, if the interpreter was fairly efficient to begin with. Apps developed after this becomes standard will get to tweak their code from the start for the JIT-enabled engine and really take advantage of it. OTOH, apps that do computation, branchy messes, and/or lots of IO in Android-Java ought to benefit immediately.
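
To make that concrete, here's a rough sketch (plain Java; the class and method names are invented for illustration, nothing Android-specific is assumed) of the two kinds of code in question: a compute-bound loop that an interpreter has to decode on every iteration, versus a thin wrapper that spends most of its time inside library calls where a JIT buys little:

    public class JitIllustration {
        // Compute-bound loop: every iteration is bytecode the interpreter must
        // decode again, so compiling this hot loop once pays off quickly.
        static long sumOfSquares(int n) {
            long total = 0;
            for (int i = 0; i < n; i++) {
                total += (long) i * i;
            }
            return total;
        }

        // API-bound code: most of the time is spent inside the library calls,
        // so compiling this thin wrapper changes very little.
        static String formatAll(java.util.List<Integer> values) {
            StringBuilder sb = new StringBuilder();
            for (Integer v : values) {
                sb.append(String.format("%08d%n", v));
            }
            return sb.toString();
        }

        public static void main(String[] args) {
            long t0 = System.nanoTime();
            long s = sumOfSquares(10000000);
            long elapsedMs = (System.nanoTime() - t0) / 1000000;
            System.out.println("sumOfSquares = " + s + " in " + elapsedMs + " ms");
        }
    }

An interpreter pays dispatch overhead in both methods; a JIT mostly rescues the first one.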

Given the time and effort put into performance issues with the first two major iterations of Dalvik, I doubt there will be any drawbacks (if anything, peak performance would probably be limited, in favor of not screwing some cases up).

The way Apple's been acting lately, and with the general strangeness surrounding Intel/Nokia, this will be a feather in the cap for Android.

Reply Score: 4

RE: Comment by cerbie
by ale6 on Sun 16th May 2010 23:45 in reply to "Comment by cerbie"
ale6 Member since: 2010-05-16

I agree, at least for I/O.

I've developed a Stardict parser for Android, and it needs to parse 1.2MB of data for each request (I don't think that amount of data is all that rare).

Sun's JVM took about 1s on a regular dual-core laptop, while -- iirc -- pure Dalvik code took up to 15 minutes (maybe I did something wrong, but I tried everything -- it was a year ago).

Using JNI, everything went back down to a couple of seconds on an ADP1.
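
For anyone curious, the JNI route looks roughly like this (a minimal sketch; the library name "stardict" and the parseNative signature are made up, and the actual C parser isn't shown):

    public class StardictParser {
        static {
            // Loads libstardict.so shipped with the app (hypothetical name).
            System.loadLibrary("stardict");
        }

        // Implemented in C as
        // Java_StardictParser_parseNative(JNIEnv*, jobject, jbyteArray).
        private native String[] parseNative(byte[] dictData);

        public String[] parse(byte[] dictData) {
            // The heavy lifting happens on the native side; Dalvik only
            // crosses the JNI boundary once per request.
            return parseNative(dictData);
        }
    }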

btw, for the record, there will be a Google I/O talk about optimizing apps to take advantage of the new JIT.

Reply Parent Score: 2

RE: Comment by cerbie
by ndrw on Mon 17th May 2010 03:54 in reply to "Comment by cerbie"
ndrw Member since: 2009-06-30

It is true that a JIT may slow down branchy code, but that's not usually where the performance bottleneck lies. Whenever an application runs slowly, it is almost always because of heavy data shifting going on. In other words, applications that currently consume 5% of CPU time may start consuming 10%, while those that would need 500% may find 100% enough.

Another thing to look at is the garbage collector - I'm not sure how Dalvik's performs here, but allocation tends to be a major factor in the execution speed of OO languages.
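
As a rough illustration of the allocation point (an invented example, not Dalvik-specific): the first version below produces garbage on every call and keeps the collector busy, while the second reuses a buffer and allocates nothing in the loop:

    public class AllocationDemo {
        // Allocates a fresh array per call, so a tight loop over this method
        // produces garbage proportional to the iteration count.
        static int sumFreshArray(int size) {
            int[] tmp = new int[size];
            for (int i = 0; i < size; i++) tmp[i] = i;
            int sum = 0;
            for (int v : tmp) sum += v;
            return sum;
        }

        // Reuses a caller-provided buffer, so the loop itself allocates nothing.
        static int sumReusedArray(int[] tmp) {
            for (int i = 0; i < tmp.length; i++) tmp[i] = i;
            int sum = 0;
            for (int v : tmp) sum += v;
            return sum;
        }

        public static void main(String[] args) {
            int[] buffer = new int[1024];
            long a = 0, b = 0;
            for (int i = 0; i < 100000; i++) {
                a += sumFreshArray(1024);    // GC-heavy path
                b += sumReusedArray(buffer); // allocation-free path
            }
            System.out.println(a + " " + b);
        }
    }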

Reply Parent Score: 1