Linked by Thom Holwerda on Thu 19th Jun 2014 09:58 UTC

Ever since we first saw ART appear alongside the release of Android 4.4 KitKat, we all knew that it would eventually replace the aging and relatively inefficient Dalvik runtime. Well folks, the time is now upon us, as commits made late last night to the AOSP master branch show Dalvik getting the axe and ART being set as the default.

Should deliver some decent performance improvements. I tried switching to ART months ago but ran into problems with some applications not working properly. Has the situation improved? Are any of you using ART?

Thread beginning with comment 590948
RE: Re:
by CaptainN- on Thu 19th Jun 2014 17:25 UTC in reply to "Re:"

Dalvik is Android's Java virtual machine, with a JIT. That's a fine strategy for running intermediate code (DEX, in Android's case). AOT is just a different strategy: it compiles ahead of time instead of generating machine code at runtime. Each has its advantages and disadvantages, and AOT isn't always faster, despite popular belief. ART is faster in the specific case of ART vs Dalvik, because Dalvik was never particularly fast, especially compared with other JIT-based VMs such as .NET's CLR, Sun's JVM, or Mono. Even Chrome's V8 (a JIT) is faster than Dalvik. I'll bet some of these JIT-based VMs will remain faster than ART for a while yet, though Google will certainly keep working on it.

Reply Parent Score: 2

RE[2]: Re:
by moondevil on Thu 19th Jun 2014 17:39 in reply to "RE: Re:"

AOT is always faster if used together with PGO, unless we are talking about dynamic languages.

JIT compilers have to compromise between compilation speed and generated code quality.

Reply Parent Score: 3

RE[3]: Re:
by emmbee on Fri 20th Jun 2014 00:00 in reply to "RE[2]: Re:"

It's not, though. AOT has to make assumptions about the specific processor in use (and even, nowadays, the GPU). JIT, on the other hand, pays the penalty of compiling every time the program starts, which rules out time-consuming optimisations.
Considering the variety of CPUs and vector extensions out there, especially on phones, delaying compilation to machine code until the runtime environment is known offers the best performance, except when a program is downloaded and then run immediately.

Reply Parent Score: 2