Linked by Thom Holwerda on Sat 8th Oct 2005 18:40 UTC, submitted by anonymous
Java programmers agonize over whether to allocate on the stack or on the heap. Some people think garbage collection will never be as efficient as direct memory management, and others feel it is easier to clean up a mess in one big batch than to pick up individual pieces of dust throughout the day. This article pokes some holes in the oft-repeated performance myth of slow allocation in JVMs.
RE: MMM steaming Java!
by Simba on Sun 9th Oct 2005 20:05 UTC in reply to "MMM steaming Java!"

"Have a look at Javac (java compiler (makes byte code)), it has ZERO optimisations not even dead code removal!"

Wrong. javac does remove dead code. For instance, you can make a block of code conditional on a constant, and javac is smart enough to know that the block can never execute if the constant condition is false, so it omits that code from the generated bytecode. And that is only in the cases where javac even lets you include "dead" code; much of the time it flags it as a compile error (for example, you cannot place assignments or other executable statements after an unconditional return, since that code is unreachable).
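
As a minimal sketch of that first case (the class and field names here are my own illustration, not from the article), a block guarded by a compile-time constant boolean is simply dropped from the generated bytecode; you can confirm it by compiling and then inspecting the class with javap -c:

    public class DeadCodeDemo {
        // Compile-time constant: javac folds it and prunes the guarded block.
        private static final boolean DEBUG = false;

        public static void main(String[] args) {
            if (DEBUG) {
                // This statement never appears in DeadCodeDemo.class.
                System.out.println("debug output");
            }
            System.out.println("done");
        }
    }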

Also, javac can, when possible, inline methods that are declared as final.

"instead all this is left for the HotSpot stuff to work out."

The only real problem with HotSpot right now is that its optimizations are not persistent across program executions, although the word is that they will be in a near-future release of the JVM. Optimization can usually be done better at runtime: the JVM can make optimizations that C and C++ compilers cannot even dream of, because it can profile the code as it runs and optimize based on what it observes. This is why some algorithms have actually been shown to perform better in Java than in C++. The JVM profiles the code at runtime, figures out what it is actually doing, and optimizes accordingly.
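
To make the profiling point concrete (the names below are my own illustration, not something from the article), HotSpot can observe that a virtual call site only ever sees one receiver type, devirtualize it, and inline the method into the hot loop, which a static C++ compiler generally cannot do for a call through a base-class pointer without whole-program information:

    interface Shape { double area(); }

    final class Circle implements Shape {
        private final double r;
        Circle(double r) { this.r = r; }
        public double area() { return Math.PI * r * r; }
    }

    public class HotLoop {
        public static void main(String[] args) {
            Shape s = new Circle(2.0);  // only Circle ever reaches the call site below
            double total = 0;
            for (int i = 0; i < 10000000; i++) {
                // HotSpot profiles this virtual call, sees it is monomorphic,
                // and can inline Circle.area() directly into the loop.
                total += s.area();
            }
            System.out.println(total);
        }
    }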
