Linked by Thom Holwerda on Sat 8th Oct 2005 18:40 UTC, submitted by anonymous
Java programmers agonize over whether to allocate on the stack or on the heap. Some people think garbage collection will never be as efficient as direct memory management, and others feel it is easier to clean up a mess in one big batch than to pick up individual pieces of dust throughout the day. This article pokes some holes in the oft-repeated performance myth of slow allocation in JVMs.
Thread beginning with comment 42314
RE: Java *is* (relatively) slow!
by Simba on Sun 9th Oct 2005 21:24 UTC in reply to "Java *is* (relatively) slow!"

Yeah, as others have pointed out, this link is worthless. The benchmarks are not even remotely accurate.

More controlled benchmarks have conclusively shown that in some cases Java can outperform C++ because of runtime optimization. As I said in another reply, the JIT can perform dynamic optimizations that a C++ compiler cannot even dream about, since it is limited to static optimizations at compile time.
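
To make that concrete, here is a minimal sketch of the kind of dynamic optimization I mean (the class names are made up for illustration). The call to s.area() is virtual; a static compiler must either prove the receiver's concrete type at compile time or emit an indirect call, while HotSpot can observe at run time that only Circle ever reaches this loop, devirtualize the call, and inline area() into the loop body (deoptimizing later if another subclass ever shows up):

    // Devirtualize.java: illustrative sketch, not a benchmark
    interface Shape { double area(); }

    final class Circle implements Shape {
        private final double r;
        Circle(double r) { this.r = r; }
        public double area() { return Math.PI * r * r; }
    }

    public class Devirtualize {
        public static void main(String[] args) {
            Shape[] shapes = new Shape[1000000];
            for (int i = 0; i < shapes.length; i++) shapes[i] = new Circle(i % 10);

            double total = 0;
            // Monomorphic call site: the JIT's type profile sees only Circle
            // here, so the virtual dispatch can be inlined away at run time.
            for (Shape s : shapes) total += s.area();
            System.out.println(total);
        }
    }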

Reply Parent Score: 1

dmantione Member since:
2005-07-06

Yes, Java is so fast that it can outperform C++ only in controlled lab environments. Yeah right, welcome to the real world!

It's not only the benchmarks that speak; real-world examples speak as well. Java is, at this time, no performance match for any traditional compiled language.

Reply Parent Score: 1

Simba Member since:
2005-10-08

"Yes, Java is so fast that it can outperform C++ only in controlled lab environments. Yeah right, welcome to the real world!"

Yes. I will tell you about the real world. In the real world, Java is mostly used for server applications that run at 100% duty cycle, where startup time is not important and where the JIT can work its magic. In the real world, these types of applications are horribly unsuited to C++, and are also dangerous to write in C++. In the real world, speed of development and security are usually more important than squeezing out a slight improvement in performance, even if such an improvement is possible. C++ offers neither speed of development nor security. In the real world, other common options for server-side application development, such as Perl, are 10 to 15 times slower than Java. PHP fares even worse than Perl when it comes to performance.

"Not only the benchmarks speak, the real world examples speak as well. Java is at this time no performance match for any traditional compiled language."

Incorrect. For most common operations, there is virtually no speed difference these days between that operation performed in C++ and that operation performed in Java, once the JIT has been allowed to work its magic. Again, the main problem right now is that JIT optimizations are not persistent across program executions. This is detrimental on the desktop, but of no consequence on servers. That will be changing as well, however: persistent JIT optimizations are on Sun's to-do list.
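
You can watch the JIT warm up with a rough experiment like this (a sketch, not a rigorous benchmark; the work() method and the iteration counts are arbitrary stand-ins). The first rounds run interpreted; once HotSpot compiles the hot method, the per-round time drops sharply:

    // Warmup.java: crude demonstration of JIT warm-up
    public class Warmup {
        // arbitrary stand-in workload; any hot method will do
        static long work(int n) {
            long sum = 0;
            for (int i = 0; i < n; i++) sum += i * 31L + (sum >>> 7);
            return sum;
        }

        public static void main(String[] args) {
            long sink = 0; // keep results live so the work isn't optimized away
            for (int round = 1; round <= 10; round++) {
                long t0 = System.nanoTime();
                sink += work(5000000);
                System.out.println("round " + round + ": "
                        + (System.nanoTime() - t0) / 1000 + " us");
            }
            System.out.println(sink);
        }
    }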

Reply Parent Score: 1

So far I have read only claims that the shootout is useless, but no arguments. At least it is an open benchmark, and everyone is free to discuss its shortcomings on public mailing lists. You can submit solutions if you think the present ones are not optimal.

Anyway, the shootout results do not seem to be that unrealistic. Show me the (unflawed) benchmarks that indicate that Java could be faster than C++. I find it somewhat revealing that, from the perspective of the Java zealots, every benchmark that does not crown Java the performance king is discredited. They ignore empirical results as well as public perception. What is the name of the reality distortion field you are living in?

Reply Parent Score: 0

Simba Member since:
2005-10-08

"Anyway, the shootout results do not seem to be that unrealistic. Show me the (unflawed) benchmarks that indicate that Java could be faster than C++."

http://www.kano.net/javabench/

The author indicates which tests he used, what operating system he performed the tests on, and what optimization settings he used for both GCC and Java. Java outperformed GCC in many of the cases.

Another study showed that on a standard bubble sort, Java outperforms GCC even when GCC has all optimizations turned on.
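
For reference, the kernel such a study times is tiny. A plain Java version might look like this (the array size and random seed are arbitrary choices of mine, not taken from the study):

    // BubbleBench.java: the kind of microbenchmark kernel described above
    import java.util.Random;

    public class BubbleBench {
        static void bubbleSort(int[] a) {
            for (int i = a.length - 1; i > 0; i--)
                for (int j = 0; j < i; j++)
                    if (a[j] > a[j + 1]) {
                        int t = a[j]; a[j] = a[j + 1]; a[j + 1] = t;
                    }
        }

        public static void main(String[] args) {
            int[] a = new int[20000];
            Random rnd = new Random(42); // fixed seed for repeatable runs
            for (int i = 0; i < a.length; i++) a[i] = rnd.nextInt();

            long t0 = System.nanoTime();
            bubbleSort(a);
            System.out.println((System.nanoTime() - t0) / 1000000 + " ms");
        }
    }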

In their paper "Benchmarking Java against C and Fortran for Scientific Applications", J.M. Bull et al. of the Edinburgh Parallel Computing Centre found that Java performance was always within 20% of the same algorithms written in C or Fortran. The authors concluded that "On Intel Pentium hardware, especially with Linux, the performance gap is small enough to be of little or no concern to programmers." In this same test, Java actually outperformed some C compilers.

So now tell me again that I am ignoring empirical results. You call me a Java zealot, but it sounds more to me like you are one of the Java haters, who will accept the results of any benchmark, no matter how flawed that benchmark is, to support your assertion that Java is slower than C++.

Reply Parent Score: 1

rayiner Member since:
2005-07-06

I concluded that the shootout was useless the minute I downloaded some of the Dylan benchmarks, and realized that my results were nothing like the ones published on the site. Apparently (at least in earlier versions of the shootout), the data set for the benchmarks was so small that the C versions would complete in like 0.01 seconds, while the Dylan version would take like 0.1 seconds. When I jacked up the number of iterations, the two versions performed very similarly. I traced the issue to the fact that the Dylan runtime took a fixed amount of time during program startup, which meant that the Dylan program didn't even get started by the time the C program had exited. Yet, does the user really care about a 0.1 second startup cost? Or is the user more concerned about performance with large data sets?
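
If you model total wall time as T(N) = startup + N * perIteration, separating the fixed cost from the real work is easy: time the same program at N and 2N iterations, and the startup estimate falls out as 2*T(N) - T(2N). Here is a rough Java harness for that idea (./benchmark and its argument convention are stand-ins for whatever program you are measuring):

    // StartupCost.java: estimate fixed startup cost from two timed runs
    import java.util.concurrent.TimeUnit;

    public class StartupCost {
        // wall-clock time of one run of the given command, in milliseconds
        static long time(String... cmd) throws Exception {
            long t0 = System.nanoTime();
            new ProcessBuilder(cmd).inheritIO().start().waitFor();
            return TimeUnit.NANOSECONDS.toMillis(System.nanoTime() - t0);
        }

        public static void main(String[] args) throws Exception {
            long tN  = time("./benchmark", "1000000"); // N iterations
            long t2N = time("./benchmark", "2000000"); // 2N iterations
            // linear model: T(N) = startup + N * perIteration
            System.out.println("estimated startup: " + (2 * tN - t2N) + " ms");
            System.out.println("estimated cost of N iterations: " + (t2N - tN) + " ms");
        }
    }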

Beyond that, it's fundamentally flawed to compare performance on a given benchmark when the various implementations are written by different authors. It's the opposite of the scientific method, where you change only one variable at a time! The fact that the Shootout has a lot of different benchmarks doesn't make up for that basic flaw. A shitty experiment repeated dozens of times is still a shitty experiment!

In my own experiments (I've been meaning to get around to doing a proper Lisp vs. Scheme vs. Dylan vs. C++ benchmark), I've written a version of Fannkuch (one of the benchmarks on the shootout site) that was as fast in Bigloo Scheme as in C++. Meanwhile, the shootout says that Scheme runs Fannkuch at 1/6 the speed of C++. Whose conclusion am I going to trust?
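
For anyone who hasn't looked at it, the Fannkuch kernel is tiny. A straightforward Java rendering (my own sketch, not the shootout's submitted version; permutations generated with Heap's algorithm) shows what is being measured: for every permutation of 1..n, keep reversing the first k elements, where k is the current first element, until a 1 reaches the front, and report the maximum flip count:

    // Fannkuch.java: sketch of the benchmark kernel
    public class Fannkuch {
        static int maxFlips = 0;

        // count "pancake flips" needed to bring a 1 to the front
        static int flips(int[] perm) {
            int[] p = perm.clone();
            int count = 0;
            while (p[0] != 1) {
                // reverse the first p[0] elements
                for (int i = 0, j = p[0] - 1; i < j; i++, j--) {
                    int t = p[i]; p[i] = p[j]; p[j] = t;
                }
                count++;
            }
            return count;
        }

        // visit every permutation of a[0..k-1] via Heap's algorithm
        static void permute(int[] a, int k) {
            if (k == 1) {
                maxFlips = Math.max(maxFlips, flips(a));
                return;
            }
            for (int i = 0; i < k - 1; i++) {
                permute(a, k - 1);
                int j = (k % 2 == 0) ? i : 0;
                int t = a[j]; a[j] = a[k - 1]; a[k - 1] = t;
            }
            permute(a, k - 1);
        }

        public static void main(String[] args) {
            int n = args.length > 0 ? Integer.parseInt(args[0]) : 7;
            int[] a = new int[n];
            for (int i = 0; i < n; i++) a[i] = i + 1;
            permute(a, n);
            System.out.println("Pfannkuchen(" + n + ") = " + maxFlips);
        }
    }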

Reply Parent Score: 1