Performance testing is usually left for last in the application development cycle – not because it’s unimportant, but because it’s hard to test effectively with so many unknown variables. In this month’s In pursuit of code quality, Andrew Glover makes a case for performance testing as part of the development cycle and shows you two easy ways to do it.
In Pursuit of Java Code Quality
About the author: Thom Holwerda (@thomholwerda on Twitter)
2006-12-19 1:32 am, stestagg
I get the feeling sometimes that people take this ‘don’t optimize too early’ mantra too literally, writing hugely unperformant (is that even a word?) code in the belief that it will get optimized later. I think there needs to be much more focus on writing speed-efficient code from the start, and then applying optimizations on top of that. Of course, I don’t control any budgets on any software projects.
One thing that I really hate is the ‘hardware is cheap’ argument. It feels too much like an excuse for sloppy work to me.
2006-12-20 8:55 am, evangs
No. If you haven’t got a program written, how can you know which parts are bottlenecks and need to be optimized? The short answer is, you don’t. You could spend hours optimizing a particular code path which on paper looks highly inefficient (e.g. O(n²)), turning it into O(log n) (i.e. highly efficient). However, that means nothing to the final program if that particular code path was called once in the entire run of the program and n never exceeded 5.
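The point above can be sketched in a toy Java program (class and method names are mine, purely for illustration): a quadratic routine called once on five elements costs essentially nothing next to the loop that actually dominates the run.

```java
// Hypothetical illustration: an O(n^2) routine is harmless when n is tiny
// and it runs once; the real cost lives in the code that runs millions of times.
public class HotPathDemo {

    // Quadratic duplicate check -- "inefficient on paper".
    static boolean hasDuplicate(int[] xs) {
        for (int i = 0; i < xs.length; i++)
            for (int j = i + 1; j < xs.length; j++)
                if (xs[i] == xs[j]) return true;
        return false;
    }

    public static void main(String[] args) {
        long t0 = System.nanoTime();
        boolean dup = hasDuplicate(new int[] {1, 2, 3, 4, 5}); // n = 5, called once
        long quadraticNs = System.nanoTime() - t0;

        t0 = System.nanoTime();
        long sum = 0;
        for (int i = 0; i < 10_000_000; i++) sum += i;         // the actual hot path
        long hotPathNs = System.nanoTime() - t0;

        // Printing the results keeps the JIT from eliminating the work.
        System.out.println("dup=" + dup + " sum=" + sum);
        System.out.println("quadratic ns: " + quadraticNs + ", hot path ns: " + hotPathNs);
    }
}
```

A profiler would make the same point without guesswork: measure first, then spend optimization effort only where the time actually goes.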
You’re under the mistaken impression that slow code is “sloppy”. What exactly do you mean by “sloppy” coding? Most of the time code is slow because programmers use the simplest and most straightforward approach to solving a particular problem. Is that sloppy? If you’re developing a piece of software, would you rather debug code that is straightforward, or convoluted code that is “optimized”?
Premature optimization is the root of all evil. It’s like the old saying: “Cheap, fast, or works: pick two.” Most people are content with software that is cheap and works. If you want something that is fast and works, be prepared to pay. Or wait … and wait …
These are neat-looking tools, and I wouldn’t be surprised if I end up using them on a project. However, optimizing early is still a bad idea, for the same old reasons. Unlike functional testing, where it’s not acceptable to have calculations that are approximately right or features that work 75% of the time, performance testing is all a matter of degrees. Without well-specified performance targets, one could spend a lifetime optimizing just one application. On the other hand, well-specified performance targets are generally specified in terms of perceived performance for the end user, which means that you’re basically running the completed (or close to completed) application to evaluate its performance. When you do this, you’ll typically find some specific bottlenecks which, when resolved, yield massive improvements in the overall perceived performance for [often] relatively small investments. If you optimize too early, you end up expending effort optimizing code that, in the overall scheme of things, is immaterial.
The other problem with optimizing too early is that [good] development practice involves lots of functional refactoring, so if you optimize early, you’re likely optimizing code that’s going to go away anyway.
That said, writing these sorts of tests as you go does provide you with a useful tool for optimizing later on. I would just caution against trying to change your code based on initial test results. A much more cost-effective approach is to identify common coding patterns and practices that tend to improve performance up-front (and identify anti-patterns that hurt performance), and to reflect these in your coding standards. This way, your code just kind of ends up generally performing well, excepting those bottlenecks you discover during performance testing.
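A minimal sketch of the kind of check such a standard might back up (the 500 ms budget, the class name, and the StringBuilder workload are all my assumptions, not from the comment): time an operation and fail loudly when it blows an agreed budget.

```java
// Hypothetical sketch of a performance-budget check: time a task and fail
// if it exceeds a (generously chosen, illustrative) threshold. The workload
// itself reflects a typical coding-standard pattern: build strings with
// StringBuilder rather than repeated String concatenation.
public class PerfBudgetCheck {

    // Measure wall-clock time of a task in milliseconds.
    static long timeMillis(Runnable task) {
        long t0 = System.nanoTime();
        task.run();
        return (System.nanoTime() - t0) / 1_000_000;
    }

    public static void main(String[] args) {
        long elapsed = timeMillis(() -> {
            StringBuilder sb = new StringBuilder();
            for (int i = 0; i < 100_000; i++)
                sb.append(i);
        });
        if (elapsed > 500)
            throw new AssertionError("string build took " + elapsed + " ms, budget 500 ms");
        System.out.println("within budget: " + elapsed + " ms");
    }
}
```

Run as part of the regular test suite, a check like this turns “ends up generally performing well” into something you notice immediately when it stops being true; dedicated tools (the article covers two) add statistics and load simulation on top of the same idea.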