Refactoring software by hand can be a real pain. So why not just automate the process? In this chapter, Joshua Kerievsky explains the whys and hows of refactoring, and why you shouldn't blindly trust an automated refactoring tool.
Introduction to Refactoring to Patterns
2005-01-30 General Development 7 Comments
What an excellent article! I’ve actually been looking for a good resource on design patterns, and your link to InformIT seems to have ended my search. Great job, keep it up!
Pretty good. But isn’t there a style of software development where you write the unit tests first, and then tools help you write the code to pass those tests? Also, how much refactoring would be eliminated if more work were placed further up the chain, e.g. clearer specifications, more planning? And lastly, don’t some languages have less need for some patterns?
I think you are missing the concept here. Refactoring isn’t something you do during development; it’s what you do after you have completed the project. No matter how much planning you do, after everything is completed, you can go back and improve on what you have done.
Times change, requirements change, programs change. Refactoring is useful there. And testing is a big part of refactoring.
As far as some languages having less of a need: completely untrue. All languages can benefit from patterns. This assumes you take the word “pattern” to mean what it originally meant: something that is commonly done in programming, with a name put to it. Every language has patterns, just not all languages use the same patterns. Just because Java benefits from one pattern doesn’t mean it will benefit from a pattern used in Python.
At their root, patterns aren’t written in stone. They are concepts: ways to solve common problems. There is no “one way” in the world of patterns. However, when I say Singleton, the concept is easy to understand for everyone who understands patterns. While my implementation may be different from yours, the concept, the problem it solves, is the same. This makes communication easier. I can say “Yes, I’ve implemented that using a singleton.” And I won’t have to explain it any further.
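To make that concrete, here is a minimal sketch of one possible Singleton implementation in Python. The class name `Logger` and its `log` method are made up for illustration, not taken from the article; the point is only that the word “Singleton” communicates the whole idea, whatever the implementation details.

```python
class Logger:
    """Singleton sketch: every call to Logger() returns the same instance."""
    _instance = None

    def __new__(cls):
        # Create the single shared instance the first time only.
        if cls._instance is None:
            cls._instance = super().__new__(cls)
            cls._instance.messages = []
        return cls._instance

    def log(self, msg):
        self.messages.append(msg)
```

Anyone who knows the pattern can predict the behavior without further explanation: `Logger() is Logger()` is always true, so messages logged through one reference are visible through any other.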
The whole point of extreme programming is not to do too much planning up front. Requirements change. The waterfall method of software development never really improved things. As Brooks said, “Plan to throw your first one away”.
I guess people like Fowler would say that if you get something small up and running and continually refactor, then you don’t have to throw your first pass away.
But as Jason pointed out, categorizing design patterns is, at its heart, a way for developers to communicate their intentions without waving their hands around and drawing things on a whiteboard every time.
Languages like Smalltalk have had environments which encourage refactoring and exploration a decade or so before Java was even around.
Somewhat offtopic, but I hope that prototype-based languages make a comeback eventually. Sun research worked on a prototype-based language called Self before Java got popular and then pretty much abandoned it. I consider prototype-OO programming to be a philosophically more natural way of doing OO. There have been many studies done that conclude that humans don’t really classify, but instead look at prototypical examples.
Refactoring is done not AFTER you’ve finished the software but WHILE you’re making it. At least that’s the way it SHOULD be done, to make sure your code is in the best shape possible at any time.
The sooner you fix an error, the less it costs.
Thus refactoring while making may make sense.
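As a small sketch of what “refactoring while making” looks like in practice, here is a before/after of the classic Extract Method refactoring. The example (an order-report function) is hypothetical, not from the article; the behavior is identical before and after, which is what makes it a refactoring rather than a rewrite.

```python
# Before: one function mixes the calculation with the formatting.
def report_before(order):
    total = sum(qty * price for qty, price in order)
    return "Total: $" + format(total, ".2f")

# After: the calculation is extracted into its own named function
# (Extract Method), so it can be tested and reused on its own.
def order_total(order):
    return sum(qty * price for qty, price in order)

def report(order):
    return "Total: $" + format(order_total(order), ".2f")
```

Doing this as you go, with tests confirming that `report` and `report_before` agree, is exactly the cheap, continuous error-fixing the comment above describes.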
After it’s finished, the software drops into maintenance mode. Refactoring can also be very useful at that stage.
Refactoring can be done at any stage of the software life cycle; whether it’s viable at a given stage depends on the financial aspects.