veryard projects - innovation for demanding change
systems engineering for business process change

refactoring

Refactoring increases the articulation of an existing artefact.
Most discussion of refactoring focuses on OO software.
Refactoring is relevant to component-based software architectures, as well as whole business processes and organizations.
Refactoring can remove the traces of past errors, haste or compromise, and/or allow an artefact to accommodate new requirements. 

Book Review

Martin Fowler

Refactoring: Improving the Design of Existing Code

Addison-Wesley, 2000. 

ISBN 0-201-48567-2


Martin Fowler, already known for his work on Analysis Patterns, has now written a book on the patterns of code transformation, which he calls refactoring, following pioneering work by Ward Cunningham, Kent Beck and Ralph Johnson, among others.

Refactoring is a process of improvement to an existing software artefact. This can be a large system, composed of objects. Fowler concentrates on systems written in Java, but lots of work has been done on Smalltalk as well – there’s even a tool to support refactoring in Smalltalk.

A refactoring is a well-defined unit of the refactoring process. So Fowler provides a catalogue of these refactorings – which we can think of as patterns of code transformation.
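To make this concrete, here is a minimal sketch (my own, loosely modelled on the kind of example Fowler uses, with a hypothetical Invoice class) of one catalogued refactoring, Extract Method:

// Before: printing logic buried in a single method.
class Invoice {
    private final String customer;
    private final double amount;

    Invoice(String customer, double amount) {
        this.customer = customer;
        this.amount = amount;
    }

    void printOwing() {
        System.out.println("*** Customer Owes ***");
        System.out.println("name:   " + customer);
        System.out.println("amount: " + amount);
    }
}

// After Extract Method: the banner and the details become named, reusable steps.
class InvoiceRefactored {
    private final String customer;
    private final double amount;

    InvoiceRefactored(String customer, double amount) {
        this.customer = customer;
        this.amount = amount;
    }

    void printOwing() {
        printBanner();
        printDetails();
    }

    private void printBanner() {
        System.out.println("*** Customer Owes ***");
    }

    private void printDetails() {
        System.out.println("name:   " + customer);
        System.out.println("amount: " + amount);
    }
}

The behaviour is identical before and after; only the internal articulation of the code has changed.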

Refactoring is primarily an internal change to an artefact – it doesn't (or at least shouldn't) change the behaviour. The refactoring method assumes that the system works more or less correctly, and includes thorough testing to compare the behaviour before and after the change. If the system or component doesn't work properly in the first place, you can't refactor it – you have to debug it or throw it away. And if you want your system to accommodate some new requirements, or change its behaviour, that's not refactoring either – although you may well perform some refactoring as a prelude to, or as part of, the enhancement. Indeed, Fowler argues that the best way of enhancing a software artefact is often to alternate refactoring with functional increments, as shown in Figure 1.

Figure 1: Separate refactoring from functional enhancement
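The testing discipline can also be sketched in code. The following is an illustrative JUnit-style characterisation test (my own example, with a hypothetical PriceCalculator class) that pins down the current behaviour so it can be re-run unchanged after every refactoring step:

import org.junit.Test;
import static org.junit.Assert.assertEquals;

// A characterisation test records what the code does today, so that the
// same check can be run before and after each refactoring.
public class PriceCalculatorTest {

    // PriceCalculator is a stand-in for whatever class is being refactored.
    static class PriceCalculator {
        double totalWithTax(double net, double taxRate) {
            return net + net * taxRate;
        }
    }

    @Test
    public void totalIsUnchangedByRefactoring() {
        PriceCalculator calculator = new PriceCalculator();
        assertEquals(115.0, calculator.totalWithTax(100.0, 0.15), 0.0001);
    }
}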

Metaflexibility

Instead of developing complex artefacts possessing some unmeasurable quality called flexibility, let’s produce simple artefacts that could be made flexible in specific ways as requirements emerge. The ability to become flexible is a valuable property in its own right, distinct from flexibility itself. I suppose we could call it metaflexibility – but perhaps it’s just another form of simplicity.

Although Fowler doesn’t have a word for this, he is a strong proponent of this principle.

Flexible solutions are more complex than simple ones. … To gain flexibility, you are forced to put in a lot more flexibility than you actually need. With refactoring you approach the risks of change differently. You still think about potential changes, you still consider flexible solutions. But instead of implementing these flexible solutions, you ask yourself, "How difficult is it going to be to refactor a simple solution into the flexible solution?" If, as happens most of the time, the answer is "pretty easy", then you just implement the simple solution. (pp 67-68)
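A small sketch (my own, not taken from the book) of what this means in practice: ship the deliberately simple solution, and refactor towards flexibility, for example by extracting a hypothetical DiscountPolicy interface, only when a second rule actually arrives.

// Step 1: the simple solution - one hard-wired discount rule.
class Order {
    double total(double gross) {
        return gross * 0.95;           // flat 5% discount, nothing more
    }
}

// Step 2 (only when a second rule really appears): refactor towards
// flexibility, e.g. by extracting an interface for the varying behaviour.
interface DiscountPolicy {
    double apply(double gross);
}

class FlatDiscount implements DiscountPolicy {
    public double apply(double gross) { return gross * 0.95; }
}

class SeasonalDiscount implements DiscountPolicy {
    public double apply(double gross) { return gross * 0.90; }
}

class FlexibleOrder {
    private final DiscountPolicy discount;

    FlexibleOrder(DiscountPolicy discount) { this.discount = discount; }

    double total(double gross) {
        return discount.apply(gross);
    }
}

The first version carries no speculative machinery; the second is introduced only when the requirement is real, which is exactly the trade Fowler describes.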

Performance

When Fowler says that refactoring doesn't change the behaviour of software, this stipulation excludes performance and other non-functional characteristics. He acknowledges a concern about performance, but resorts to the standard claim that well-structured software will be more amenable to performance tuning.

Fowler also makes clear that his refactorings are designed for single-process software, and might cause performance problems for concurrent and distributed systems. "For example, in single-process software you never need to worry how often you call a method; method calls are cheap. With distributed software, however, round trips have to be minimized." (pp106-7) This is not a limitation of the refactoring approach overall, merely a reflection on the particular refactorings offered in his book.
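By way of illustration (my own hypothetical example, not Fowler's), the fine-grained accessor style shown first is harmless in-process, but each call would become a network round trip once the object is remote, so a distributed design prefers one coarse-grained query:

// A local value object with fine-grained accessors.
class Address {
    private final String street, city, country;
    Address(String street, String city, String country) {
        this.street = street; this.city = city; this.country = country;
    }
    String getStreet()  { return street; }
    String getCity()    { return city; }
    String getCountry() { return country; }
}

// In-process this is fine: three cheap method calls.
class AddressFormatter {
    String describe(Address a) {
        return a.getStreet() + ", " + a.getCity() + ", " + a.getCountry();
    }
}

// Across a process boundary, each accessor would be a round trip, so a
// distributed design folds them into one coarse-grained call that
// returns a serializable snapshot.
interface RemoteAddressService {
    AddressSnapshot snapshotFor(String customerId);
}

class AddressSnapshot implements java.io.Serializable {
    final String street, city, country;
    AddressSnapshot(String street, String city, String country) {
        this.street = street; this.city = city; this.country = country;
    }
}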

Negative Patterns

Nearly four years ago, I came across the concept of Negative Pattern. I ran a session at OT97 about it, did some seminars elsewhere, posted a summary on my website, and then turned my attention to other matters.

In a design inspection, the reviewers are looking for opportunities to improve an artefact – such as a piece of software. This is essentially a pattern-matching exercise, in which each reviewer searches for familiar ways in which the artefact may be flawed – yielding error, inefficiency, inflexibility or some other negative quality. These negative patterns can be characterized not only by a familiar structure, but by a common source or motivation – the designer was trying to do something else, but it turned out like this. Fowler calls these Bad Smells. Chapter 3, written jointly with Kent Beck, identifies 22 of these bad smells. Further Bad Smells are implied by the refactorings whose names begin with Remove – for example, Remove Setting Method.
 

Duplicated Code | Long Method | Large Class | Long Parameter List
Divergent Change | Shotgun Surgery | Feature Envy | Data Clumps
Primitive Obsession | Switch Statements | Parallel Inheritance Hierarchies | Lazy Class
Speculative Generality | Temporary Field | Message Chains | Middle Man
Inappropriate Intimacy | Alternative Classes with Different Interfaces | Incomplete Library Class | Data Class
Refused Bequest | Comments as Deodorant

Table 1: Fowler's Bad Smells

When you've recognized one or more negative patterns, what happens next? In some cases, the artefact may be so poorly designed, and so riddled with negative patterns, that it's best to start again with a new design – or even a new or retrained designer. In other cases, the negative pattern may be replaced by, or transformed into, a positive design pattern. This transformation can itself be formulated as a process pattern.
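As a concrete illustration (my own example, loosely in the spirit of the book), here is the Switch Statements smell from Table 1 and the positive pattern it is typically transformed into via Fowler's Replace Conditional with Polymorphism refactoring:

// The smell: a type code and a switch that grows with every new case.
class Employee {
    static final int ENGINEER = 0, SALESMAN = 1, MANAGER = 2;
    private final int type;
    private final double monthlySalary, commission, bonus;

    Employee(int type, double monthlySalary, double commission, double bonus) {
        this.type = type;
        this.monthlySalary = monthlySalary;
        this.commission = commission;
        this.bonus = bonus;
    }

    double payAmount() {
        switch (type) {
            case ENGINEER: return monthlySalary;
            case SALESMAN: return monthlySalary + commission;
            case MANAGER:  return monthlySalary + bonus;
            default: throw new IllegalStateException("unknown type");
        }
    }
}

// After Replace Conditional with Polymorphism: each case becomes a subclass.
abstract class StaffMember {
    protected final double monthlySalary;
    StaffMember(double monthlySalary) { this.monthlySalary = monthlySalary; }
    abstract double payAmount();
}

class Engineer extends StaffMember {
    Engineer(double salary) { super(salary); }
    double payAmount() { return monthlySalary; }
}

class Salesman extends StaffMember {
    private final double commission;
    Salesman(double salary, double commission) { super(salary); this.commission = commission; }
    double payAmount() { return monthlySalary + commission; }
}

class Manager extends StaffMember {
    private final double bonus;
    Manager(double salary, double bonus) { super(salary); this.bonus = bonus; }
    double payAmount() { return monthlySalary + bonus; }
}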

There is a great opportunity for tool support to detect these negative patterns – although they may require human confirmation. For example, a code analyser may be able to find chunks of similar code, and present them to a human inspector as candidate occurrences of the Duplicated Code pattern.
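A very rough sketch (entirely hypothetical, not a description of any real tool) of how such a detector might work: normalise each line, hash fixed-size windows of lines, and report any window that occurs more than once as a candidate occurrence of Duplicated Code for a human to confirm.

import java.util.*;

// Naive duplicate-code detector: hashes sliding windows of normalised
// lines and reports windows seen more than once as candidates for the
// Duplicated Code smell. A human reviewer still confirms each candidate.
public class DuplicateFinder {

    private static final int WINDOW = 5;   // lines per candidate chunk

    // Returns each duplicated chunk mapped to the line numbers where it starts.
    public static Map<String, List<Integer>> candidates(List<String> lines) {
        Map<String, List<Integer>> seen = new HashMap<>();
        for (int i = 0; i + WINDOW <= lines.size(); i++) {
            StringBuilder chunk = new StringBuilder();
            for (int j = i; j < i + WINDOW; j++) {
                // crude normalisation: ignore indentation and case
                chunk.append(lines.get(j).trim().toLowerCase()).append('\n');
            }
            seen.computeIfAbsent(chunk.toString(), k -> new ArrayList<>()).add(i + 1);
        }
        // keep only chunks that occur at more than one starting line
        seen.values().removeIf(starts -> starts.size() < 2);
        return seen;
    }
}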

So is refactoring only needed because of past errors or haste? Should an engineer be ashamed if his own designs ever need refactoring? That's certainly how people have often written about refactoring. For example, the Gang of Four book presents patterns as a way of avoiding future refactoring. Fowler is careful to avoid this tone of voice. There is no blame or criticism involved in refactoring – simply a recognition that an artefact lacks a level of articulation that it now needs, and that it no longer matters whether this articulation could or should have been anticipated.

From Legacy Systems to Components

Some readers may be interested in the possibility of refactoring large legacy systems into components, with strong encapsulation. Then you can make a refactor-versus-rebuild decision for one component at a time. Fowler mentions this as a promising approach for key legacy systems, but doesn’t provide detailed advice.

Although legacy systems may be the work of many different software engineers and teams over an extended period, there is often considerable repetition in style – if you can find one occurrence of a negative pattern, you can often find many more of the same pattern. Refactoring can exploit this repetition.

Legacy systems may have started out with separate layers, but the layers have now melted together like lasagne. Sometimes different teams may have had different notions of what the layers were supposed to be, sometimes the layers have been deliberately merged to fix some long-forgotten problem, and sometimes there never were clear layers in the first place. Among other things, refactoring may allow us to establish or restore a well-structured internal architecture, with the different responsibilities and aspects contained in separate layers.
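A schematic sketch (my own, at the smallest possible scale) of what restoring layers can look like: a method that mixes SQL, business rules and HTML is teased apart so that each responsibility can live in its own layer. Fowler's Extract Method and Move Method are the obvious small steps here; Separate Domain from Presentation is one of his big refactorings.

// Before: one method mixes data access, business rules and presentation.
class LasagneReport {
    String creditReport(java.sql.Connection db, String customerId) throws Exception {
        java.sql.PreparedStatement stmt =
            db.prepareStatement("SELECT balance, limit_amount FROM account WHERE customer_id = ?");
        stmt.setString(1, customerId);
        java.sql.ResultSet rs = stmt.executeQuery();
        rs.next();
        double balance = rs.getDouble("balance");
        double limit = rs.getDouble("limit_amount");
        boolean overLimit = balance > limit;                  // business rule
        return "<p>Customer " + customerId
             + (overLimit ? " is OVER LIMIT" : " is in credit") + "</p>";
    }
}

// After: each responsibility sits in its own layer.
class AccountGateway {                                        // data access layer
    private final java.sql.Connection db;
    AccountGateway(java.sql.Connection db) { this.db = db; }

    Account find(String customerId) throws Exception {
        java.sql.PreparedStatement stmt =
            db.prepareStatement("SELECT balance, limit_amount FROM account WHERE customer_id = ?");
        stmt.setString(1, customerId);
        java.sql.ResultSet rs = stmt.executeQuery();
        rs.next();
        return new Account(rs.getDouble("balance"), rs.getDouble("limit_amount"));
    }
}

class Account {                                               // domain layer
    private final double balance, limit;
    Account(double balance, double limit) { this.balance = balance; this.limit = limit; }
    boolean isOverLimit() { return balance > limit; }
}

class CreditReportView {                                      // presentation layer
    String render(String customerId, Account account) {
        return "<p>Customer " + customerId
             + (account.isOverLimit() ? " is OVER LIMIT" : " is in credit") + "</p>";
    }
}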

From this perspective, the refactorings that may be of greatest interest are the big ones – which we might call architectural. The book contains a chapter on big refactorings towards the end, but most of it is about small ones. The implication is that refactoring is done bottom-up – but of course it doesn't have to be.

Granularity and Design

Refactoring generally leads to finer grained objects and components. It therefore offers an answer to those who hesitate over component granularity. With certain important provisos, it is possible to start with fairly coarse-grained components, and then use refactoring to refine them later, according to need.

Some proponents of Extreme Programming (XP) might argue that even planning and analysis are otiose, because you can just code it, and use refactoring to clean it up later. This approach seems more plausible in some domains than others. In e-business, you might want your website design to be very fluid, because you don't know what your requirements will be in three months' time. But you might want your fulfilment and payment systems to be properly engineered. (That's all very well, but we've seen people get into serious trouble because they didn't think these things through.)

But Fowler isn’t a dogmatic XP fanatic, and he does not deprecate modelling. I guess he could be counted as a proponent of Moderately Extreme Programming. For him, modelling helps you produce clear and simple designs, which can then be refactored later as required.

Refactoring as Normalization

In some ways, refactoring a large software artefact resembles the normalization of a large data artefact. Just as normalization is supposed to preserve the underlying representational meaning of data while improving the data structure, so refactoring is supposed to preserve the (functional) behaviour of an artefact while improving its structure. Some theoretical work has been going on to prove that certain precisely defined refactorings are "safe" in this sense, and tools are being built that implement this notion of safety. Complex refactorings can be decomposed into a series of safe steps. Of course, this doesn’t mean that you don’t have to test the refactored software – but it provides an additional level of reassurance, particularly for safety-critical or business-critical software.
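To make the idea of a series of safe steps concrete, here is my own sketch of how even a simple Rename Method can be decomposed so that the code compiles and the tests pass after every step:

// Step 0: original method, with callers scattered through the code base.
class Account0 {
    double getBal() { return 0.0; /* ... */ }
}

// Step 1: introduce the new name and let the old name delegate to it.
//         Everything still compiles; all tests still pass.
class Account1 {
    double getBalance() { return 0.0; /* ... */ }
    double getBal()     { return getBalance(); }   // old name kept as a thin wrapper
}

// Step 2: move callers over to getBalance() one at a time, testing as you go.

// Step 3: once no caller uses the old name, delete getBal().
class Account2 {
    double getBalance() { return 0.0; /* ... */ }
}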

Final Remarks

Refactoring is a highly relevant technique for component-based software engineering. Martin Fowler has written a useful introduction to the subject, mainly aimed at OO programmers, but with some insights for component writers and system architects as well.

Martin Fowler has his own website for ideas and further resources relating to this book. http://www.refactoring.com

This page last updated on November 7th, 2000
Copyright © 2000 Veryard Projects Ltd
http://www.veryard.com/sebpc/refactoring.htm