Legacy Refactoring

The past is not dead. In fact, it’s not even past. –William Faulkner


Whereas model transformation follows the arrow of time, taking expectations and commitments one step further towards application deployment, refactoring goes the other way, reclaiming past implementations in a bid to understand their purpose and keep their memory alive. And whereas reuse stands on transparent and secure foundations, refactoring delves into unfathomable, unreliable, or even decaying code in the hope of retrieving reusable nuggets.

Refactored Legacy (E. Lusito)

Theoretically, refactoring could target different levels:

  • Reuse of code within a different technical context.
  • Reuse of models to produce new implementations.
  • Reuse of requirements to be realized into a different architecture.

Yet, with reuse of code being a lesser evil and reuse of requirements a pipe dream, model-based refactoring should be the solution of choice.

Reverse Engineering: Full Throttle Backward?

Set within a model transformation perspective, refactoring can proceed from code, design, or analysis models:

  • Implementation: legacy code is wrapped into a new design and redeployed as it was.
  • Source: legacy code is redesigned and redeployed.
  • Functionalities: applications are redesigned.

The objective is to consolidate the different refactoring policies and put legacies and new developments within a single framework.
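The three policies above can be consolidated into a single decision rule. A minimal sketch, assuming hypothetical names (`Policy`, `LegacyAsset`, `select_policy`) and an illustrative fallback order, not taken from the original text:

```python
from dataclasses import dataclass
from enum import Enum, auto

class Policy(Enum):
    """The three refactoring policies (names are illustrative)."""
    WRAP = auto()      # implementation level: wrap legacy code, redeploy as is
    REDESIGN = auto()  # source level: redesign the code and redeploy
    REBUILD = auto()   # functional level: redesign the application

@dataclass
class LegacyAsset:
    name: str
    code_available: bool      # are maintainable sources still at hand?
    requirements_known: bool  # are business requirements documented?

def select_policy(asset: LegacyAsset) -> Policy:
    # Hypothetical decision rule: fall back to wrapping when neither
    # sources nor requirements can support a deeper refactoring.
    if asset.requirements_known:
        return Policy.REBUILD
    if asset.code_available:
        return Policy.REDESIGN
    return Policy.WRAP

print(select_policy(LegacyAsset("billing", code_available=True,
                                requirements_known=False)))
# Policy.REDESIGN
```

The point of the sketch is that legacy assets and new developments share one framework: the same asset record drives whichever policy its state permits.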

Putting Past and Future under a Single Modeling Roof

Model Driven Refactoring

Obviously one cannot expect to find legacy systems neatly organized along architecture layers, and the first assignment is therefore to inventory legacy components as deployed, and try to map them against basic architectural tiers. For that purpose it will be necessary to determine whether components make use of persistent information, and whether they are executed locally or may be shared. When components cannot be fitted to a single tier they are assigned by default to the upper one.
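The inventory step described above can be sketched as a small heuristic. The tier names and the `Component` record are assumptions for illustration; only the two criteria (persistent information, local vs. shared execution) and the default-to-upper-tier rule come from the text:

```python
from dataclasses import dataclass

# Illustrative tier names, ordered from upper to lower.
TIERS = ["presentation", "business", "persistence"]

@dataclass
class Component:
    name: str
    uses_persistent_data: bool
    shared_execution: bool  # False if executed locally, True if shared

def assign_tier(component: Component) -> str:
    """Map a deployed legacy component to a basic architectural tier."""
    candidates = []
    if component.uses_persistent_data:
        candidates.append("persistence")
    if component.shared_execution:
        candidates.append("business")
    if not candidates:
        return "presentation"
    # When a component cannot be fitted to a single tier,
    # assign it by default to the upper one.
    return min(candidates, key=TIERS.index)

print(assign_tier(Component("order_db", True, False)))   # persistence
print(assign_tier(Component("order_svc", True, True)))   # business
```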

Model Driven Legacy Refactoring

The next step will be to consider components at code level in order to extract design backbones (PSMs) and try to map structures to architecture patterns like Boundary/Control/Entity (akin to Model/View/Controller). At that stage a repository of attributes and operations should be established and mapped to features and functionalities as documented by business requirements (CIMs).
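Such a repository can be sketched as a simple trace table from business features (CIM side) to extracted operations (PSM side). All names below are hypothetical examples, not from the original text:

```python
from collections import defaultdict

# Repository of operations extracted from legacy code, keyed by the
# business feature they are traced to.
repository = defaultdict(list)

def register(feature: str, component: str, operation: str) -> None:
    repository[feature].append((component, operation))

register("order entry", "OrderScreen", "submitOrder")
register("order entry", "OrderService", "validateOrder")
register("invoicing", "BillingBatch", "issueInvoice")

# Documented features with no traced operation reveal gaps between
# the legacy code and the business requirements.
documented_features = {"order entry", "invoicing", "shipping"}
gaps = documented_features - set(repository)
print(sorted(gaps))  # ['shipping']
```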

Finally, the objective is to match the outcome to the functional architecture as defined by platform independent models (PIMs).
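That final match can be pictured as a coverage check: operations recovered from the legacy PSMs are compared against what the PIM's functional services expect. Service and operation names are again illustrative assumptions:

```python
# Hypothetical PIM: platform independent services and the operations
# they are expected to provide.
pim_services = {
    "OrderManagement": {"submitOrder", "validateOrder"},
    "Billing": {"issueInvoice", "refund"},
}

# Operations actually recovered from the legacy design backbones (PSMs).
recovered = {"submitOrder", "validateOrder", "issueInvoice", "legacyHack"}

# Expected operations missing from the legacy code, per service.
for service, expected in pim_services.items():
    print(service, "missing:", sorted(expected - recovered))

# Legacy operations with no counterpart in the functional architecture.
unmapped = recovered - set().union(*pim_services.values())
print("unmapped legacy operations:", sorted(unmapped))
# unmapped legacy operations: ['legacyHack']
```

Both directions matter: missing operations flag requirements the legacy never implemented, and unmapped ones flag code whose purpose still needs to be retrieved.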
