Are your data migration sponsors creating another data quality Halloween story?

If you are the project sponsor of a data migration then you're aware that to create a successful project (i.e. the data migration doesn't fail and the target system receives operational data) you need good quality data.

This doesn’t magically happen of course. It requires a data quality management strategy of people, processes and technology.

You've got one of those, right? No? Well, let's see what you need and why it's critical to factor data quality management into your data migration project, otherwise you'll find yourself with a new Halloween horror story on your hands.

Cue scary music…

Data Quality Skilled Practitioners

People are required to implement frameworks and best practices. You need skilled practitioners to steer the ship and also to help with the daily uplift: getting the legacy data (that has been neglected) up to a quality level that will get it across to the target. In turn, this will enable the new target functions to operate smoothly. Practically every part of the project team, and certainly the business, will have to be involved in this process at some stage.

Data Quality Processes Ensure a Smoother Data Migration Process

Data Quality processes ensure that everyone works to a system that generates repeatable results. A predefined set of operating procedures is essential for ensuring you're not left with a data cleansing free-for-all, or total chaos at run-time when every load keeps failing.

Data Quality Tools Prevent Time Wasting and Amateur Coding Hacks

Data quality technology ensures that you're not wasting time coding scripts and hacks that will no doubt be thrown away at the end of the project and that deliver far less capability than a dedicated data quality tool. That's if you've even considered the need for some kind of data quality technology.

Let’s Break Down Your “We’re Not Doing Data Quality Management” Stance

As project sponsor you're probably aware that poor data quality could rear its ugly head, but for a variety of reasons (mostly fear and lack of awareness) you've decided not to entertain the need for data quality management.

By ignoring the need for the correct data quality people, processes and technology you are effectively saying:

  • I am sanctioning the risk of loading data into a target system that will fail the requirements of the transformation process
  • I am sanctioning the risk of loading data into a target system that will fail the requirements of the target business functions
  • I am sanctioning the risk of introducing extended project timelines, additional costs, reduced functionality and user dissatisfaction

Why not print out those 3 statements and put your signature under each one of them? How does that make you feel? Nervous?

When you dig deeper into the reasons for rejecting data quality management during a data migration, it invariably uncovers fear of some kind.

Project leaders and sponsors are typically fearful of:

  • Extra costs
  • Project delays
  • Pull on resources
  • Embarrassment (e.g. what data quality defects will be uncovered?)

The problem is that these are exactly the results you’ll experience if you choose to ignore data quality management on your next project. It’s a given. I’ve seen it many, many times before and if you speak to any other senior data migration practitioner they’ll recount horror stories galore of failed projects where data quality was sidelined.

Research from leading analysts has also shown time and again that if you ignore data quality management your project is far more likely to fail, or to go wrong in a wide array of ways.

Costing Out Your Data Migration Project

So, let’s take just one assumption, that of cost. Many sponsors think that data quality management is an unnecessary cost so let’s look at some outline figures.

Take an example data migration project:

  • 2 x permanent staff
  • 3 x contract specialists
  • 2 x analysts/testers
  • ETL license – £2,000 per week
  • Servers/Infrastructure – £3,000 per week

Daily staffing burn rate: £3,000 per day (£15,000 per week)

Total weekly burn rate (staffing plus ETL licence and infrastructure): £20,000 per week

Projected 50 week project: 50 x £20,000 = £1,000,000

Now, ask any practitioner and they'll tell you they've witnessed projects take at least 50% longer than the forecasted timeframe to complete when data quality management is taken off the project schedule.

(In fact, by ignoring data quality management principles, your original forecast will be complete guesswork anyway!)

So let's put in some example delays that demonstrate how costs can dramatically increase when data quality is ignored (there's a quick calculation sketch after this list):

  • 10% overrun x 50 weeks = 5 week delay = £100,000
  • 20% overrun x 50 weeks = 10 week delay = £200,000
  • 30% overrun x 50 weeks = 15 week delay = £300,000
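
If you want to sanity-check those numbers, here's a minimal Python sketch that reproduces the figures above. The costs and overrun percentages are the illustrative examples from this post, not benchmarks from any real project.

```python
# Back-of-the-envelope cost model using the example figures from this post.
# All numbers are illustrative, not benchmarks from a real project.

DAILY_STAFF_BURN = 3_000          # £ staffing burn per working day
WORKING_DAYS_PER_WEEK = 5
WEEKLY_ETL_LICENCE = 2_000        # £ ETL licence per week
WEEKLY_INFRASTRUCTURE = 3_000     # £ servers/infrastructure per week
PLANNED_WEEKS = 50

weekly_burn = (DAILY_STAFF_BURN * WORKING_DAYS_PER_WEEK
               + WEEKLY_ETL_LICENCE
               + WEEKLY_INFRASTRUCTURE)
baseline_cost = weekly_burn * PLANNED_WEEKS

print(f"Weekly burn rate: £{weekly_burn:,}")                # £20,000
print(f"{PLANNED_WEEKS}-week project: £{baseline_cost:,}")  # £1,000,000

for overrun in (0.10, 0.20, 0.30):
    delay_weeks = PLANNED_WEEKS * overrun
    extra_cost = delay_weeks * weekly_burn
    print(f"{overrun:.0%} overrun = {delay_weeks:g} extra weeks = £{extra_cost:,.0f}")
```

Running it reproduces the £20,000 weekly burn, the £1,000,000 baseline and the three overrun costs above, and it takes seconds to re-run with your own figures.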

However, we're missing something else here: data quality management doesn't just prevent your project running over, it actually reduces the project timeframe, so the cost savings are even greater.

Data quality software can now be leased, and you won't require a full data quality team for the duration of the project, so the benefits far outweigh the weekly burn costs of adopting a data quality management strategy.

Data Migration Strategy – The Phase Imbalance

Most data migration projects operate to an imbalanced plan: a short analysis and design stage, then a much larger build, test, execute and reconcile stage. This is a flawed approach.

It is far more effective to place much more effort on the analysis and design stage (if you have the skills and methodology to get this right).

There is a tendency for project leaders to “get the project moving” as soon as possible, so the early analysis phases are often shortened because of a perceived lack of progress. I've also seen project teams hellbent on getting stuck into their ETL tool of choice because it makes mapping and designing transformations so easy. The problem is that you're just pushing the “data quality dilemma” further downstream.

Data Quality Management is critical because it feeds into every phase of the project and as a result can even reduce the original forecast because:

  • Your design is based on a deeper understanding of the legacy data
  • You’re migrating data that has already been tested (by the data quality processes)
  • Your load processes don't fail, because data quality management dictates a zero-defect-at-load policy (if you've followed best practice; see the sketch after this list)
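
To give a flavour of what a zero-defect-at-load gate looks like in practice, here's a minimal Python sketch. The validation rules, field names and record layout are hypothetical; a real project would drive these from its data quality tool and the target system's specifications.

```python
# Minimal sketch of a "zero defects at load" gate.
# The rules and field names below are hypothetical examples only.

from typing import Callable

# Each rule returns True when a record passes the check.
RULES: dict[str, Callable[[dict], bool]] = {
    "customer_id present": lambda r: bool(r.get("customer_id")),
    "postcode populated": lambda r: bool(str(r.get("postcode", "")).strip()),
    "valid status code": lambda r: r.get("status") in {"ACTIVE", "DORMANT", "CLOSED"},
}


def find_defects(batch: list[dict]) -> list[str]:
    """Describe every rule failure found in the batch."""
    return [
        f"record {i}: failed '{name}'"
        for i, record in enumerate(batch)
        for name, rule in RULES.items()
        if not rule(record)
    ]


def load_gate(batch: list[dict]) -> None:
    """Refuse to load the batch unless it is completely defect-free."""
    defects = find_defects(batch)
    if defects:
        raise ValueError("Load rejected:\n" + "\n".join(defects))
    print(f"Batch of {len(batch)} records cleared for load.")


if __name__ == "__main__":
    try:
        load_gate([
            {"customer_id": "C001", "postcode": "SW1A 1AA", "status": "ACTIVE"},
            {"customer_id": "", "postcode": "M1 2AB", "status": "ACTIVE"},  # missing ID
        ])
    except ValueError as exc:
        print(exc)
```

The detail isn't the point; the discipline is. The batch either passes clean or it doesn't load, which is exactly what a proper data quality tool and process give you at run-time instead of failed loads and firefighting.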

A Parting Shot at the ‘In Denial’ Project Sponsor

If you implement data quality management (people, processes and technology) correctly you get a double whammy: a shorter, better risk-managed and more cost-effective project, and a far higher quality result in the target system, which is obviously why we're doing all this, right? Data migration is not some simple act of shunting data; it's about transforming the business, and if your data is screwed then, well, there's no business transformation.

So the next time the prevailing thinking is to ignore data quality management on a data migration, be sure to understand the emotional fears and test the reasoning with a simple business case.

If your sponsors are still not convinced, contact us and I’ll gladly share some data quality horror stories that should hopefully scare them into action.

Halloween is just around the corner, so get ready.
