Promoting DQ Tools for Data Migration Projects


One of our members recently had a tough time convincing her peers and sponsors of the need for a DQ tool on their forthcoming data migration project.

She had been pitching primarily the benefits of cleansing, citing the fact that their migration tool had limited DQ capabilities, so an additional toolset was required to manage the more complex transformations.

The source systems largely held consumer addresses, the business being B2C in the financial sector. The argument she came up against was that they already had a profiling tool, so why couldn't that solve the problem?

Here are some of the additional areas where DQ tools can help increase the success rate of any data migration project.

Cost and Scope Analysis

If you work in the data migration industry, you are no doubt aware that most migrations are poorly forecast in both cost and duration.

In one case I witnessed an organisation using function points as a measure of effort and cost. This was without even looking at the data! Such practices are common where there is a poor understanding of the entire migration process.

Having a DQ tool that provides data discovery, profiling and cleansing early in the project means you can perform some essential activities:

  • Reality check on the actual number of business objects versus the count of variations in the database. For example, one business thought it would need to build 7,200 equipment classifications; de-duplication and cleansing discovered it actually needed only 750, a 10x reduction in analysis and development (see the sketch after this list)

  • Better metrics for load volumes. Understanding exactly how many objects you need to load is critical for determining your go-live strategy and window of opportunity for migration, and this needs to be understood at the very outset of the project

  • Improved resource allocation. By understanding which types of data defect will require automated, project-team or business-user intervention, you have a much clearer idea of when each needs to be deployed, which is critical for giving the business prior warning
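To make the first point concrete, here is a minimal Python sketch of that reality check: counting canonical business objects versus raw variations. The values and normalisation rules are hypothetical, and a real DQ tool applies far richer matching and de-duplication logic.

```python
import re

# Hypothetical raw values pulled from a legacy classification column.
raw_classifications = [
    "Pump - Centrifugal",
    "PUMP-CENTRIFUGAL",
    "pump centrifugal ",
    "Valve, Gate",
    "VALVE GATE",
]

def canonical(value: str) -> str:
    """Crude canonical form: lower-case, strip punctuation, collapse spaces."""
    value = re.sub(r"[^a-z0-9 ]", " ", value.lower())
    return re.sub(r"\s+", " ", value).strip()

distinct_raw = len(set(raw_classifications))
distinct_canonical = len({canonical(v) for v in raw_classifications})

print(f"Raw variations:    {distinct_raw}")        # 5
print(f"Canonical objects: {distinct_canonical}")  # 2
```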

Migration Simulation

One of the most useful techniques I have practised in recent years is to perform a “virtual migration” of the legacy data to the target environment. This approach effectively de-risks the migration well in advance of committing to a major spend.

The key to simulating the migration is to understand your key business building blocks, those business objects that are pivotal to success. Customer, location, order, equipment – define these in a simple conceptual and logical model first. Then define their physical relationships using data discovery, which underpins the entire migration process. Using our DQ platform we can then simulate the likely issues when consolidating disparate data in our legacy world and performing basic transformations before loading into a dummy target environment.
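As an illustration, here is a minimal sketch of a virtual migration pass, assuming two hypothetical legacy sources and a dummy target schema with simple mandatory-field constraints. A real simulation would run the actual transformation rules against a staging copy of the target.

```python
# Two hypothetical legacy sources feeding one target customer table.
legacy_crm = [
    {"cust_id": "C1", "postcode": "SW1A 1AA"},
    {"cust_id": "C2", "postcode": None},
]
legacy_billing = [{"cust_id": "C1", "postcode": "SW1A1AA"}]

def transform(record: dict) -> dict:
    """Basic transformation: rename the key field, normalise the postcode."""
    postcode = (record.get("postcode") or "").replace(" ", "").upper()
    return {"customer_id": record["cust_id"], "postcode": postcode}

def target_errors(record: dict) -> list:
    """Dummy target constraints: both fields are mandatory."""
    return [f"missing {field}" for field in ("customer_id", "postcode")
            if not record[field]]

loadable, fallout = [], []
# Consolidation of the duplicate C1 records is left out for brevity.
for source_record in legacy_crm + legacy_billing:
    candidate = transform(source_record)
    errors = target_errors(candidate)
    (fallout if errors else loadable).append((candidate, errors))

print(f"Would load: {len(loadable)}, fallout: {len(fallout)}")  # 2 and 1
```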

Using this simple technique prevented one organisation from moving ahead with a migration that was simply not viable under traditional approaches; it is therefore an excellent intelligence-gathering exercise.

Data Quality Rules Management

Regular readers will have seen the many discussions and tutorials we have published on managing data quality, so it goes without saying that a basic data profiler is insufficient to manage the full spectrum of data quality rules. You will require additional cleansing and transformation capabilities, plus a means of managing a library of possibly hundreds of different data quality rules as they are developed.
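As a rough illustration of what managing a library of rules can mean in practice, here is a minimal sketch that treats each rule as a named predicate with a severity. The rule names and record fields are hypothetical; a commercial DQ platform adds versioning, ownership and reporting on top.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class DQRule:
    name: str
    check: Callable[[dict], bool]  # returns True when the record passes
    severity: str                  # e.g. "block-load" or "warn"

# A tiny library; real projects accumulate hundreds of these.
RULES = [
    DQRule("postcode-present", lambda r: bool(r.get("postcode")), "block-load"),
    DQRule("email-has-at-sign", lambda r: "@" in r.get("email", ""), "warn"),
]

def run_rules(record: dict) -> list:
    """Return (rule name, severity) for every rule the record fails."""
    return [(rule.name, rule.severity) for rule in RULES if not rule.check(record)]

print(run_rules({"postcode": "", "email": "a@b.com"}))
# [('postcode-present', 'block-load')]
```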

Migration Execution Sequencing

How do you know which data will be eligible for migration? DQ tools provide a vital cog in the intelligence-gathering machinery because they can report on the health of data prior to migration. The alternative is to run the migration and handle the fallout of records in error, which is not advisable; instead, use your DQ tools to report on data that needs to be postponed or handled as a special case.
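Here is a minimal sketch of that eligibility report, building on the rule-library idea above: any record failing a blocking rule is flagged for postponement or special handling before the live run. All record ids, rule names and severities are hypothetical.

```python
def classify(failures: list) -> str:
    """Bucket a record by its rule failures ahead of the live run."""
    if any(severity == "block-load" for _, severity in failures):
        return "postpone"  # remediate or handle as a special case first
    if failures:
        return "migrate-with-warnings"
    return "migrate"

# Hypothetical records paired with the failures a rule run reported.
batch = [
    ("C1", []),
    ("C2", [("postcode-present", "block-load")]),
    ("C3", [("email-has-at-sign", "warn")]),
]
for record_id, failures in batch:
    print(record_id, "->", classify(failures))
# C1 -> migrate, C2 -> postpone, C3 -> migrate-with-warnings
```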

Independent Data Migration Validation

How do you know whether your data migration has successfully moved data across to the target environment? A DQ tool is a great way to independently verify that what was migrated correlates correctly with the legacy environment.
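One simple way to perform this independent check, sketched below under the assumption that both environments can be extracted into keyed record sets, is to compare record counts plus a per-record fingerprint of the key fields; this catches both dropped records and silently altered values.

```python
import hashlib

def fingerprint(record: dict, keys: list) -> str:
    """Stable digest of the fields that must survive migration intact."""
    payload = "|".join(str(record.get(k, "")) for k in keys)
    return hashlib.sha256(payload.encode()).hexdigest()

# Hypothetical extracts keyed by customer id.
legacy = [{"id": 1, "name": "Acme"}, {"id": 2, "name": "Bravo"}]
target = [{"id": 1, "name": "Acme"}, {"id": 2, "name": "BRAVO"}]

keys = ["id", "name"]
legacy_fp = {r["id"]: fingerprint(r, keys) for r in legacy}
target_fp = {r["id"]: fingerprint(r, keys) for r in target}

missing = sorted(legacy_fp.keys() - target_fp.keys())
altered = sorted(i for i in legacy_fp.keys() & target_fp.keys()
                 if legacy_fp[i] != target_fp[i])
print(f"Missing in target: {missing}, altered: {altered}")
# Missing in target: [], altered: [2]
```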

Ongoing Data Quality Assurance

At this point you should have a vast armoury of data quality rules and metrics that were created during the project to help you validate both the legacy and target environments. Why not benefit from this resource and launch an ongoing data quality assurance programme?

There will probably never be another time when such detailed knowledge about the application, database, data and functionality is so freely available.
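As a final hypothetical sketch, the same rules can simply be re-run on a schedule after go-live, turning the migration-era library into trend metrics for the ongoing programme.

```python
from datetime import date

def postcode_present(record: dict) -> bool:
    return bool(record.get("postcode"))

def pass_rate(records: list, check) -> float:
    return sum(1 for r in records if check(r)) / len(records)

# Hypothetical snapshot; in production this runs from a scheduler
# and the result is written to a metrics store for trending.
customers = [{"postcode": "SW1A 1AA"}, {"postcode": ""}]
print(f"{date.today()} postcode-present pass rate: "
      f"{pass_rate(customers, postcode_present):.0%}")  # 50%
```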
