How to Prevent ‘Data Migration Target Procrastination Syndrome’


There are lots of reasons why data migration projects fail to come in on time and under budget. Technology, skills, methodology - they’re all common failure points.

But, as ever, it’s often the simple decisions based on flawed assumptions that wreak the most havoc.

When I discussed the 6 Classic Data Migration Mistakes Well Worth Avoiding, one of the failures that created pushback from the community was ‘#4: Waiting (Endlessly) for the Target System to be Ready’.

It is easy to assume that the target system must be in place before the migration can really get going, so people often voice this opinion:

“We’re moving data right? So, we need a destination to put the damn stuff. Let’s just wait until everything is firmed up and then we can start doing the mapping and all that cool ETL drag-and-drop trickery. If we get started too soon then we’ll just have to scrap a ton of work and blow out our budget.”

The problem is that they are taking a target-centric viewpoint, framing the migration from the perspective of the target API and schema structures, which are often way behind in development.

But here’s the thing…

You never have enough time on a data migration. Ever.

No matter how much time and money you think you have, forget it. You need to crack on at the earliest possible opportunity, and that means launching ahead even if you don’t yet know what your target system looks like.

How can you do this?

The approach I follow in this situation is to focus not on the data but on the business functions. If you’re running a grocery provisioning department now then you can bet when you move your data to the target system you’ll still be provisioning groceries. You won’t suddenly be sending your trucks to pick up passengers between stores. You won’t suddenly need to build a stock control system that handles car parts.

Your business functions will not change dramatically.

Sure, you may be collapsing legacy systems into a more modern target system that features greater automation or GPS vehicle tracking, but your team will still be provisioning vegetables between depots and stores.

From there you’ll begin to create your logical data model and, if you want to go that far, your physical data model.
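To make that concrete, here’s a minimal sketch of what a first-cut logical model might look like for the grocery provisioning example. Every entity and attribute name here is hypothetical; the point is that you can capture this structure long before any target schema exists.

```python
from dataclasses import dataclass
from datetime import date

# First-cut logical model for a hypothetical grocery provisioning domain:
# entities and relationships only - no target schema required yet.

@dataclass
class Depot:
    depot_id: str
    region: str

@dataclass
class Store:
    store_id: str
    served_by_depot: str  # relationship: Store -> Depot

@dataclass
class Product:
    sku: str
    description: str

@dataclass
class ProvisioningOrder:
    order_id: str
    store_id: str    # relationship: Order -> Store
    sku: str         # relationship: Order -> Product
    quantity: int
    delivery_date: date

# The model can be exercised with sample data straight away.
order = ProvisioningOrder("ORD-1", "STR-42", "SKU-CARROTS", 120, date(2025, 6, 1))
print(order)
```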

As you do this you’ll start to uncover complexities and challenges in your legacy landscape that will need to be resolved no matter what target system you migrate to.

For example, I know of one utilities firm which had about twenty-seven legacy systems to migrate. Three of the systems were critical, but they lacked any kind of relationship key to bring them together. As a result, it was clear there were major challenges to solve before migrating to any target system. It didn’t matter whether the target was SAP, Microsoft, Oracle or a filing cabinet - if you want to create a single system from many systems, you need to create a single entity view.
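To illustrate the kind of work that implies, here’s a rough sketch of matching records across systems that share no key. The records, field names and normalisation rules are invented for illustration; real entity resolution is considerably more involved.

```python
from difflib import SequenceMatcher

# Hypothetical records for what may be the same customer, held in three
# legacy systems that share no relationship key.
billing  = {"name": "J. Smith",    "address": "12 High Street, Leeds"}
metering = {"name": "John Smith",  "address": "12 High St, Leeds"}
assets   = {"name": "SMITH, JOHN", "address": "12 HIGH ST LEEDS"}

def normalise(record):
    """Crudely normalise name + address so near-duplicates compare well."""
    text = (record["name"] + " " + record["address"]).lower()
    for noise in ["street", "st", ",", "."]:  # replace longer tokens first
        text = text.replace(noise, " ")
    return " ".join(sorted(text.split()))

def similarity(a, b):
    """Return a 0.0-1.0 similarity score between two normalised records."""
    return SequenceMatcher(None, normalise(a), normalise(b)).ratio()

# A score above an agreed threshold earns the records one shared master
# key - the single entity view any target system will need.
print(similarity(billing, metering))
print(similarity(billing, assets))
```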

Target System Function Modelling

When you have a firm handle on the legacy business functions and models, you can start to brainstorm what the target functions will look like. Of course, the target system implementation team will be doing a similar activity, so you can simply sit in on their sessions. Or, if they’re buying a Commercial Off-The-Shelf (COTS) solution and skipping much of this functional analysis, your functional discovery will depend on what the new system can deliver and the workshop sessions you hold with the business.

Data Quality Assessment and Discovery

Once you have the business functions and logical models for source and target mapped out, you can start to analyse the models for obvious conflicts. You will find missing entities, for example, or relationships in the target that just can’t be supported. This kind of modelling gap analysis is why methodologies such as Practical Data Migration (PDMv2) are so successful - they force you to find major obstacles before you have even analysed the data.
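A first pass at this gap analysis can be surprisingly mechanical - compare the two models entity by entity and relationship by relationship. The entity names in this sketch are illustrative, not drawn from any real model.

```python
# Both models expressed as entity -> set-of-related-entities maps.
source_model = {
    "Customer":   {"Account", "MeterPoint"},
    "Account":    {"Tariff"},
    "MeterPoint": set(),
    "Tariff":     set(),
}
target_model = {
    "Customer":    {"Account"},
    "Account":     {"Tariff", "PaymentPlan"},
    "Tariff":      set(),
    "PaymentPlan": set(),
}

# Entities present on one side but missing on the other.
print("Source entities with no target home:", set(source_model) - set(target_model))
print("Target entities needing new data:", set(target_model) - set(source_model))

# Relationships the target cannot support as currently modelled.
for entity in set(source_model) & set(target_model):
    unsupported = source_model[entity] - target_model[entity]
    if unsupported:
        print(f"{entity}: target cannot support relationship(s) to {unsupported}")
```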

In tandem with this modelling and function gap analysis, I personally like to get started on some really intensive data quality assessment work. For example, I prefer to run a ‘Pre-Migration Impact Assessment’ even before the migration itself has started.
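Here’s a minimal sketch of the kind of early profiling pass I mean: completeness counts and duplicate checks over a legacy extract. The field names and rows are hypothetical.

```python
# A tiny profiling pass over a (hypothetical) legacy extract.
rows = [
    {"customer_id": "C001", "postcode": "LS1 4DT", "meter_id": "M9"},
    {"customer_id": "C002", "postcode": None,      "meter_id": "M9"},
    {"customer_id": None,   "postcode": "LS2 9JT", "meter_id": None},
]

def profile(rows):
    """Report completeness per field, then flag duplicate meter_id values."""
    for field in rows[0]:
        filled = sum(1 for r in rows if r[field] is not None)
        print(f"{field}: {filled}/{len(rows)} populated")
    seen, dupes = set(), set()
    for r in rows:
        if r["meter_id"] in seen:
            dupes.add(r["meter_id"])
        seen.add(r["meter_id"])
    print("duplicate meter_id values:", dupes - {None})

profile(rows)
```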

Function Modelling and Data Quality as a Core Data Migration Activity

The reason it is so useful to combine the modelling and data quality assessment activities is that each feeds vital intelligence into the other. For example, your modelling activity will hint at possible relationship obstacles which can be verified by the data quality assessment. Likewise, you may have a source-to-target modelling relationship that looks valid, only to find that the underlying data quality just won’t support it.
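That second case is easy to check mechanically. In this sketch, the target model assumes each customer holds exactly one account, and a quick pass over invented legacy key pairs shows whether the data actually supports that 1:1 relationship.

```python
from collections import Counter

# (account_id, customer_id) pairs from a hypothetical legacy extract.
account_owner = [
    ("A1", "C001"), ("A2", "C001"), ("A3", "C002"),
]

# Count accounts per customer; more than one breaks the assumed 1:1 rule.
accounts_per_customer = Counter(cust for _, cust in account_owner)
violations = {c: n for c, n in accounts_per_customer.items() if n > 1}
print("customers breaking the assumed 1:1 rule:", violations)
```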

When you have completed these combined modelling and data quality assessment activities, you will have amassed a huge amount of information that can feed directly into the design and build phases of the project. Not only are you finding issues earlier, you are also creating a tangible asset that reduces the overall lead time of the project.

Function Modelling as a Design Driver

When the target finally gets firmed up and specified, you will already have discovered and resolved many of the key obstacles that would otherwise have been uncovered much later down the line.

What’s even more beneficial is that this early-start approach means your target architecture can be influenced by your analysis, enabling the target system to adapt to the issues, structures and functional requirements that you found.
