Migrating to SAP SuccessFactors: Practical Advice for a Successful Data Migration

There has been an increase in data migration projects to cloud target applications in recent years.

So we spoke with seasoned data migration expert Miles Davies of DominusData to find out more.

Miles has been heavily involved in cloud-based migrations in recent years, particularly with SAP SuccessFactors.

Here is a summary of his SAP SuccessFactors ‘Top 10 Tips’ and a further in-depth discussion on the topic of data migration in general.

Dylan Jones: I’m sure many of our readers are new to SuccessFactors, so what would be your ‘Top 10’ pieces of advice based on your experience?

Miles Davies: Sure, we can get into a broader discussion in a moment, but here are ten to start with:

  1. Avoid an expensive ETL framework – it is not necessary. Use readily accessible and scalable technology with a wide resource pool of experts to leverage.

  2. If you plan to use two resources, you will probably need four. Don’t underestimate the complexity and nuance in the SAP SuccessFactors system when it comes to Data Migration.

  3. You should have people experienced in On-premise HCM to SAP SuccessFactors migrations or, at the very least, On-premise to Cloud (via template) migrations. They will also need strong development skills in your ETL Framework. These people do not grow on trees, so if you need to go to the sub-contractor market, I would advise doing it soon.

  4. Have a Data Migration Lead who can engage at both a business and technical level. They will be the fulcrum of a successful Migration strategy. Ideally, they should have experience of SAP SuccessFactors and have successfully completed at least one global deployment from a Data Migration perspective.

  5. Test, Test and Test some more. If your Data Migration team hasn’t asked for their own SAP SuccessFactors environment, I would strongly recommend providing them with one.

  6. If someone asks of Data Migration, “How hard can it be?”, you will need to invest the time to show them.

  7. With regards to Data Cleansing, if a client says “My data is fine”, you can be certain that it isn’t. In this scenario, an initial piece of analysis is advisable to provide evidence of why it isn’t.

  8. Don’t undertake any Data Migration work if there is no time and budget allowed for Data Discovery.

  9. Team Leadership is critical. Data Migration is never 24/7/365; there are always lulls in activity. It is essential to keep the team motivated and tight. A poor leader is a huge risk in every measurable way.

  10. Data Migration is a thankless task. Get it right, and no-one recognises it; get it wrong, and you will be hung out to dry. Avoid parachuting in late and trying to fix things – get in early and go the distance.

Dylan Jones: What is your background, Miles? How did you get involved with SAP Data Migration projects?

Miles Davies: I spent many years in a variety of roles around the SAP ecosystem, but around 2005 I developed a real passion for Data, in particular, the cleansing and governance of Master Data. This evolved into what was, at the time, the new approach of Master Data Management, and specifically SAP MDM and the application of this solution into the Large Enterprise landscape.

Moving forward, the business world has woken up to the concept of ‘Data’ but is absorbed by the bright-light conversations around Big Data, Predictive Analytics, Data Lakes, etc. This is all very important and interesting, but fundamentally all this ‘shiny’ stuff is supported by the ‘oily’ Master and Transactional data being in the right system at the right time.

With the increasing adoption of Enterprise Cloud solutions and the move from on-premise, a successful Data Migration is essential for the immediate and longer-term Data-related objectives of a client.

However, it is still the ‘ugly duckling’ that no-one wants to do, so I made the conscious decision to focus on this a few years ago, and the phone hasn’t stopped ringing since!

Dylan Jones: Getting back to the list you created. Let’s start with the first tip you mention which is: ‘Avoid an expensive ETL framework’. Can you expand on this because ETL software tools are obviously widely used on migrations and can certainly help to increase productivity based on my experience?

Miles Davies: That’s a great question and observation, and probably what I should have said is “don’t buy an expensive ETL toolset just to do your Data Migration”.

By their very nature, Data Migrations are transient.

You Extract, Transform, and Load, and in simple terms that is ‘job done, move on’. In these situations, there is no onward value in buying what can be expensive platforms if you plan to retire the framework post-Migration.

However, if you plan to use the extended capabilities of some of these toolsets for ongoing Data Governance, Data Maintenance, Archiving, etc. and the solution forms part of a strategic architecture then, of course, you should go ahead and use it from end-to-end.

I will always recommend the right platform approach for a client based on their immediate and longer term requirements.

Dylan Jones: I see, and I do know that the message of capitalising on all the effort made to ‘lock in’ the data quality improvements you made to the legacy data is starting to get through. It’s frankly crazy to waste all that energy improving the data only to watch it degrade again.

Miles Davies: Absolutely, and it is the trap that companies fall into time and time again. There is an argument to say that a one-time data cleanse is a pointless and ultimately expensive activity. You need to build robust governance rules for the cleansing, managing, creation and deletion of Data – and stick to them.

Dylan Jones: Onto your other points and I agree with your second tip about staffing. What is the typical makeup of skills that you have on a data migration? For example, how did you staff your last project – what roles and number of staff did you have?

Miles Davies: The last project I did was, at the time, the largest implementation of SAP SuccessFactors in the UK (in terms of employees).

It was a complex programme with a number of firsts including:

  • First Enterprise Cloud solution

  • First deployment of a single solution across multiple business units

So this was a high profile Programme within the company, and it was important to get it right and get it delivered on time.

We used a SQL ETL framework and the team that completed the Programme consisted of:

  • Overall Data Migration Lead (which was my role)

  • Data Architect

  • 4x Data Analysts/Developers

  • Business Engagement Lead

The Business Engagement Lead role is unusual, but because this was the first deployment of a single solution across multiple business units, we spent a large amount of time on stakeholder management, data validation, update workshops and ‘room visits’.

The skills make-up was a mixture: a very deep understanding of SQL and associated components, Data modelling and architecting (including Conceptual, Logical and Physical models), plus the softer skills to help people understand the Data Migration and Validation process.

Dylan Jones: Great to see you using a Business Engagement Lead; it is traditionally very difficult to get the business involved, so I can see the advantage there. What was their role outside of the project – what was their ‘day job’, if you like?

Miles Davies: In this particular instance it was a specifically created role, and we brought someone in from outside the organisation. This was to allow different thinking and experience from elsewhere to be utilised in the communication planning and execution.

It is not always necessary to hire from outside, but on this Programme it proved to be the better approach.

Dylan Jones: Your point about testing is well made, so is there anything specific to a SuccessFactors migration that the client should be testing for?

Miles Davies: The great thing about SAP SuccessFactors, or any other SAP Cloud solution that I have worked with, is that you can have multiple environments or tenants – particularly if you are a Large Enterprise with a Private Cloud. It is vital to have a Data Migration test environment with maintained configuration to act as your testing sandbox.

As with all Migrations, it is key to understand the inter-relationships of the Data Objects, i.e. you can’t load this until you have loaded that.

Referential integrity is everything, and with SAP SuccessFactors you have Foundation Objects and Worker Objects – in many respects, these could be loosely translated as Master and Transactional Data for those who work in Finance or Customer Data.

Without the Foundation Objects correctly loaded, you will get errors in the Worker Objects, but it is not linear, i.e. four errors in Foundation Object X does not equal four errors in Worker Object Y.

This is why the Referential Integrity checks are essential.
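To make that dependency concrete, here is a minimal sketch in Python of a pre-load referential integrity check. The object and field names (departments, locations, the Worker record layout) are hypothetical, not actual SuccessFactors templates, but the principle holds: validate every Foundation Object reference in the Worker data before attempting the load.

```python
# Minimal pre-load referential integrity check (illustrative only).
# Object and field names are hypothetical, not real SuccessFactors templates.

# Foundation Objects already loaded into the target environment
foundation_departments = {"DEP-001", "DEP-002", "DEP-003"}
foundation_locations = {"LOC-LON", "LOC-MAN"}

# Worker records waiting to be loaded, each referencing Foundation Objects
worker_records = [
    {"employee_id": "E1001", "department": "DEP-001", "location": "LOC-LON"},
    {"employee_id": "E1002", "department": "DEP-009", "location": "LOC-LON"},
    {"employee_id": "E1003", "department": "DEP-002", "location": "LOC-NYC"},
]

def check_referential_integrity(records, departments, locations):
    """Split records into those safe to load and those with broken references."""
    clean, errors = [], []
    for rec in records:
        problems = []
        if rec["department"] not in departments:
            problems.append(f"unknown department {rec['department']}")
        if rec["location"] not in locations:
            problems.append(f"unknown location {rec['location']}")
        if problems:
            errors.append((rec["employee_id"], problems))
        else:
            clean.append(rec)
    return clean, errors

clean, errors = check_referential_integrity(
    worker_records, foundation_departments, foundation_locations
)
for employee_id, problems in errors:
    print(f"{employee_id}: {'; '.join(problems)}")
# E1002: unknown department DEP-009
# E1003: unknown location LOC-NYC
```

Running a check like this before every load turns “errors in Foundation Object X” into a traceable list of affected Worker records, rather than a surprise at load time.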

Dylan Jones: Were there any additional testing strategies adopted that you recommend?

Miles Davies: As we went through our Mock Conversion and Rehearsal cycles we naturally developed an understanding of how the Data worked in its entirety, but we also adopted an ad-hoc, almost constant cycle of testing individual Objects (where possible) to further refine and remove the errors we were getting.

As I mentioned before, this flexible and agile approach to Data Migration testing did de-risk the actual Cutover Migration cycle, and across all of the Objects we loaded for go-live we achieved a success rate of over 99%.

It is my firm view that the ‘test, test and test some more’ approach was the catalyst of that success – along with a stable and extremely capable team.

Dylan Jones: That’s a great result. Still on this topic of testing, how did the user or business community ensure that they were getting what was required of the data – how did you structure the User Acceptance Testing and subsequent sign-off for example?

Miles Davies: That’s a really good question. Obviously, prior to the Data Migration even starting, there were extensive reviews and workshops to ensure that the system configuration and attendant processes would work for the client, especially as there was to be a far greater degree of ESS (Employee Self-Service) and MSS (Manager Self-Service) due to the richness of the functionality and ease of use that SAP SuccessFactors delivers.

As part of the Mock Conversion process we worked closely with the business community who, due to different levels of understanding around data, were offered a combination of Load reports provided by the Data Migration Team and in-system access to our Test environment.

Once UAT was entered, we were heavily involved in the test results and fixes, especially where they were data-related, which also helped the process along in terms of sign-off. This was helped by the fact that, for a period of time, I was leading both the Data Migration and UAT Teams for the Programme.

To avoid having to reload data into the clean Production environment prior to go-live (which is not a straightforward delete-and-replace in SAP SuccessFactors), we actually used a two-stage validation approach. The first validation was after the Full Dress Rehearsal, where the approach was essentially “this is what you are going to get”, which enabled us to make any changes if required. The final validation was post-load into the Production environment, which was a case of “and this is what you have got”.

This may seem quite brutal in terms of message, but we clearly articulated this approach in our early planning and communications and this was accepted.

We also stuck to our message and did it the way we said we would.

Dylan Jones: For one tip you say: ‘If someone asks the question of Data Migration “How hard can it be?” you will need to invest the time to show them’. Can you think of an instance where someone has misunderstood the difficulty of a data migration? What was the education process you adopted?

Miles Davies: It’s hard to think of a specific instance because it happens all the time!

A lot of times this happens because people assume we just ‘lift and shift’ data from one place to the other with little or no interaction in between. I have never experienced this actually happening!

On my current project, we do ‘Lunch ‘n Learn’ sessions where Leads and Heads will talk through their specific area and what they have to do and how to go about it.

As you probably know, when it comes to data migration it is very easy to drop down into the technical weeds very quickly and that will usually cause people’s eyes to glaze over – most people find data, never mind data migration, unbelievably boring.

For my session, I tried to explain in simple layman’s terms what we actually do. I tend to use a metaphor that most people can either understand or have experienced, and that seems to work – the ‘building a house’ metaphor.

Dylan Jones: I like your tip: “Don’t undertake any Data Migration work if there is no time and budget allowed for Data Discovery.” I think clients need to be given this kind of ultimatum to fully understand the severity of proceeding without a deep understanding of their legacy landscape. What does your Data Discovery process look like and what are the deliverables that fall out of it?

Miles Davies: To get to a destination you have to know where you are starting from.

The Data Discovery is an essential element of any Data Migration and should be both technical and non-technical in nature.

You need to understand:

  • What data is required (driven by the configuration and corresponding templates)

  • If it exists (is it migrated as-is, or does it need to be created and loaded?)

  • Where it resides (if it does indeed exist)

  • How that Data is going to be provided (direct access to databases to create our own extracts, or extracts provided by an internal or external third party)

Those are the technical elements in broad terms.

The deliverable of this would be your Master Mapping document, probably persisted in Excel, which forms the Data Migration Playbook.

From a non-technical perspective you need to understand:

  • The data quality issues

  • The expectations around the data to be migrated in terms of how it looks in the new target environment

  • Who your stakeholders are going to be throughout the whole ETL process

This will give you the contextual layer to your Technical Discovery process and can be persisted as notes or additional tabs in the same Master Mapping/Playbook document.

If you update and socialise it regularly (as read-only!), you will encourage ongoing engagement in the Migration process. It also gives you a clear view of what needs to be done and is particularly helpful as a Risk Mitigation tool. For example, if you have churn in your Data Migration team, others can come in and get moving with a limited amount of onboarding.
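As an illustration of what a Playbook entry might carry, here is a minimal sketch in Python, assuming a simple row structure. The column names, field references (including the legacy SAP HCM examples) and owners are all illustrative; in practice this usually lives as rows and tabs in the Excel workbook itself.

```python
# Illustrative shape of a Master Mapping / Playbook entry. The column and
# field names are hypothetical; in practice this lives in an Excel workbook.
from dataclasses import dataclass

@dataclass
class MappingEntry:
    source_system: str      # where the data resides today
    source_field: str       # table/field in the legacy system
    target_object: str      # target Data Object in the new environment
    target_field: str       # field in the load template
    transformation: str     # rule applied in the Transform step
    data_owner: str         # SME/stakeholder accountable for this data
    exists_in_source: bool  # migrated as-is, or created and loaded?

playbook = [
    MappingEntry("LegacyHR", "PA0001.ORGEH", "Department", "externalCode",
                 "map legacy org unit code to new department code",
                 "HR Ops", True),
    MappingEntry("LegacyHR", "PA0002.NACHN", "Employee Profile", "lastName",
                 "trim whitespace and title-case", "HR Data SME", True),
]

# A read-only export of this structure is what gets socialised regularly.
for entry in playbook:
    print(f"{entry.source_system}.{entry.source_field} -> "
          f"{entry.target_object}.{entry.target_field} "
          f"(owner: {entry.data_owner})")
```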

Dylan Jones: With regards to data cleansing and data quality – how have you approached that in your last few migrations? Were there any specific tools or techniques you adopted that are relevant to SAP SuccessFactors migrations?

Miles Davies: You have to be brutal.

There are two parts to any data cleansing and/or enrichment:

  1. What is required to migrate the data into the system (the must-have)

  2. What we (the client) would like our data to look like (the nice-to-have)

All programmes are cost- and time-constrained. Therefore number 1 above has to be the priority, because if you don’t do the must-haves, the number 2 element becomes moot, as the data will not be in the new target system. This requires careful stakeholder management and should surface during the Data Discovery.

I would always recommend being clear, transparent and honest.

There is also the simple fact that as a Data Migration team, this is what you are being paid to do.

No-one will thank you for leaving required and crucial data in the legacy system because the client has focussed on nice-to-haves. It is your responsibility to ensure a successful Migration, not theirs.

The great benefit of a system like SAP SuccessFactors is having your own Data Migration environment with maintained configuration – this is an essential ask of the programme. It enables you to go through the ETL process and understand concisely where errors exist and why; the relevant Data Subject Matter Expert (SME) can then be pointed at the issue to resolve it, ideally in the source, before testing again.
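As a sketch of that error-to-SME routing, assuming load errors arrive as simple records and that object ownership was agreed during Data Discovery (both the error layout and the routing table below are hypothetical):

```python
# Illustrative error triage: count load errors by object and field, then
# route each group to the responsible Data SME. All names are hypothetical.
from collections import Counter

load_errors = [
    {"object": "Department", "field": "parentDepartment", "record": "DEP-014"},
    {"object": "Department", "field": "parentDepartment", "record": "DEP-022"},
    {"object": "Employee Profile", "field": "hireDate", "record": "E1007"},
]

# Object ownership agreed during Data Discovery
sme_routing = {"Department": "HR Ops", "Employee Profile": "HR Data SME"}

counts = Counter((err["object"], err["field"]) for err in load_errors)
for (obj, field), n in counts.most_common():
    owner = sme_routing.get(obj, "unassigned")
    print(f"{n} error(s) in {obj}.{field} -> route to {owner}")
# 2 error(s) in Department.parentDepartment -> route to HR Ops
# 1 error(s) in Employee Profile.hireDate -> route to HR Data SME
```

Fixing at source and re-running the cycle, rather than patching the extract, is what keeps the fix in place for the next test cycle.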

Dylan Jones: Looking at some of your recent projects into SAP SuccessFactors, how have you managed the business transition from legacy to target landscape? Did you adopt a ‘Big-Bang’ migration for example, or was it more of a phased approach?

Miles Davies: They have been phased.

All true transformation programmes require a lot of investment from the programme client. This investment is not just financial, but also technical, resourcing and emotional – change is coming, and that may or may not be a positive thing at an individual level.

The decision to phase was right, as it enabled the client to absorb change at a sensible pace while giving the programme the opportunity to pause, reflect, conduct lessons learned and improve the deployment process for the next phase.

That being said, when you are working late into the night, it sometimes feels like a Big Bang!

Dylan Jones: Still on this topic of phasing – how did you structure the phases? For example, did you migrate by a range of objects or some other ‘Unit of Migration’ (UoM)?

Miles Davies: The phasing was determined by the SAP SuccessFactors modules that were being deployed.

The first phase covered Employee Profile (lite record), Talent & Performance and LMS.

Phase 2 was Employee Central and Compensation.

We did a full migration, including test cycles for each phase, which in turn had its own schedule and Systems Integrator (SI) Project Manager.

It was phased over 2 years.

Dylan Jones: Totally agree with your point about leadership. Any advice for our members on how to tackle the problem that so many organisations find themselves in when they simply use whatever project management resource they have on staff, regardless of their data migration experience?

Miles Davies: To be honest it is really simple – don’t do it!

Data Migration really isn’t as simple as it sounds. Moreover, nobody other than Data Migration experts actually wants to do it.

I have many conversations with Systems Integrators (SIs) who assume, or tell the client, that they are responsible for it, and with many companies who assume the responsibility with no real understanding or internal capability to do it.

It is a thankless task – get it right and no-one says a word; get it wrong and you will be the centre of attention for all the wrong reasons!

Additionally, and to use an engineering analogy – this isn’t the shiny, sexy stuff, this is the oily bits – and unless you love it, you don’t want to do it; if you don’t want to do it, but are made to, you will not do it well.

Let someone who loves it and ‘gets it’ do it – at the very least from a leadership perspective.

Dylan Jones: I’ve found that many organisations struggle right out of the gate with the project initiation phase. I think this is where solid, experienced leadership comes into its own, because you need someone to set up the project environment correctly. Are there any early-phase initiation tasks you recommend, particularly for a SuccessFactors migration?

Miles Davies: Get the basic principles agreed around things like:

  1. How much historical data to migrate (as little as possible)

  2. What to do with the legacy data left behind

  3. Who your Data SMEs and Business Stakeholders are

  4. Get the Data Discovery committed to

  5. Know your ETL platform approach

  6. Know your resource budget (and factor in more!)

  7. Get a working DM strategy written and socialised

Most importantly of all, your first hire should be the Data Migration Lead and if they are good enough, all of the above will fall into place very quickly. If they are not, you will know within a month, so fix quickly and recover the time.

I have been parachuted in a number of times and had to work to very tight timelines because the client left it too late.

Dylan Jones: Finally, and with the benefit of 20/20 hindsight on your recent projects – is there one final piece of advice you can share for clients about to embark on a particularly complex data migration, be that SuccessFactors or otherwise? Is there something you would do differently next time around for example?

Miles Davies: Now that is the $64m question! For me, I would encourage clients not to treat the Data Migration as a peripheral stream of activity. Move it towards the strategic centre of the Programme plan, and invest the time and resources to get it right first time.

After all, get the Data Migration right, and you have the key element of all business – your Data – delivering in the solution or solutions you have invested so much time, effort and money in. It can also be the catalyst for the onward development of your data through Governance, Quality, Analytics and more to turn your Data into an asset that can be monetised.

Get it wrong, and you will have spent a lot of money for a ‘shiny new box’ that just doesn’t deliver on your operational expectations.
