The many costs of poor data quality in modernization programs


At some point your organization will decide the time is right to modernize. While much of the talk will be about the challenges of modernizing service capabilities and systems, the real challenge will come from modernizing your data.

It’s often assumed that because data was fit for purpose in the legacy world, it will behave itself in the target world. If you've been reading this blog for any length of time, you’ll know that assumptions like this are the staple of data migration train wrecks.

The cost most often cited for poor data quality in modernization programs is the additional staffing and technology required to bring the data up to a quality level that can support the new target environment.

We also have to factor in another cost: the delays caused by unforeseen improvement and cleansing work. Most system modernization programs are sold on the basis that they will drive bottom-line gains or provide some form of competitive advantage. If you've ever been two years into a six-month data migration, you’ll know just how much damage bad data can do to the wider program. The benefits that were touted get delayed, and the longer it takes to get the system live, the longer your program costs stack up, and the longer it takes the new target system to start generating value for money.

Finally, there is a cost that is routinely overlooked because it is harder to calculate or articulate, yet it has been present in every large-scale modernization program I've witnessed: the cost of the risk that suppliers push back onto the client when data quality is an unknown.

This issue of risk transfer was brought home recently when the UK government received criticism from MPs for a lack of focus on data quality (see story here) in a major modernization initiative. The project cost £1 billion, and the client (the MoD) reportedly lacked knowledge about the quality of its inventory data.

When suppliers are faced with such large gaps in knowledge, they invariably price that risk into their bids. One wonders what the bill would have been if:

  1. The government had held comprehensive details of its data quality deficiencies, so that accurate costings for improvement could be made (see the profiling sketch after this list)
  2. Legacy data had been improved over time, ensuring it was fit for current and future purposes
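
As an aside, "comprehensive details of data quality deficiencies" need not mean an exotic tooling programme. Even a minimal profiling pass over a legacy extract yields completeness, uniqueness and validity figures that a supplier can actually price. Here is a rough sketch using Python and pandas; the file name and the stock_number and condition_code columns are hypothetical stand-ins for illustration, not details from the MoD programme.

```python
import pandas as pd

# Hypothetical legacy inventory extract; the file and column names
# (stock_number, condition_code) are illustrative assumptions.
df = pd.read_csv("legacy_inventory.csv")

# Completeness: proportion of non-null values in each column.
completeness = df.notna().mean()

# Uniqueness: proportion of rows duplicating the supposed primary key.
duplicate_rate = df.duplicated(subset=["stock_number"]).mean()

# Validity: proportion of condition codes outside the expected domain.
invalid_codes = (~df["condition_code"].isin({"A", "B", "C"})).mean()

print("Worst column completeness:", round(completeness.min(), 3))
print("Duplicate key rate:       ", round(duplicate_rate, 3))
print("Invalid condition codes:  ", round(invalid_codes, 3))
```

Real inventory data would need far richer rules than these three checks, but even numbers this basic turn "data quality is an unknown" into something that can be costed rather than hedged against.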

All organizations need to modernize their IT solutions at some point; it’s a natural, organic process. Ignoring data quality simply adds to the cost, in more ways than most organizations realize.


About Author

Dylan Jones

Founder, Data Quality Pro and Data Migration Pro

Dylan Jones is the founder of Data Quality Pro and Data Migration Pro, popular online communities that provide a range of practical resources and support to their respective professions. Dylan has an extensive information management background and is a prolific publisher of expert articles and tutorials on all manner of data-related initiatives.

