How to observe the impact of modernisation through a data quality lens


At some point, your business or IT leaders will decide – enough is enough; we can't live with the performance, functionality or cost of the current application landscape.

Perhaps your financial services employer wants to offer mobile services, but building modern apps on top of the old mainframe architecture is impractical and a replacement is needed. Or maybe your existing silo-based architecture has become too costly to maintain and a lower-cost alternative is required.

Whatever your modernisation needs, I can guarantee that one of the business priorities will be to understand how functionality can be improved. Cue new target systems and development partners in endless presentations of their "forward-looking" creations.

Along with speed and cost improvements, you soon become dazzled by the one-touch automation functionality and big-data-ready architectures offered by the target vendor.

In one project, I witnessed an organisation spend £10,000,000+ on a new target system that seemed to have every conceivable function their future business desired. The only problem was, they forgot to consider:

  • Can the legacy data be migrated to the target system in its present state?
  • Will the legacy data support the target functionality once migrated?

Improving the feature set of existing systems is a mainstay of legacy modernisation, but it is this question of "feature uplift" that causes so many problems (particularly during migration and go-live). That's because people forget that function and data are intricately linked.

Data, in its simplest form, is an inanimate array of bits and bytes. It stays like this until an application feature comes along and does something useful with it.

Likewise, those latest and greatest features offered by the target vendor will only function if they can use the right data, with the right qualities – such as validity, accuracy, completeness, consistency and integrity.
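
To make that tangible, here is a minimal profiling sketch in Python (pandas) run against a hypothetical legacy customer extract. The file names, column names and the simplified postcode rule are assumptions for illustration only, not a recommended rule set.

```python
# A minimal data quality profiling sketch, assuming two hypothetical legacy
# extracts: "legacy_customers.csv" (customer_id, postcode, email, account_id)
# and "legacy_accounts.csv" (account_id).
import pandas as pd

customers = pd.read_csv("legacy_customers.csv")
accounts = pd.read_csv("legacy_accounts.csv")

# Completeness: what proportion of each critical attribute is populated?
completeness = customers[["customer_id", "postcode", "email"]].notna().mean()

# Validity: do populated postcodes match a (simplified) UK postcode pattern?
postcode_pattern = r"^[A-Za-z]{1,2}\d[A-Za-z\d]?\s*\d[A-Za-z]{2}$"
valid_postcodes = (
    customers["postcode"].dropna().astype(str).str.match(postcode_pattern).mean()
)

# Integrity: does every customer's account_id exist in the accounts extract?
orphaned = ~customers["account_id"].isin(accounts["account_id"])

print("Completeness by attribute:")
print(completeness.round(3))
print("Share of postcodes that look valid:", round(valid_postcodes, 3))
print("Customers with orphaned account references:", int(orphaned.sum()))
```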

Unfortunately, far too many companies forget to consider the state of their legacy data before they scope out their modernisation plans.

What should you do to ensure your modernisation plans are not scuppered?

First, step away from the technology and think conceptually about how your business will change in the next five to ten years.

  • How will your value proposition evolve?
  • What new customer channels and partners will you need?
  • What variety of cost structures and activities will you need to implement?

Next, select some of the key functions that you expect your target business model to support – but don't boil the ocean. Perhaps look at how your interaction with partners or customers is likely to adapt. Just focus on one area of interest for now.

Then look forward and consider your conceptual data model of the future.

Wherever a function changes, or is used in a new way, there is always an increased demand on data quality.

So where are the gaps? How will your existing data assets cope with the new functional and conceptual demands placed upon them?
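
One lightweight way to answer that is to map each target function to the data attributes it depends on, set a minimum quality threshold for each, and compare those thresholds with what you have actually measured in the legacy estate. The functions, attributes and figures in the sketch below are purely illustrative assumptions.

```python
# An illustrative gap report: target functions mapped to the attributes they
# depend on, with assumed minimum completeness thresholds and assumed measured
# values from legacy profiling. All names and figures are hypothetical.
target_requirements = {
    "mobile self-service":  {"email": 0.95, "mobile_number": 0.90},
    "usage-based offers":   {"monthly_usage_history": 0.98, "postcode": 0.99},
    "partner data sharing": {"marketing_consent": 1.00},
}

measured_quality = {  # e.g. completeness scores taken from profiling
    "email": 0.81,
    "mobile_number": 0.62,
    "monthly_usage_history": 0.69,
    "postcode": 0.97,
    "marketing_consent": 0.55,
}

for function, needs in target_requirements.items():
    for attribute, required in needs.items():
        actual = measured_quality.get(attribute, 0.0)
        if actual < required:
            print(f"{function}: '{attribute}' is {actual:.0%} vs "
                  f"{required:.0%} required (gap of {required - actual:.0%})")
```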

For example, if you are a utility, perhaps you want to forge a partnership with white goods manufacturers to offer energy consumers special deals on things like dishwashers and tumble dryers, based on their energy usage. To make this new business model a reality, you need good quality data on:

  • Personalisation (e.g., What is a customer's social media presence? What are their marketing preferences? What does a "marketable household" look like from a data quality perspective?)
  • Location (e.g., Do you have accurate address data? Is the energy performance data matched to the right owner? Can that data easily be transferred if they move to a new property?)
  • Historical data (e.g., Can you go back in time and predict electricity usage accurately? How many customers are affected by gaps in the data? A simple gap check is sketched after this list.)
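
To illustrate that last question, the sketch below counts how many customers have missing months in a hypothetical meter readings extract. The file name, column names and cut-off date are assumptions for the example.

```python
# How many customers have gaps in their monthly usage history? Assumes a
# hypothetical extract "meter_readings.csv" with columns customer_id and
# reading_month.
import pandas as pd

readings = pd.read_csv("meter_readings.csv", parse_dates=["reading_month"])

# Look at readings from an arbitrary cut-off date onwards and count the
# distinct months present for each customer.
window = readings[readings["reading_month"] >= "2023-01-01"]
months_expected = window["reading_month"].dt.to_period("M").nunique()
months_present = (
    window.assign(month=window["reading_month"].dt.to_period("M"))
          .groupby("customer_id")["month"]
          .nunique()
)

customers_with_gaps = int((months_present < months_expected).sum())
print(f"Customers with incomplete usage history: {customers_with_gaps} "
      f"of {months_present.size}")
```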

You will often find that, even with millions of records and attributes in a modern business landscape, a modernisation programme can be stalled by poor data quality across just a handful of conceptual data entities.

For example, a gas utility found that one of its systems for tracking metering data was over 30 percent incomplete. That was enough to halt a system transformation programme until the missing data could be collected, which took twelve months of field visits.

Once you understand the future function landscape and how it will impact your legacy data, you can start to discover where the gaps exist and assess the viability, cost and timeline of any remedial solutions.
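
Even a back-of-envelope calculation helps set those expectations. The sketch below uses entirely hypothetical figures to show how quickly an incompleteness rate turns into months of remediation effort.

```python
# A back-of-envelope remediation estimate. Every figure below is a made-up
# assumption, purely to illustrate the shape of the calculation.
total_meters = 400_000          # meters in scope for the migration
incomplete_rate = 0.30          # share of records needing a field visit
visits_per_engineer_per_day = 8
engineers = 60
working_days_per_month = 21

visits_needed = total_meters * incomplete_rate
daily_capacity = visits_per_engineer_per_day * engineers
months_to_remediate = visits_needed / daily_capacity / working_days_per_month

print(f"Field visits needed: {visits_needed:,.0f}")
print(f"Estimated remediation effort: {months_to_remediate:.1f} months")
```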

The end result is a more realistic expectation of how (and where) your business model and application landscape can be modernised.


Download a TDWI checklist report on modernization.


About Author

Dylan Jones

Founder, Data Quality Pro and Data Migration Pro

Dylan Jones is the founder of Data Quality Pro and Data Migration Pro, popular online communities that provide a range of practical resources and support to their respective professions. Dylan has an extensive information management background and is a prolific publisher of expert articles and tutorials on all manner of data related initiatives.
