How to improve data quality through more effective data maintenance monitoring


Some of the most common causes of defective data come down to a simple lack of coordination between the various groups involved in preserving the integrity of data logic, content, structure and assurance.

By improving coordination between these different silos, you can dramatically improve the quality and governance of your data over the long term. (What’s more, it’s often a fix that won’t break the bank.)

Let’s consider an operational billing capability in a typical utilities organisation. There are literally millions of lines of code, thousands of functions, hundreds of entities and scores of systems involved in ensuring that customers are given accurate bills each month.

With this huge amount of complexity, it’s little wonder that utilities firms often fall foul of the regulator when billing mistakes occur.

In one organisation with a mature data quality strategy, I observed how they were trapping and responding to these issues. They had created a "vital signs" monitoring system for the key information chains in their business. The vital signs included data quality metrics, but also a range of other measures, such as process lead time, customer complaints and system coding updates, that aren’t always considered in data quality projects.

The idea was to create a kind of "mission centre" for data quality where all potential failure points were routinely monitored so that anything out of the norm could be spotted and resolved.
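To make the idea concrete, here is a minimal sketch (in Python, with hypothetical metric names and thresholds, not taken from the organisation described) of what a vital signs check might look like: each metric for an information chain is compared against its expected range, and anything out of the norm is flagged for investigation.

```python
from dataclasses import dataclass

@dataclass
class VitalSign:
    """One monitored metric for an information chain."""
    name: str
    value: float
    low: float   # lower bound of the expected ("normal") range
    high: float  # upper bound of the expected ("normal") range

    def is_out_of_norm(self) -> bool:
        return not (self.low <= self.value <= self.high)

def check_vital_signs(chain: str, signs: list[VitalSign]) -> list[str]:
    """Return alert messages for any vital sign outside its expected range."""
    return [
        f"[{chain}] {s.name}={s.value} outside expected range {s.low}-{s.high}"
        for s in signs
        if s.is_out_of_norm()
    ]

# Hypothetical vital signs for a billing information chain.
billing_signs = [
    VitalSign("completeness_pct", 97.2, 99.0, 100.0),    # data quality metric
    VitalSign("process_lead_time_days", 6.5, 0.0, 5.0),  # operational metric
    VitalSign("customer_complaints", 42, 0, 30),         # business metric
]

for alert in check_vital_signs("billing", billing_signs):
    print(alert)
```

The point of the sketch is that the monitor treats operational and business measures as first-class vital signs alongside the usual data quality metrics, which is what allows the "mission centre" view to spot problems wherever they surface.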

I was shown an issue where a data quality defect resulted in longer lead times than normal and an increase in parts not being returned for re-use. What was interesting was that the defect originated after a system coding update, and they had all the information on hand to make that link.

Links are what you need to create a more effective data maintenance process. You need an overarching process to connect the dots between the different data lifecycle points, with as much attention focused on the data as on the logic. They are interdependent, after all.
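As a rough illustration of what connecting those dots could look like in practice, the sketch below (again hypothetical event structures and field names, not the organisation’s actual system) correlates a data quality alert with recent system coding updates on the same information chain, so a defect like the one above can be traced back to the change that introduced it.

```python
from datetime import datetime, timedelta

# Hypothetical event logs: data quality alerts and system coding updates,
# each tagged with the information chain they relate to.
dq_alerts = [
    {"chain": "billing", "metric": "parts_not_returned", "at": datetime(2014, 3, 12, 9, 0)},
]
coding_updates = [
    {"chain": "billing", "release": "BILL-2.4.1", "at": datetime(2014, 3, 10, 18, 30)},
    {"chain": "metering", "release": "MTR-1.9.0", "at": datetime(2014, 3, 11, 8, 0)},
]

def recent_changes(alert, updates, window_days=7):
    """Find coding updates on the same chain shortly before a data quality alert."""
    window = timedelta(days=window_days)
    return [
        u for u in updates
        if u["chain"] == alert["chain"] and alert["at"] - window <= u["at"] <= alert["at"]
    ]

for alert in dq_alerts:
    for u in recent_changes(alert, coding_updates):
        print(f"DQ alert on '{alert['metric']}' may relate to release {u['release']} "
              f"deployed {u['at']:%Y-%m-%d} on the {u['chain']} chain")
```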

I think the data quality or data governance function is an ideal place to implement data maintenance monitoring, because it allows you to develop a more holistic view of the processes impacting your data. It just needs data quality leaders to think outside the box a little and consider monitoring some data maintenance activities they wouldn’t normally cover.

How about your data maintenance processes – are they effectively monitored and governed? Do you think monitoring them via a data quality process makes sense? Please share your views below.


About Author

Dylan Jones

Founder, Data Quality Pro and Data Migration Pro

Dylan Jones is the founder of Data Quality Pro and Data Migration Pro, popular online communities that provide a range of practical resources and support to their respective professions. Dylan has an extensive information management background and is a prolific publisher of expert articles and tutorials on all manner of data-related initiatives.

