Ideas for justifying your data quality existence


Conference season is hotting up in the UK, and there are no doubt lots of practitioners putting the finishing touches to their data quality presentations.

One interesting observation I’ve encountered is a high churn rate amongst data quality professionals, particularly within the leadership community.

Their decision to quit is not always voluntary.

Many data quality presenters focus on the tactics of delivering a data quality project, but fail to explain what the organisation gained in terms of tangible business benefits. Whilst the presenter may just be aiming to impart their knowledge to a captive audience, I suspect a lack of robust performance measurement is often at fault.

The problem is that when you stop focusing on the data quality benefits that matter to senior managers, you become vulnerable. When fortunes change, executives can make short-term decisions that don’t always benefit corporate quality ambitions.

So how can you survive these turbulent times and justify the existence of your data quality aspirations over the long-term?

I believe the key is to measure business impact from the very start of your initiative. You need to gather not only headline metrics such as increased revenue and reduced costs, but a whole raft of other measures specific to each business function you are impacting.

Let’s imagine you head up the data quality team for a large utility organisation. You’ve been focusing on improving the quality of plant equipment data. With all the activities to coordinate, you may decide to look at operational performance only at the end of a twelve-month cycle.

The problem with this 'annual review' style of approach is that you cannot distinguish your impact from the many other influences on performance. These could stem from seasonal demand, corporate staff cuts, new plant installations and so on.

If you identify key operational metrics at the start of your initiative and track them on a daily or weekly basis, it becomes far easier to assess the real-time impact your work is having.
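Sticking with the utility scenario, here is a minimal sketch in Python of what that weekly tracking might look like. The metric, the figures and the go-live date are invented purely for illustration; swap in whatever operational measures your business stakeholders actually care about.

    import pandas as pd

    # Weekly count of a single operational metric (illustrative figures only).
    weekly = pd.DataFrame(
        {
            "week_ending": pd.date_range("2024-01-07", periods=8, freq="W"),
            "duplicate_equipment_records": [42, 45, 40, 38, 21, 17, 15, 12],
        }
    )

    # Date the data quality initiative went live (hypothetical).
    go_live = pd.Timestamp("2024-02-04")

    before = weekly.loc[weekly["week_ending"] < go_live, "duplicate_equipment_records"]
    after = weekly.loc[weekly["week_ending"] >= go_live, "duplicate_equipment_records"]

    print(f"Baseline average:        {before.mean():.1f} duplicates per week")
    print(f"Post-initiative average: {after.mean():.1f} duplicates per week")
    print(f"Reduction:               {1 - after.mean() / before.mean():.0%}")

Because the baseline is captured before your work starts, the before-and-after comparison is tied directly to your initiative rather than to whatever else happened that year.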

For example, in our small business we've been improving member data on Data Quality Pro. The result is that we've seen an increase in traffic and engagement from our newsletter. If we waited until the end of the year to measure the impact, it would be far more difficult to separate the quality improvement work from the many other events that influence our traffic and engagement.

How else does this benefit your data quality aspirations?

By recording these small wins, you can build up a catalogue of data quality impact stories. These demonstrate the day-to-day impact your team is delivering and are priceless for convincing management not only to retain your team but to develop it further.

One final tip. By measuring performance impact on a weekly basis, you can create a repository that gathers the relevant operational performance data and, of course, your data quality metrics too. Automate these processes to save time and effort; your data quality software can play a part here. Modern data quality tools provide data integration capabilities, with validation built in, so you can capture performance data routinely from an array of different data stores and create a unified view of just how beneficial your team has become.
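To make that idea concrete, here is a hedged sketch of such a unified weekly view using Python and pandas. The database files, table names and column names are all assumptions made up for this example, not the schema of any particular data quality product.

    import sqlite3
    from datetime import date

    import pandas as pd

    def load_metrics(db_path, query):
        """Read a weekly metrics table from a SQLite store into a DataFrame."""
        with sqlite3.connect(db_path) as conn:
            return pd.read_sql_query(query, conn, parse_dates=["week_ending"])

    # Operational KPIs captured by the maintenance system (assumed schema).
    ops = load_metrics(
        "operations.db",
        "SELECT week_ending, equipment_downtime_hours, failed_work_orders FROM ops_kpis",
    )

    # Data quality scores produced by your profiling and validation jobs (assumed schema).
    dq = load_metrics(
        "data_quality.db",
        "SELECT week_ending, completeness_pct, duplicate_rate FROM dq_scores",
    )

    # Unified weekly view: one row per week with operational and quality measures side by side.
    unified = ops.merge(dq, on="week_ending", how="inner").sort_values("week_ending")
    unified.to_csv(f"impact_report_{date.today():%Y%m%d}.csv", index=False)

Run on a schedule, for example as a weekly job, something along these lines gives you the repository of impact evidence described above without any manual collation.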

Hope this has given you some practical ideas. If you have any questions on how to make it happen, drop a note in the comments section below.


About Author

Dylan Jones

Founder, Data Quality Pro and Data Migration Pro

Dylan Jones is the founder of Data Quality Pro and Data Migration Pro, popular online communities that provide a range of practical resources and support to their respective professions. Dylan has an extensive information management background and is a prolific publisher of expert articles and tutorials on all manner of data-related initiatives.

