How to create a more focused data quality story


I’m continuing my series on practical techniques that anyone can use to leverage data quality for bottom-line gains.

Last week I talked about the link between data quality and service consistency, giving you some tips on how to improve both. This week I want to talk about a simple technique I often use in companies that have logistical processes or any kind of multi-point transactions through which various data flows (this accounts for most organisations in some form or other).

It can often feel a little daunting when starting out on a mission to improve data quality. The biggest challenge is knowing where to make an impact that will really benefit business users and, of course, customers. We saw last week how a shift of focus onto your data-entry workers, combined with some simple data-structure tweaks, can make all the difference.

To establish a starting position, I begin with one question:

What is the most important process in my area under investigation?

Obviously the importance of a process is dictated by the type of organisation and scope of your involvement. Perhaps you’re looking to improve compliance in a large insurance firm. Maybe you’re chief of patient data in a large healthcare organisation.

Whatever your role and remit, I recommend starting with one process that is deemed critical.

Next, start from the very first touch points of that process. How does data get created? What are the gateways into the process? Which systems are involved? What do people call this initial function? Who is responsible for it?

I typically use a white board or even an entire wall (glass walls are great for this) and start labelling different colour Post-Its based on the different pieces of information I need.

I then establish the next junction point in this process, and the next, and the next, until I reach some form of natural termination point. Of course you’ll start to see branches forming (e.g. data going off to management reports or external feeds), but focus on the core business function or process under investigation.
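To make the mapping step concrete, here is a minimal sketch of the Post-It wall in code. All of the stage names, systems and owners below are hypothetical examples for a parcel-delivery process; the point is simply that each junction records what people call it, which system is involved, who owns it, and where branches peel off:

```python
from dataclasses import dataclass, field

@dataclass
class Junction:
    """One node in the process map -- one Post-It on the wall."""
    name: str            # what people call this function
    system: str          # which system the data flows through here
    owner: str           # who is responsible for it
    branches: list = field(default_factory=list)  # side feeds (reports, exports)

# Hypothetical parcel-delivery process, first touch point to termination.
process = [
    Junction("Order capture", "Web storefront", "E-commerce team"),
    Junction("Address validation", "CRM", "Customer services",
             branches=["Marketing mailing list"]),
    Junction("Dispatch", "Warehouse system", "Logistics",
             branches=["Nightly feed to carrier"]),
    Junction("Delivery confirmation", "Driver handheld app", "Field operations"),
]

for step in process:
    note = f" (branch: {', '.join(step.branches)})" if step.branches else ""
    print(f"{step.name} [{step.system}] -> owner: {step.owner}{note}")
```

Even this small structure forces the useful questions: if you can't fill in the `owner` field for a junction, you've already found a gap worth investigating.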

After I’ve done this, I try to follow a transaction through the process. The feasibility of this depends on the length of the transaction: the delivery of a parcel can be over in a few days, but a construction process can obviously take much longer.

What you’re trying to establish is the lifecycle of a transaction through this process. You need to get intimate with processes before you can really start to improve the quality of the underlying data.

You start to get a feel for where the staff training at different nodes in the process is poor. You identify batch jobs and nightly feeds that are prone to error. You locate field workers who struggle to enter correct information due to inadequate field tools or simply lack of time.

Armed with this prior knowledge, you then start to profile and assess the data – but with a more holistic approach in mind. You will have some anecdotal evidence of where to prioritise your focus.
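The profiling itself can start simply. Here is a minimal sketch (the field names and records are hypothetical) that measures completeness per field at one junction, typically the first figure business users ask for:

```python
# Hypothetical records captured at an "address validation" junction.
records = [
    {"name": "A. Smith", "postcode": "SW1A 1AA", "phone": "020 7946 0000"},
    {"name": "B. Jones", "postcode": None,       "phone": "020 7946 0001"},
    {"name": "C. Patel", "postcode": "M1 1AE",   "phone": None},
    {"name": None,       "postcode": "LS1 1UR",  "phone": None},
]

def completeness(rows, fields):
    """Fraction of rows with a populated value, per field."""
    return {
        f: sum(1 for r in rows if r.get(f) not in (None, "")) / len(rows)
        for f in fields
    }

profile = completeness(records, ["name", "postcode", "phone"])
# Report the weakest fields first -- these anchor the story.
for field_name, score in sorted(profile.items(), key=lambda kv: kv[1]):
    print(f"{field_name}: {score:.0%} complete")
```

A table like this only becomes compelling when it is pinned to the process map: a 50% complete phone field matters far more at the junction where drivers call customers than it does at order capture.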

What you end up with is a much more compelling story of where data quality is impacting the organisation, because it directly relates to a business process that people care deeply about. By adding people's names and stories to your presentations, you bring the whole journey of those transactions to life. You can now explain to senior management exactly why delivery lead times are slipping or administrative costs are going up.

So the moral of the story is that to become more effective at data quality improvement, you need to become a better storyteller. The approach outlined in this article is how I typically combine data, process, quality and story. It has served me well in the past, but perhaps you have an alternative? I would love to hear your approach in the comments below.


About Author

Dylan Jones

Founder, Data Quality Pro and Data Migration Pro

Dylan Jones is the founder of Data Quality Pro and Data Migration Pro, popular online communities that provide a range of practical resources and support to their respective professions. Dylan has an extensive information management background and is a prolific publisher of expert articles and tutorials on all manner of data related initiatives.
