Making information accuracy BAU


According to surveys within the IAIDQ group on LinkedIn, most data quality practitioners consider accuracy to be the most important data quality dimension.

This makes sense, of course, but because true accuracy is difficult to measure unless we have a verifiable reference source, it is often, ironically, one of the dimensions we measure far less than completeness or consistency.

Ignoring accuracy is a risky business for data quality professionals, as I learned the hard way when undertaking a data quality assessment of plant equipment almost a decade ago.

We had performed a rigorous data profiling and data discovery exercise to uncover all the necessary data quality rules present in the data. Workshops had been held with business users, original designers and other application stakeholders to exhaustively document all the data and business rules in the system. Our data quality assessment measured hundreds of rules and gave us an accurate view of data quality health.
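Rule-based measurement of this kind boils down to evaluating predicates over each record and reporting a pass rate per rule. Here is a minimal sketch in Python; the field names, rules, and sample records are illustrative assumptions, not taken from the original assessment:

```python
# Minimal rule-based data quality check: each rule is a name plus a
# predicate, and we report the pass rate per rule across all records.
# Field names, rules, and records below are illustrative only.

records = [
    {"asset_id": "PU-001", "install_date": "2014-03-01", "status": "ACTIVE"},
    {"asset_id": "",       "install_date": "2015-07-12", "status": "ACTIVE"},
    {"asset_id": "DR-417", "install_date": None,         "status": "RETIRED"},
]

rules = {
    "asset_id_populated":   lambda r: bool(r["asset_id"]),
    "install_date_present": lambda r: r["install_date"] is not None,
    "status_in_domain":     lambda r: r["status"] in {"ACTIVE", "RETIRED"},
}

def assess(records, rules):
    """Return {rule_name: pass_rate} across all records."""
    return {
        name: sum(1 for r in records if check(r)) / len(records)
        for name, check in rules.items()
    }

for name, rate in assess(records, rules).items():
    print(f"{name}: {rate:.0%}")
```

Note that every rule here checks completeness or domain validity against the data itself — exactly the kind of measurement that, as the story shows, says nothing about whether the records match reality.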

Or so we thought.

In a show of thoroughness, we arranged a site visit to compare the accuracy of the equipment placement and labeling information in the system with the physical positions on the ground.

The results were startling. Well over 30% of equipment just wasn’t where it should be. Power units were missing. Ducts and risers didn’t exist. Where old equipment was recorded in the system, shiny new technology now stood.

It was a stark lesson in holistic data quality management and also a wake-up call to the true cost of bad data. It had taken us hours to locate a handful of pieces of equipment. Imagine this kind of inaccuracy on a national scale.

So, a sobering lesson for all would-be data quality practitioners and leaders: accuracy is still king of the data quality world, and without it there will always be revenue leakage.

But how do you fix data quality accuracy issues? Well, that really depends on the type of data and the context of its usage.

For example, improving the accuracy of customer data is often far easier because there are usually multiple reference sources to cross-check against. And of course that doesn't stop companies from occasionally calling customers to verify contact information.
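One way to operationalise this cross-checking is a field-by-field comparison against a trusted reference source, yielding an overall accuracy rate. A minimal sketch, with made-up customer records and field names:

```python
# Field-level accuracy against a reference source: for each customer
# record, compare selected fields with the trusted reference copy and
# report the fraction that match. All names and data are illustrative.

reference = {
    "C001": {"postcode": "SW1A 1AA", "phone": "020 7946 0000"},
    "C002": {"postcode": "M1 1AE",   "phone": "0161 496 0000"},
}

system = {
    "C001": {"postcode": "SW1A 1AA", "phone": "020 7946 0999"},  # phone drifted
    "C002": {"postcode": "M1 1AE",   "phone": "0161 496 0000"},
}

def field_accuracy(system, reference, fields):
    """Fraction of (record, field) pairs that match the reference."""
    checks = [
        system[cid][f] == ref[f]
        for cid, ref in reference.items()
        for f in fields
    ]
    return sum(checks) / len(checks)

print(f"accuracy: {field_accuracy(system, reference, ['postcode', 'phone']):.0%}")
```

The whole approach hinges on the reference itself being trustworthy, which is precisely what engineering assets usually lack.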

The same can't be said for systems that contain entities such as engineering assets, which can be in the ground for decades, even centuries. Validating the accuracy of these items may require an engineer onsite, physically cross-checking their presence.

The trick is to weave accuracy verification into your operational activities. Whenever a maintenance crew has to repair a fault in a location, ask them to cross-check equipment in their vicinity. It’s far more cost-effective than sending them out on data quality fact-finding missions.
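The piggy-backing idea can be sketched as a simple proximity query: when a work order is raised at a location, attach any nearby assets to the crew's verification checklist for the same visit. The coordinates, asset records, and 50-unit radius below are all hypothetical:

```python
import math

# When a maintenance job is raised at a location, find the other assets
# within a small radius so the crew can verify them on the same visit.
# Asset data and the default radius are illustrative assumptions.

assets = [
    {"id": "PU-001", "x": 10.0,  "y": 20.0},
    {"id": "DR-417", "x": 12.0,  "y": 24.0},
    {"id": "RS-202", "x": 300.0, "y": 480.0},
]

def verification_checklist(job_x, job_y, assets, radius=50.0):
    """IDs of assets within `radius` units of the job site, nearest first."""
    nearby = [
        (math.hypot(a["x"] - job_x, a["y"] - job_y), a["id"])
        for a in assets
    ]
    return [aid for dist, aid in sorted(nearby) if dist <= radius]

print(verification_checklist(11.0, 21.0, assets))  # → ['PU-001', 'DR-417']
```

In practice the checklist would feed the work-order system rather than a print statement, but the principle is the same: verification rides along with work that was happening anyway.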

The key takeaway is that you must not consider data quality management as some separate activity that depends on mysterious technology and practitioners with letters after their name. Accuracy should instead be a part of business-as-usual activities. All you need to do now is figure out how to motivate your teams to go about doing it diligently!


About Author

Dylan Jones

Founder, Data Quality Pro and Data Migration Pro

Dylan Jones is the founder of Data Quality Pro and Data Migration Pro, popular online communities that provide a range of practical resources and support to their respective professions. Dylan has an extensive information management background and is a prolific publisher of expert articles and tutorials on all manner of data-related initiatives.

