Want to improve data quality accuracy? Look for a techno-cultural shift


In a former life I would often visit sites with literally thousands of plant assets. Each asset was critical to the business, not just in terms of capital cost but in the services it provided – and the risks posed by poor maintenance or failure.

What was interesting (and troubling) on these visits was just how poor the accuracy of the data often was.

For those new to data quality, accuracy is widely defined as a measure of how well a piece of information in a data store reflects its real-life counterpart.

On the ground, we would often find entire rows of equipment either missing or located in another section of the building. Many of these issues only come to light during audits, engineering work or catastrophic events such as fires or explosions that destroy parts of the site. Engineers then find that a good proportion of the services and infrastructure they have on record is missing, incorrectly placed or configured in an entirely different way to their "systems of record."

What can be done in this situation to improve data quality?

In my experience, improvement required change management around the culture, combined with a range of technology interventions.

Speaking to engineers, it was clear that many of the processes they followed were complex and time-consuming. As a result, they simply didn't get the time to enter accurate information, and the same information often had to be entered repeatedly across multiple siloed systems.

The first approach to recommend is an architectural one: better master data management (MDM) to simplify the data collection process. We also found that the quicker engineers could enter the information, the more accurate it became, so online access to operational systems in the field was critical. There are few excuses now that ruggedised "tough build" tablets and other data entry tools are increasingly common in remote locations.
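To make that concrete, here is a minimal sketch of what an MDM-style consolidation step could look like. It's written in Python with made-up file and field names, so treat it as an illustration rather than a recipe: the point is simply that normalising each asset tag into a single match key lets every physical asset resolve to one golden record rather than one entry per silo, and immediately highlights the assets that only one system knows about.

```python
# A minimal sketch of an MDM-style consolidation step, assuming two
# hypothetical CSV extracts ("cmms_assets.csv", "scada_assets.csv") that
# both carry an asset tag and some descriptive fields. The idea is to
# normalise the tag into a single match key so each physical asset ends
# up with one golden record instead of one entry per silo.
import csv
from collections import defaultdict

def match_key(tag: str) -> str:
    # Normalise the asset tag: upper-case and drop separators/whitespace.
    return "".join(ch for ch in tag.upper() if ch.isalnum())

def load(path: str, system: str):
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            yield match_key(row["asset_tag"]), system, row

golden = defaultdict(list)
for key, system, row in list(load("cmms_assets.csv", "CMMS")) + \
                        list(load("scada_assets.csv", "SCADA")):
    golden[key].append((system, row))

# Assets known to only one silo are prime candidates for a field check.
for key, records in golden.items():
    systems = {system for system, _ in records}
    if len(systems) == 1:
        print(f"{key}: only recorded in {systems.pop()}")
```

In practice an MDM platform does far more than this (survivorship rules, fuzzy matching, stewardship workflows), but even a simple shared match key removes a lot of the duplicate data entry that frustrates engineers.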

Once the process is simplified, you can start to engage the engineers in the importance of data quality, how it ultimately benefits them in the field and how it helps them meet their targets. I've interviewed companies that have begun to measure data quality at the engineer, operations team and department level so they can reward the high achievers of accuracy.
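As a rough illustration of what that measurement could look like, here is a small sketch (again in Python, with hypothetical team names and audit fields) that scores accuracy per team as the proportion of audited records that matched what was actually found on site.

```python
# A minimal sketch of scoring accuracy at team level, assuming a list of
# hypothetical audit results where each dict records the team that owns
# the record and whether the stored value matched what was found on site.
from collections import defaultdict

audit_results = [
    {"team": "Ops North", "field": "location", "matches_site": True},
    {"team": "Ops North", "field": "status",   "matches_site": False},
    {"team": "Ops South", "field": "location", "matches_site": True},
]

def accuracy_by_team(results):
    checked = defaultdict(int)
    correct = defaultdict(int)
    for r in results:
        checked[r["team"]] += 1
        correct[r["team"]] += r["matches_site"]
    # Accuracy = verified-correct records / records checked, per team.
    return {team: correct[team] / checked[team] for team in checked}

for team, score in sorted(accuracy_by_team(audit_results).items(),
                          key=lambda kv: kv[1], reverse=True):
    print(f"{team}: {score:.0%}")
```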

Finally, you can use data quality tools to find likely hotspots of poor-quality data. These tools help you find the gaps between the silos of disparate data that are often cobbled together to form a convoluted engineering process (or its equivalent in your industry). By identifying anomalies between systems in terms of accuracy, completeness, consistency and synchronisation, you can start to prioritise site locations for improvement. These tasks can then be incorporated into future maintenance schedules, reducing the need for costly ad-hoc audits.
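For illustration, here is a simplified sketch of the kind of cross-system comparison such a tool performs, using two made-up asset extracts: count the disagreements and one-sided records per site so the worst locations rise to the top of the audit list.

```python
# A minimal sketch of hotspot ranking, assuming two hypothetical extracts
# keyed by asset tag, each mapping the tag to the site/section where that
# system believes the asset lives. Disagreements and one-sided records
# are counted per site so the worst sites float to the top.
from collections import Counter

register = {"PUMP-001": "Site A / Bay 3", "VALVE-07": "Site B / Bay 1"}
maintenance = {"PUMP-001": "Site A / Bay 5", "FAN-12": "Site B / Bay 2"}

anomalies = Counter()
for tag in register.keys() | maintenance.keys():
    in_reg, in_mnt = register.get(tag), maintenance.get(tag)
    if in_reg is None or in_mnt is None:
        # Missing from one silo entirely (a completeness gap).
        site = (in_reg or in_mnt).split(" / ")[0]
        anomalies[site] += 1
    elif in_reg != in_mnt:
        # Both silos know the asset but disagree on where it is.
        anomalies[in_reg.split(" / ")[0]] += 1

# Prioritise the sites with the most cross-system anomalies.
for site, count in anomalies.most_common():
    print(f"{site}: {count} anomalies")
```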

What do you think? How are you tackling data accuracy in your datasets? I welcome your views.


About Author

Dylan Jones

Founder, Data Quality Pro and Data Migration Pro

Dylan Jones is the founder of Data Quality Pro and Data Migration Pro, popular online communities that provide a range of practical resources and support to their respective professions. Dylan has an extensive information management background and is a prolific publisher of expert articles and tutorials on all manner of data related initiatives.
