Why poor data quality is a business certainty


Although I do far less on-the-ground consulting than I used to, I still occasionally get to sample the delights of walking into an organisation, performing a data quality assessment and watching the occasional jaw hit the table when the client sees the actual state of their data.

Discovery was always the fun part for me. I’ve never really enjoyed the ensuing improvement process that can often take months or even years to put right.

Visionary leaders who want to get started with data quality management invariably want a hard-hitting horror story that can have billing managers waking in a cold sweat at three in the morning after they relive the moment they discovered hundreds of expired contracts still being paid out every month.

Part of the challenge is knowing where to start, and my preference was to focus on age. Find the oldest, most patched-together legacy behemoth within the company and get to work, unravelling the data and its terrible tale of defects and business woe.

The irony is that at one point someone switched this system on and proclaimed it the very pinnacle of client-server engineering. This is why poor data quality is a business certainty. Businesses will always change at a far more rapid pace than application designers can support, guaranteeing that at some point the gap between system design and business need will grow so wide that data errors become a statistical certainty.

Last week I talked about data overloading (where users stuff multiple facts into a single attribute because the system can’t cope). Overloading is a good indication that your system is starting to drift from reality. With more and more companies choosing to buy third-party or COTS (commercial off-the-shelf) products, overloading is no doubt set to rise.
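
To make that concrete, here’s a minimal sketch of how you might profile a free-text attribute for signs of overloading. It assumes a single column and a couple of crude heuristics; the sample values, delimiters and date check are purely illustrative, not taken from any particular system.

```python
import re
from collections import Counter

# Illustrative sample of a single free-text attribute that users have
# started "overloading" with extra facts the system has no field for.
values = [
    "ACME Ltd",
    "ACME Ltd / contract expired 2011",
    "Beta Corp; bill to head office; PO 4471",
    "Gamma plc",
]

# Crude heuristics: values containing common "stuffing" delimiters or
# embedded years hint that more than one fact lives in the field.
delimiters = re.compile(r"[;/|]")
embedded_year = re.compile(r"\b(19|20)\d{2}\b")

flags = Counter()
for value in values:
    if delimiters.search(value):
        flags["delimited_multiple_facts"] += 1
    if embedded_year.search(value):
        flags["embedded_date"] += 1

print(f"profiled {len(values)} values")
for flag, count in flags.most_common():
    print(f"  {flag}: {count}")
```

In practice you would run something like this against the live column and eyeball the flagged rows; the point is simply that overloading leaves detectable fingerprints long before anyone calls it a data quality problem.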

We’re also seeing a shift away from dependence on big, internal IT teams towards leaner, more agile outsourced or “DIY” applications. This will only widen the reality gap that inevitably opens up when the business starts to pivot.

I run a tiny business and I’ve had to pivot the core model two or three times in five years. Just imagine how flexible and agile a mid-sized company needs to be in today’s age of austerity, Web 2.0 and aggressive competition. Their systems can’t cope with the speed of change and, as a result, the gap between design and reality, and the stress it places on the underlying application and inevitably the data, will guarantee that data quality defects are an absolute certainty.

The key, of course, is to accept that this is a fact of modern business life; systems have a shelf-life that is getting shorter and shorter. The missing piece in this discussion is that data quality management can extend that lifespan, often considerably, and at far less cost than scrapping the system and building a completely new one. Yet, oddly enough, it will be the last thing on people’s minds.

Because new systems have perfect data, don't they?
