Compliance finally gives data quality the platform it deserves


Regulatory compliance is a principal driver for data quality and data governance initiatives in many organisations right now, particularly in the banking sector.

It is interesting to observe how many financial institutions immediately ask for longer timeframes to get their 'house in order' ahead of each new directive.

To an outsider, it may seem troubling that modern, data-driven organisations would struggle to supply accurate data or demonstrate a robust working knowledge of their internal processes.

The reality, of course, is that many financial institutions are fighting a battle with data every day. They are trying to cope with internal and external demands for faster, smarter services while still maintaining an often outdated, silo-based IT infrastructure.

For this reason, banking IT leaders often view regulatory compliance in a negative light: just one more corporate chore that must be satisfied to remain in business or avoid punitive damages. They approach compliance as a series of isolated initiatives, building focus teams that remain tactical in nature. The emphasis is often on 'cobbling together' a solution that satisfies the needs of a particular directive, good enough to tick the right boxes.

But innovative data quality practitioners at the leading edge of regulatory compliance see things far more positively.

Experienced data quality practitioners recognise that regulatory pressure not only gives them an executive mandate but also the opportunity to create a platform and framework that will satisfy, or at least accelerate, the deliverables of future compliance directives.

A good example is the data quality requirements of a directive such as BCBS 239. Principles 3-9 of this particular regulation could come straight from any data quality textbook. Its creators understood that data quality starts with leadership; in fact, they spelt it out in the first principle:

"A bank’s board and senior management should promote the identification, assessment and management of data quality risks as part of its overall risk management framework."

What is interesting about this regulation is its obvious demand for a total approach to data quality. You can't manage the quality of aggregated data if you're unable to control the quality of all the underlying data at the lowest atomic level. (For more on how to manage data quality, download the TDWI paper, Data Quality Challenges and Priorities.)
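
To make that point concrete, here is a minimal Python sketch of the idea. The record layout, field names and validation rules ("counterparty", "exposure", validate) are illustrative assumptions, not anything prescribed by BCBS 239; the sketch simply shows how a single bad atomic record quietly distorts an aggregate figure, and why checks must run at record level before any rollup is certified.

# All names here are hypothetical, chosen only to illustrate
# atomic-level data quality checks feeding an aggregate.
records = [
    {"counterparty": "ACME Ltd", "exposure": 1_200_000.0},
    {"counterparty": "Globex",   "exposure": None},        # missing value
    {"counterparty": "Initech",  "exposure": -50_000.0},   # implausible sign
]

def validate(record):
    """Return the list of quality issues found in one atomic record."""
    issues = []
    if record["exposure"] is None:
        issues.append("missing exposure")
    elif record["exposure"] < 0:
        issues.append("negative exposure")
    return issues

# A naive rollup silently treats the missing value as zero and nets off
# the negative one, so the aggregate looks plausible but is wrong.
naive_total = sum(r["exposure"] or 0.0 for r in records)

# A quality-aware rollup refuses to certify a figure built on bad atoms.
failures = {r["counterparty"]: issues for r in records if (issues := validate(r))}
if failures:
    print(f"Cannot certify aggregate of {naive_total:,.2f}: {failures}")
else:
    print(f"Certified total exposure: {naive_total:,.2f}")

However your own platform implements such checks, the point stands: the aggregate is only as trustworthy as the weakest atomic record feeding it.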

Another interesting shift is regulators' demand to assess not only the data supplied by firms but also the workings of their internal data management processes. Once again, this confirms the need for a far more holistic, platform-based approach to data quality management, one that demonstrates the commitment regulators want to see.

You may not operate in a regulated industry, but irrespective of your sector or business model, this new breed of directive represents a straightforward lesson in common sense that every organisation should study closely. If you decipher their meaning, you'll soon realise these directives, and the manner in which they are governed, offer a blueprint for your data quality vision.


About Author

Dylan Jones

Founder, Data Quality Pro and Data Migration Pro

Dylan Jones is the founder of Data Quality Pro and Data Migration Pro, popular online communities that provide a range of practical resources and support to their respective professions. Dylan has an extensive information management background and is a prolific publisher of expert articles and tutorials on all manner of data-related initiatives.
