One of the biggest challenges data quality leaders face is changing people's perception of data quality. A common misconception, for example, is that data quality is just another data processing activity.
If you have a data warehouse, you will almost certainly carry out some form of 'data cleanup' activity. This cleanup work may involve transforming rogue values and preparing the data for upload into the live warehouse and reporting environment.
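To make that concrete, here is a minimal sketch of the kind of tactical cleanup being described, assuming a pandas-based staging step; the column names, rogue values and mapping are all hypothetical:

```python
import pandas as pd

# Hypothetical warehouse extract with typical rogue values:
# inconsistent country labels, sentinel numbers and stray whitespace.
extract = pd.DataFrame({
    "country": [" uk", "UK", "United Kingdom", "U.K."],
    "revenue": ["1200", "N/A", "950", "-999"],
})

# Standardise country values against an agreed reference mapping.
country_map = {"UK": "GB", "UNITED KINGDOM": "GB", "U.K.": "GB"}
extract["country"] = extract["country"].str.strip().str.upper().replace(country_map)

# Convert revenue to numeric; junk becomes NaN, and the -999 sentinel
# is masked out so it can be reviewed rather than loaded silently.
extract["revenue"] = pd.to_numeric(extract["revenue"], errors="coerce")
extract["revenue"] = extract["revenue"].mask(extract["revenue"] < 0)

print(extract)
```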
In a CRM migration, your data processing may include a stage to deduplicate customer records prior to go-live.
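Again purely as an illustration, such a deduplication stage might look like the sketch below, which uses a naive exact match key; a real CRM migration would usually need fuzzy or probabilistic matching:

```python
import pandas as pd

# Hypothetical CRM extract; the names and emails are illustrative only.
customers = pd.DataFrame({
    "name":  ["Ann Lee", "ann lee", "Bob Ray"],
    "email": ["Ann.Lee@example.com", " ann.lee@example.com", "bob@example.com"],
})

# Normalise the email into a match key, keep the first record per key,
# then drop the helper column before go-live.
customers["match_key"] = customers["email"].str.strip().str.lower()
deduped = (
    customers.drop_duplicates(subset="match_key")
             .drop(columns="match_key")
)

print(deduped)
```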
It's easy to see why people view these activities as data quality: they improve the quality of the final information product.
One of my frustrations is when leaders treat these data processing activities as a 'data quality strategy'. The reality is that all repetitive transformation of data is tactical. There has to be a longer-term, strategic view, one that focuses on tackling the underlying causes of poor data quality.
If you don't move beyond cleanup to understand what caused the data to become defective, you will end up trapped in a kind of 'Sisyphus cycle', endlessly expending time and effort on the same task.
In a typical information chain, such as an ETL process, many repetitive tasks are perfectly valid. However, just because your ETL tool can perform data quality processing doesn't mean it should do so indefinitely.
I believe this is where data quality leadership has to take a stand. There needs to be a clear policy that outlines what data processing is permissible and what falls into the realm of data quality. By embedding data quality processing into a plethora of different applications, ETL tools and custom scripts, you create a situation where (a centralised alternative is sketched after this list):
- Ownership can't be assigned
- Root causes are never resolved
- Data quality rules are not controlled
- Monitoring and governance are impossible
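One way out, sketched below purely as an illustration (the module, rule names and owners are assumptions, not a real library), is to give the rules a single, owned, version-controlled home that every pipeline imports. Each rule then has a named owner, and there is one place to change, log and monitor it:

```python
# dq_rules.py - a single, owned home for data quality rules, instead of
# copies buried in every ETL job and script. All names are illustrative.
import re
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    name: str
    field: str
    owner: str                    # accountable owner, so ownership is assignable
    check: Callable[[str], bool]  # returns True when the value passes

RULES = [
    Rule("email_format", "email", "crm_team",
         lambda v: bool(re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", v))),
    Rule("country_code_iso2", "country", "reference_data_team",
         lambda v: v in {"GB", "US", "DE", "FR"}),
]

def run_rules(record: dict) -> list:
    """Return the names of failed rules - one point for logging and monitoring."""
    return [r.name for r in RULES
            if not r.check(str(record.get(r.field, "")))]

if __name__ == "__main__":
    print(run_rules({"email": "ann.lee@example", "country": "GB"}))
    # -> ['email_format']
```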
Data quality leaders need to look closely at these repetitive data quality processing activities and demonstrate the business case for using a range of measures to resolve them permanently. Ultimately, poor-quality data stems from a broken process that incurs repetitive waste. The role of the data quality team is to help the organisation devise better processes for root-cause defect prevention, not just create slicker ways to continually process data.
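As a hypothetical illustration of prevention rather than cure, the same shared rules could be enforced at the point of capture, so a defect is corrected by the person entering it rather than cleaned up downstream. This sketch reuses the run_rules helper assumed above:

```python
from dq_rules import run_rules  # the hypothetical shared module sketched earlier

def save_customer(record: dict, store: list) -> None:
    """Reject defective records at entry time instead of fixing them later."""
    failures = run_rules(record)
    if failures:
        # Surface the problem to whoever can fix it right now.
        raise ValueError(f"Correct before saving: {failures}")
    store.append(record)  # stand-in for a real persistence call

customers = []
save_customer({"email": "ann.lee@example.com", "country": "GB"}, customers)
# save_customer({"email": "not-an-email", "country": "GB"}, customers)
# -> ValueError: Correct before saving: ['email_format']
```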
When organisations move away from tactical data quality processing activities, they demonstrate growing maturity. This growth is critical to becoming a world-class organisation.
Remember that your approach to data quality can never stand still. There is always another step along the maturity curve to take. Each stage must deliver improvements that make the organisation smarter, faster, leaner and more efficient. The key is demonstrating the value of moving from each stage to the next.