Making accuracy auditable


Over the past several posts we have looked at framing the definition of accuracy as consistency with an agreed-to system of record, qualified by defined business rules. The remaining question is how to ensure that no point in the end-to-end information flows introduces data flaws that violate those quality rules (both the rules governing the system of record and the consistency rules for the data used for analysis).

Auditability of data accuracy relies on instituting inspection and monitoring across the critical paths of the information flows. This is often done using defined rules implemented within a data profiling tool. In that case, any time a business process potentially modifies the system of record, the set of rules can be applied and a business rule compliance score generated. Practically, though, you might not apply the scoring process every time the data possibly changes; it may be more feasible to run it at a regular interval (such as once every eight hours). If the score meets the level of acceptability for the business rules, the system of record can be deemed trustworthy.
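As a rough sketch of what such a rules-based compliance score might look like, the snippet below applies a set of validation rules to a batch of records and computes the fraction of checks that pass. The specific rules, field names, and the 0.98 acceptability threshold are all hypothetical illustrations, not anything prescribed by a particular profiling tool.

```python
from typing import Callable

# Hypothetical business rules: each maps a record to pass/fail.
Rule = Callable[[dict], bool]

RULES: list[Rule] = [
    lambda r: r.get("customer_id") is not None,        # completeness check
    lambda r: 0 <= r.get("discount_pct", 0) <= 100,    # range check
    lambda r: r.get("country") in {"US", "CA", "MX"},  # domain check
]

def compliance_score(records: list[dict], rules: list[Rule]) -> float:
    """Fraction of (record, rule) checks that pass."""
    checks = [rule(rec) for rec in records for rule in rules]
    return sum(checks) / len(checks) if checks else 1.0

ACCEPTABILITY_THRESHOLD = 0.98  # assumed service-level target

records = [
    {"customer_id": 1, "discount_pct": 10, "country": "US"},
    {"customer_id": None, "discount_pct": 150, "country": "US"},
]
score = compliance_score(records, RULES)  # 4 of 6 checks pass
trustworthy = score >= ACCEPTABILITY_THRESHOLD
```

In practice each rule would come from the agreed-to business rule definitions, and the threshold would be negotiated with the data consumers as part of the acceptability criteria.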

Similarly, prior to using data for analysis, its accuracy can be measured as consistency with the system of record. Again, if the measured score meets the level of acceptability, the analyzed data is deemed trustworthy. Most importantly, each run of the validation process adds to a historical track record of accuracy scores at different points in time. Any challenge to the accuracy of a report can be met (and easily dismissed) by showing the historical track record demonstrating that the data meets the agreed-to expectations.
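The two ideas above can be sketched together: score an analysis extract by how well it agrees with the system of record, then append each timestamped result to a log so the historical track record accumulates. The record keys, the audited fields, and the append-only JSON-lines log are illustrative assumptions, not a specific tool's format.

```python
import json
import os
import tempfile
import time

def consistency_score(analysis: dict, system_of_record: dict, fields: list) -> float:
    """Fraction of shared keys whose audited fields agree with the system of record."""
    keys = analysis.keys() & system_of_record.keys()
    if not keys:
        return 1.0
    matches = sum(
        all(analysis[k].get(f) == system_of_record[k].get(f) for f in fields)
        for k in keys
    )
    return matches / len(keys)

def record_score(history_path: str, source: str, score: float, threshold: float) -> dict:
    """Append a timestamped validation result so accuracy stays auditable over time."""
    entry = {
        "timestamp": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        "source": source,
        "score": score,
        "meets_expectation": score >= threshold,
    }
    with open(history_path, "a") as f:
        f.write(json.dumps(entry) + "\n")
    return entry

# Hypothetical system of record vs. an analysis extract of it.
sor = {"c1": {"balance": 100}, "c2": {"balance": 250}}
extract = {"c1": {"balance": 100}, "c2": {"balance": 240}}

score = consistency_score(extract, sor, ["balance"])  # one of two records agrees
history_path = os.path.join(tempfile.gettempdir(), "accuracy_history.jsonl")
entry = record_score(history_path, "analysis_extract", score, 0.98)
```

Replaying the log answers a challenge to any past report: each line shows what was measured, when, and whether it met the agreed-to expectation at that point in time.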


About Author

David Loshin

President, Knowledge Integrity, Inc.

David Loshin, president of Knowledge Integrity, Inc., is a recognized thought leader and expert consultant in the areas of data quality, master data management and business intelligence. David is a prolific author regarding data management best practices, via the expert channel at b-eye-network.com and numerous books, white papers, and web seminars on a variety of data management best practices. His book, Business Intelligence: The Savvy Manager’s Guide (June 2003) has been hailed as a resource allowing readers to “gain an understanding of business intelligence, business management disciplines, data warehousing and how all of the pieces work together.” His book, Master Data Management, has been endorsed by data management industry leaders, and his valuable MDM insights can be reviewed at mdmbook.com. David is also the author of The Practitioner’s Guide to Data Quality Improvement. He can be reached at loshin@knowledge-integrity.com.
