Using a master index and shared metadata as an entity system of record


A master data repository can be created by combining data from a variety of sources. A unified, consolidated record can then be presented for each identified entity by selecting the union of all source data attributes into a carefully crafted master data model.
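As a concrete illustration of selecting the union of source attributes into one consolidated record, here is a minimal sketch. The source systems ("crm", "billing"), the field names, and the "most recently updated value survives" rule are all illustrative assumptions, not a prescribed survivorship policy.

```python
# Hypothetical sketch: consolidating the source records resolved to one
# entity into a single master record. Source names, attributes, and the
# "latest timestamp wins" survivorship rule are illustrative assumptions.

from datetime import date

def consolidate(source_records):
    """Merge the union of attributes across all source records.

    When two sources supply the same attribute, the value from the more
    recently updated record survives (one simple survivorship rule).
    """
    master = {}
    for record in sorted(source_records, key=lambda r: r["last_updated"]):
        for attr, value in record.items():
            if attr != "last_updated" and value is not None:
                master[attr] = value  # newer records overwrite older values
    return master

crm = {"name": "Acme Corp", "phone": "555-0100",
       "last_updated": date(2011, 3, 1)}
billing = {"name": "ACME Corporation", "address": "12 Main St",
           "last_updated": date(2011, 6, 15)}

golden = consolidate([crm, billing])
# golden holds the union of attributes: the name from the newer billing
# record, the phone known only to crm, and the address only in billing
```

In practice the survivorship rules are governed per attribute (source precedence, data quality scores, and so on), which is exactly where the "carefully crafted" part of the master data model comes in.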

In the best of all possible worlds, the master data model sufficiently satisfies the union of all downstream data usability, synchronization, and quality requirements. Yet as source data volume and disparity grow, preemptive application of business rules may mask subtle variations that remain critical to downstream consumers.

That suggests two key aspects of the master data management repository that will allow it to continue to evolve in support of a growing downstream user base while remaining consistent with a governed enterprise canonical model. The first is using the master index itself as the key data asset, holding the right number of data attributes shared across multiple consuming applications in a semantically consistent way. The second is that while the data values in the different sources can be employed by the identity resolution components to establish the links for a specific entity type, as well as to establish relationships across entities (within and outside of the same entity type), it is not always necessary to collect all the data attributes until it is clear what the consuming applications need.
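The second aspect can be sketched as a thin master index that records which source records resolve to the same entity, holding only the linkage (and whatever shared attributes are governed into it) while deferring full attribute collection until a consumer asks for it. The class and field names below are illustrative, not any specific product's API.

```python
# Hypothetical sketch: a registry-style master index. It stores links from
# a resolved entity to pointers at the source records, rather than copying
# every source attribute up front. All names here are illustrative.

class MasterIndex:
    def __init__(self):
        # entity_id -> list of (source_system, source_key) pointers
        self._links = {}

    def link(self, entity_id, source_system, source_key):
        """Record that identity resolution matched a source record
        to this entity."""
        self._links.setdefault(entity_id, []).append(
            (source_system, source_key))

    def sources_for(self, entity_id):
        """Return pointers to the source records; attribute values are
        fetched from the systems of record only when a consuming
        application actually needs them."""
        return self._links.get(entity_id, [])

index = MasterIndex()
index.link("cust-001", "crm", "C-9843")
index.link("cust-001", "billing", "B-2210")
# index.sources_for("cust-001") returns both source-record pointers,
# which a federation layer could use to fetch attributes on demand
```

This registry style is one of the recognized MDM architecture patterns; the trade-off versus full consolidation is that reads may fan out to the source systems, which is precisely what the federation and virtualization services discussed next would mediate.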

Aligning the master data model with the enterprise canonical model and the definitions agreed to and managed within the enterprise business glossary and metadata repository will ensure consistency for managed shared master data at the identity level and the semantic level. As we will see next, this will enable enhanced services for both federation and virtualization.


About Author

David Loshin

President, Knowledge Integrity, Inc.

David Loshin, president of Knowledge Integrity, Inc., is a recognized thought leader and expert consultant in the areas of data quality, master data management and business intelligence. David is a prolific author regarding data management best practices, via the expert channel at b-eye-network.com and numerous books, white papers, and web seminars on a variety of data management best practices. His book, Business Intelligence: The Savvy Manager’s Guide (June 2003) has been hailed as a resource allowing readers to “gain an understanding of business intelligence, business management disciplines, data warehousing and how all of the pieces work together.” His book, Master Data Management, has been endorsed by data management industry leaders, and his valuable MDM insights can be reviewed at mdmbook.com. David is also the author of The Practitioner’s Guide to Data Quality Improvement. He can be reached at loshin@knowledge-integrity.com.
