Master data management – models, time and synchronization


I thought it might be worth taking a short break from discussing metadata and instead cycling back to an idea that has been challenging me recently in a few of our current MDM consulting engagements. I have been examining patterns for master data use, and one recurring theme is the use of a master index to establish a system of reference across enterprise systems of record that hold data for a master data entity.

For example, consider a master domain for vendor data. In this case the master data model for an index/registry may contain a set of data attributes that are either necessary for unique identification or have been selected as common attributes frequently shared among business application consumers. Different areas of the business may each manage their own source of record, such as the Credit, Finance and Accounting, and Fulfillment business functions.
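To make that concrete, here is a minimal sketch, in Python, of what a registry-style master record for a vendor domain might hold. The attribute names, source-system names, and sample values are hypothetical illustrations, not drawn from any particular tool or engagement.

```python
# A minimal sketch of a registry-style master record for the vendor domain.
# Attribute names, system names, and values are hypothetical.
from dataclasses import dataclass, field

@dataclass
class VendorMasterRecord:
    master_id: str                     # surrogate identifier assigned by the registry
    legal_name: str                    # attribute used for unique identification
    tax_id: str                        # attribute used for unique identification
    # common attributes selected because they are frequently shared among consumers
    primary_address: str = ""
    payment_terms: str = ""
    # cross-references: source system name -> that system's local vendor key
    source_keys: dict[str, str] = field(default_factory=dict)

# The same vendor as known to the Credit, Finance, and Fulfillment systems of record
acme = VendorMasterRecord(
    master_id="V-000123",
    legal_name="Acme Industrial Supply LLC",
    tax_id="12-3456789",
    payment_terms="Net 30",
    source_keys={"credit": "CR-9981", "finance": "AP-55012", "fulfillment": "SHIP-774"},
)
```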

The development of a model for the master index/registry typically follows a common thought process: define the entity concept, determine the attributes to be managed as part of that concept, specify a set of rules for data accumulation and identity resolution, then arrange for some component of a tool suite to populate the model. Many systems are configured this way: they ingest data sets extracted from the originating systems of record on a periodic basis, process the incoming records to determine whether they influence or change the inventory of identified master entities, and then publish the results to the consumer community.
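A rough sketch of that periodic flow, reusing the hypothetical VendorMasterRecord from the sketch above, might look like the following. The matching rule here (an exact tax ID match) is deliberately simplistic compared with the identity-resolution rules a real tool suite would apply.

```python
# A rough sketch of the periodic registry-population flow: ingest an extract,
# resolve each record against the current inventory of master entities, and
# collect the changes to publish. Reuses VendorMasterRecord from the sketch above.
def resolve_batch(extract, registry):
    """extract: list of dicts from one source system; registry: dict keyed by tax_id."""
    changes = []
    for rec in extract:
        key = rec["tax_id"]
        master = registry.get(key)
        if master is None:
            # No existing master entity matched: create a new one.
            master = VendorMasterRecord(
                master_id=f"V-{len(registry) + 1:06d}",
                legal_name=rec["legal_name"],
                tax_id=key,
            )
            registry[key] = master
            changes.append(("created", master.master_id))
        # Record the cross-reference back to the originating system of record.
        master.source_keys[rec["source_system"]] = rec["local_key"]
        changes.append(("linked", master.master_id, rec["source_system"]))
    return changes  # in practice, these changes would be published to consumers
```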

However, I have heard reports that users of the master entity data raise questions about the quality of the master data – its completeness and accuracy, but more often than not its consistency. What they mean by consistency is alignment or harmonization with the entity’s data in one or more systems of record.
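As a small illustration of that complaint, a consistency check essentially compares what the master index holds against what a system of record currently holds. The helper below is hypothetical and reuses the acme record from the first sketch.

```python
# A small illustration of the consistency question: compare selected master
# attributes against a fresh snapshot from one system of record.
def consistency_gaps(master, source_snapshot, attributes):
    """Return {attribute: (master_value, source_value)} where the two disagree."""
    gaps = {}
    for attr in attributes:
        master_value = getattr(master, attr)
        source_value = source_snapshot.get(attr)
        if source_value is not None and master_value != source_value:
            gaps[attr] = (master_value, source_value)
    return gaps

# e.g. Finance has already moved this vendor to Net 45 terms, but the master
# index still carries the older value captured in the last extract:
print(consistency_gaps(acme, {"payment_terms": "Net 45"}, ["payment_terms", "primary_address"]))
# -> {'payment_terms': ('Net 30', 'Net 45')}
```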

After reviewing some of the processes and talking with various folks involved in them, it occurred to me that our conception of the master model, and our processes for populating the data elements in that model, may be biased by our ignorance of transaction synchronization. In my next posts I will discuss the concept of transaction synchronization and share some thoughts about why it must be preserved as part of the MDM process…


About Author

David Loshin

President, Knowledge Integrity, Inc.

David Loshin, president of Knowledge Integrity, Inc., is a recognized thought leader and expert consultant in the areas of data quality, master data management and business intelligence. David is a prolific author regarding data management best practices, via the expert channel at b-eye-network.com and numerous books, white papers, and web seminars on a variety of data management best practices. His book, Business Intelligence: The Savvy Manager’s Guide (June 2003), has been hailed as a resource allowing readers to “gain an understanding of business intelligence, business management disciplines, data warehousing and how all of the pieces work together.” His book, Master Data Management, has been endorsed by data management industry leaders, and his valuable MDM insights can be reviewed at mdmbook.com. David is also the author of The Practitioner’s Guide to Data Quality Improvement. He can be reached at loshin@knowledge-integrity.com.
