The master entity model’s need for synchrony


Transaction systems that feed master data repositories maintain a degree of synchronization that ensures proper transaction execution. A transaction processing system generally looks at only a few records at a time, and updates are committed so that they do not interfere with other transactions. Transactions within each subsystem are isolated from those in other transaction processing subsystems, and that isolation guarantees freedom from interference.

Master data management entity models are designed with the presumption that updates and modifications are similarly protected from interference. However, if the inputs are not presented in the same sequence in which the original transaction systems committed them, there is a risk of inaccurate, inconsistent, or incomplete data being committed to the master index registry. And when multiple transaction system extracts are processed by the MDM system in a single batch, it is nearly certain that the original synchronization order has been violated, raising the likelihood that data fidelity expectations will not be met.
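
To make the risk concrete, here is a minimal sketch (in Python, with invented record layouts and field values) of three updates to the same customer entity arriving from two source systems. Replayed in original event order they produce one master record; replayed in the order a batch extract happens to deliver them, they produce another:

```python
# Hypothetical illustration: the same three updates to one customer record,
# applied in original transaction order vs. batch-extract arrival order.
# Field names, timestamps, and values are invented for this sketch.

def apply_updates(record, updates):
    """Apply a sequence of attribute updates to a master record, last write wins."""
    for update in updates:
        record.update(update["attributes"])
    return record

# Original transaction order across two source systems (by real event time)
transaction_order = [
    {"ts": "2012-03-01T09:00", "attributes": {"address": "12 Oak St"}},   # CRM
    {"ts": "2012-03-01T11:30", "attributes": {"address": "98 Elm Ave"}},  # order system
    {"ts": "2012-03-01T14:15", "attributes": {"status": "active"}},       # CRM
]

# Batch ingestion order: each system's extract is processed as a unit,
# so both CRM updates arrive together, after the order-system update
batch_order = [transaction_order[1], transaction_order[0], transaction_order[2]]

print(apply_updates({}, transaction_order))  # {'address': '98 Elm Ave', 'status': 'active'}
print(apply_updates({}, batch_order))        # {'address': '12 Oak St', 'status': 'active'} -- stale address committed
```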

To address this challenge, MDM architects must adjust the way they think about master data ingestion and master index integration. First, recognize that the static representation of a master model treated as an “enterprise” asset is inconsistent with the fluid, “dynamic” progression of the data lifecycle in the transaction systems. Some attempt must be made to assess the data lifecycle events across the application systems from which master data is sourced – and to determine where synchronization dependencies must be maintained before updating the master entity registry.

Second, use that knowledge of the data lifecycle to differentiate between the master data attributes that are inherently “pseudo-static” and those that are dynamic. If there is a risk of out-of-order updates, segregate the data sets that feed the MDM system, either by process phase or by timestamp. You can even attempt to preprocess the inputs and reorder them so that they are consistent with the original sequence, as sketched below. Or you can integrate updates to the master registry directly within the transaction systems, which ensures synchronization.
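
As one illustration of that preprocessing option, the following sketch merges per-system extracts and replays them in original event order before anything is committed to the master registry. It assumes each extract record carries a reliable source timestamp; the record layout and field names are hypothetical:

```python
# A minimal sketch of the "preprocess and reorder" option: merge the extracts
# from multiple source systems and sort by the original event timestamp before
# applying anything to the master registry. Assumes each record carries a
# comparable source timestamp ("ts"); layouts are illustrative only.

from operator import itemgetter

def merge_extracts_in_event_order(*extracts):
    """Combine per-system extract files into one stream ordered by event time."""
    combined = [rec for extract in extracts for rec in extract]
    return sorted(combined, key=itemgetter("ts"))

crm_extract = [
    {"ts": "2012-03-01T09:00", "entity_id": "C1001", "attributes": {"address": "12 Oak St"}},
    {"ts": "2012-03-01T14:15", "entity_id": "C1001", "attributes": {"status": "active"}},
]
order_extract = [
    {"ts": "2012-03-01T11:30", "entity_id": "C1001", "attributes": {"address": "98 Elm Ave"}},
]

for rec in merge_extracts_in_event_order(crm_extract, order_extract):
    print(rec["ts"], rec["attributes"])  # updates now replay in the original sequence
```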

Seeking to eliminate a source of data flaws introduced through the batch data ingestion process for MDM will lead to increased trust in the fidelity of the master data asset.


About Author

David Loshin

President, Knowledge Integrity, Inc.

David Loshin, president of Knowledge Integrity, Inc., is a recognized thought leader and expert consultant in the areas of data quality, master data management and business intelligence. David is a prolific author regarding data management best practices, via the expert channel at b-eye-network.com and numerous books, white papers, and web seminars on a variety of data management best practices. His book, Business Intelligence: The Savvy Manager’s Guide (June 2003) has been hailed as a resource allowing readers to “gain an understanding of business intelligence, business management disciplines, data warehousing and how all of the pieces work together.” His book, Master Data Management, has been endorsed by data management industry leaders, and his valuable MDM insights can be reviewed at mdmbook.com. David is also the author of The Practitioner’s Guide to Data Quality Improvement. He can be reached at loshin@knowledge-integrity.com.
