Master data synchronization and eventual consistency

Periodic synchronization of your master data environment means batching up new entries to be processed all at once on a schedule, while immediate synchronization means that any new entity brought into one of the enterprise systems is added to the master index right away (a minimal sketch contrasting the two approaches follows the list below). There are benefits and drawbacks to both, but to determine which one is right for your organization, you have to ask questions about the degree of interaction and dependency across the application landscape, such as:

  • How frequently are new entities added into the environment?
  • How many different applications touch the master entity data?
  • At which process points do these applications touch master entity data, and what are the dependencies among the different process points?
  • What are the time frames or delays associated with these process points?
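
To make the contrast concrete, here is a minimal Python sketch of the two approaches. The master_index, sync_immediately, and PeriodicSynchronizer names are illustrative assumptions for this article, not references to any particular MDM product or API.

```python
import threading

# Hypothetical master index mapping entity IDs to their attributes. In a
# real MDM hub this would be a matching/survivorship service, not a dict.
master_index: dict[str, dict] = {}


def sync_immediately(entity_id: str, attributes: dict) -> None:
    """Immediate synchronization: a source system pushes each new entity
    into the master index as soon as it is created."""
    master_index[entity_id] = attributes  # visible to all consumers at once


class PeriodicSynchronizer:
    """Periodic synchronization: new entities accumulate in a pending
    batch and are applied to the master index on a schedule."""

    def __init__(self, period_seconds: float) -> None:
        self.period_seconds = period_seconds  # the tunable window
        self._pending: dict[str, dict] = {}
        self._lock = threading.Lock()

    def record(self, entity_id: str, attributes: dict) -> None:
        """Called by source systems; the entity is not yet visible."""
        with self._lock:
            self._pending[entity_id] = attributes

    def run_once(self) -> None:
        """Apply the accumulated batch; in production this would be
        triggered by a scheduler at each period boundary."""
        with self._lock:
            batch, self._pending = self._pending, {}
        master_index.update(batch)
```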

The idea is to determine whether there is a need for immediate synchronization, or whether periodic synchronization is sufficient to satisfy business needs without introducing additional confusion and complexity. If the latter is true, the next step is to determine the length of the period, ensuring that even if new entities are not added in real time, they are incorporated into the master index within a predictable window that the consuming applications can rely on.

This concept is an example of “eventual consistency”: a model that tolerates delays in propagating updates, provided those delays do not interfere with correct process execution. And of course, the period can be tuned to the needs of the applications, which provides flexibility should demands change over time. Adopting a strategy of eventual consistency for master data synchronization allows you to finesse some of the synchronization challenges while providing a level of service suited to the enterprise.
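
Building on the hypothetical PeriodicSynchronizer from the earlier sketch, the fragment below makes the consistency window explicit: the synchronization period itself becomes the staleness bound that consuming applications can plan around, and it can be retuned as demands change. Again, this is an illustration under assumed names, not a prescribed implementation.

```python
class EventuallyConsistentIndex:
    """Illustrative wrapper that exposes the consistency window so each
    consuming application can decide whether the worst-case delay is
    acceptable at its process points."""

    def __init__(self, synchronizer: "PeriodicSynchronizer") -> None:
        # PeriodicSynchronizer is the class from the earlier sketch.
        self._synchronizer = synchronizer

    def sync(self) -> None:
        """Run at each period boundary, e.g., from a cron-style scheduler."""
        self._synchronizer.run_once()

    def max_staleness_seconds(self) -> float:
        """Worst case: an entity recorded just after a batch runs waits a
        full period before it appears in the master index."""
        return self._synchronizer.period_seconds

    def retune(self, new_period_seconds: float) -> None:
        """Tighten or relax the window as application demands change."""
        self._synchronizer.period_seconds = new_period_seconds
```

With a fifteen-minute period, for example, every consuming application knows that a newly registered entity will appear in the master index within fifteen minutes; if a downstream process point cannot tolerate that delay, the window can be tightened rather than abandoning the periodic approach altogether.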

About Author

David Loshin

President, Knowledge Integrity, Inc.

David Loshin, president of Knowledge Integrity, Inc., is a recognized thought leader and expert consultant in the areas of data quality, master data management and business intelligence. David is a prolific author on data management best practices, writing via the expert channel at b-eye-network.com and in numerous books, white papers, and web seminars. His book Business Intelligence: The Savvy Manager’s Guide (June 2003) has been hailed as a resource allowing readers to “gain an understanding of business intelligence, business management disciplines, data warehousing and how all of the pieces work together.” His book Master Data Management has been endorsed by data management industry leaders, and his valuable MDM insights can be reviewed at mdmbook.com. David is also the author of The Practitioner’s Guide to Data Quality Improvement. He can be reached at loshin@knowledge-integrity.com.
