Technical requirements for MDM/DW Integration


Master data not only serves as an index and registry for the entities managed within data warehouses; it can also be used as a tool to drive and support the consolidation of multiple data warehouses into a single, unified data warehouse.

Assessing the dependence between the data warehouse applications and the master data entities and attributes sheds light on the commonalities among different data warehouses and guides the use of the master index for unifying siloed master entity views. Those views can then be serviced directly from the master repository instead of from inadvertent replicas copied into each warehouse.

That suggests another set of requirements for integration, focusing on the operational and technical aspects of a master data environment. These technical requirements are driven by the need to comply with the dependence, model and performance requirements, such as:

  • Requirements for interaction with the MDM repository during consolidation, population and loading of data warehouses
  • Requirements for MDM services during interactive uses of the data warehouse
  • Requirements for MDM services performed on a periodic basis (e.g. nightly synchronization)
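
To make the first of these requirements concrete, here is a minimal sketch of how a warehouse load might interact with a master index: each incoming record is resolved to a master identifier before it lands in the warehouse. The `MasterIndex` class, its matching rule (exact match on a normalized name), and the record layout are illustrative assumptions, not any specific MDM product's API.

```python
# Hypothetical sketch: resolving incoming records against a master index
# during a data warehouse load, so every record carries a master ID.

class MasterIndex:
    """Toy master index: maps a normalized identifying key to a master ID."""

    def __init__(self):
        self._index = {}
        self._next_id = 1

    @staticmethod
    def _normalize(name: str) -> str:
        # Collapse whitespace and case so trivial variants match.
        return " ".join(name.lower().split())

    def resolve(self, name: str) -> int:
        """Return the master ID for a name, registering it if unseen."""
        key = self._normalize(name)
        if key not in self._index:
            self._index[key] = self._next_id
            self._next_id += 1
        return self._index[key]


def load_into_warehouse(source_records, index):
    """Attach a master ID to each incoming record before loading."""
    return [
        {**rec, "master_id": index.resolve(rec["customer_name"])}
        for rec in source_records
    ]


records = [
    {"customer_name": "Jane Smith", "amount": 100},
    {"customer_name": "  jane  smith ", "amount": 50},  # same entity, messy form
    {"customer_name": "John Doe", "amount": 75},
]
loaded = load_into_warehouse(records, MasterIndex())
# The two Jane Smith variants resolve to the same master_id.
```

In practice the matching rule would be a full identity-resolution service rather than a normalized string lookup, but the interaction pattern during the load is the same.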

However, probably the biggest technical driver is the ease of use, precision and accuracy of identity resolution. Integration with a data warehouse implies integration with some number of downstream business processes that may rely on aggregation and summarization based on hierarchies of unique entities. Simple reporting by individual, location and time can be thrown off when multiple records for the same individual (or location) are not matched prior to creating the report. Complex activities such as spend analysis or fraud detection are highly reliant on uniquely identifying (respectively) products and individuals.
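
The reporting problem described above can be shown in a few lines: the same individual appears under two source names, and a per-individual spend report is only correct when the report groups by a resolved master ID rather than the raw name. The records and the resolution map are invented for the example.

```python
# Illustrative only: how unresolved duplicates distort a per-individual
# report, and how grouping by a resolved master ID corrects it.

from collections import defaultdict

rows = [
    {"name": "J. Smith", "spend": 400},
    {"name": "Jane Smith", "spend": 600},  # same person, different source system
    {"name": "John Doe", "spend": 300},
]

# Assumed output of an identity-resolution step: raw name -> master ID.
resolved = {"J. Smith": "M1", "Jane Smith": "M1", "John Doe": "M2"}


def spend_report(rows, key_fn):
    """Total spend per reporting key."""
    totals = defaultdict(int)
    for row in rows:
        totals[key_fn(row)] += row["spend"]
    return dict(totals)


by_raw_name = spend_report(rows, lambda r: r["name"])
by_master_id = spend_report(rows, lambda r: resolved[r["name"]])
# by_raw_name splits Jane Smith's spend across two "individuals";
# by_master_id correctly totals M1 at 1000.
```

The same distortion, scaled up, is what undermines spend analysis and fraud detection when products or individuals are not uniquely identified.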

Therefore, it is critical as part of any MDM/DW integration project plan to engage the end users early in the process to solicit information about how master data entities are located, read and updated as part of the business processes. The technical requirements for managing the underlying master data services are driven by functional interaction needs bolstered by the proper levels of accuracy and precision in the underlying identity resolution algorithms.


About Author

David Loshin

President, Knowledge Integrity, Inc.

David Loshin, president of Knowledge Integrity, Inc., is a recognized thought leader and expert consultant in the areas of data quality, master data management and business intelligence. David is a prolific author regarding data management best practices, via the expert channel at b-eye-network.com and numerous books, white papers, and web seminars on a variety of data management best practices. His book, Business Intelligence: The Savvy Manager’s Guide (June 2003) has been hailed as a resource allowing readers to “gain an understanding of business intelligence, business management disciplines, data warehousing and how all of the pieces work together.” His book, Master Data Management, has been endorsed by data management industry leaders, and his valuable MDM insights can be reviewed at mdmbook.com . David is also the author of The Practitioner’s Guide to Data Quality Improvement. He can be reached at loshin@knowledge-integrity.com.
