Considering master data dependence and usability requirements


Last time we began to look at the opportunity for integrating a newly developed master data repository with existing business applications supported by a data warehouse. That meant understanding the requirements for integration and verifying that those requirements are met.

Verifying that requirements are observed involves specifying the variables to be measured, the method of measurement, and a level of acceptability for each measure. For integrating a data warehouse with an MDM system, this means identifying where the data warehouse applications touch master data, noting the type and the constraints of each relationship, and then determining that the master data dependence relationships are observed. Those dependence relationships boil down to whether the master data entity's values are searched for and located, read, and/or updated.
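One way to make those dependence relationships checkable is to inventory each touch point along with its permitted access modes. The following is a minimal Python sketch of that idea; all application and entity names, and the `violations` helper, are illustrative assumptions, not part of any particular MDM product:

```python
from dataclasses import dataclass
from enum import Flag, auto

class Access(Flag):
    """Ways a consuming application can touch a master data entity."""
    SEARCH = auto()  # values are searched for and located
    READ = auto()    # values are read
    UPDATE = auto()  # values are updated

@dataclass(frozen=True)
class Dependence:
    """One master data dependence relationship for a warehouse application."""
    application: str  # hypothetical consuming application name
    entity: str       # master data entity, e.g. "Customer"
    access: Access    # access modes the relationship permits

# Hypothetical inventory of touch points between warehouse apps and master data
inventory = [
    Dependence("sales_reporting", "Customer", Access.SEARCH | Access.READ),
    Dependence("crm_sync", "Customer", Access.READ | Access.UPDATE),
]

def violations(observed: Access, dep: Dependence) -> Access:
    """Access modes actually used but not permitted by the declared relationship."""
    return observed & ~dep.access

# sales_reporting is declared search/read only, so an observed UPDATE is flagged
print(violations(Access.READ | Access.UPDATE, inventory[0]))
```

Recording observed accesses against such an inventory is one way to turn "the dependence relationships are observed" into a measurable assertion rather than a design-time assumption.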

These assertions and dependences are in addition to typical expectations for data quality. Rather than looking at the level of trust of the specific values, they suggest expectations regarding the level of integration across the different consuming applications. Some examples are:

  • Consistency – Are the data values associated with master data entities stored in the data warehouse consistent with the corresponding master data entity values in the master data repository?
  • Synchronization period – How frequently is there a data refresh performed between the two systems?
  • Timeliness – Is the data in the data warehouse delivered from the master data repository within the specified time period (e.g. on a daily basis)?
  • Coherence – Are other copies of data values taken from the master data repository coherent with the copy in the data warehouse?
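These expectations only become enforceable once each one is paired with a concrete measurement. Here is a minimal Python sketch of how the consistency and timeliness measures might be computed over extracts from the two systems; the record layout, key values, and one-day refresh threshold are all illustrative assumptions:

```python
from datetime import datetime, timedelta

# Hypothetical extracts: entity key -> (attribute value, last refresh timestamp)
master = {"C1001": ("Acme Corp", datetime(2024, 1, 2, 6, 0))}
warehouse = {"C1001": ("Acme Corp.", datetime(2024, 1, 1, 6, 0))}

def check_consistency(master, warehouse):
    """Keys whose warehouse value differs from the master repository value."""
    return {k for k, (value, _) in warehouse.items()
            if k in master and master[k][0] != value}

def check_timeliness(master, warehouse, max_lag=timedelta(days=1)):
    """Keys whose warehouse copy lags the master repository by more
    than the agreed synchronization period."""
    return {k for k, (_, refreshed) in warehouse.items()
            if k in master and master[k][1] - refreshed > max_lag}

# "Acme Corp." differs from "Acme Corp", so C1001 fails the consistency check
print(check_consistency(master, warehouse))
# The warehouse copy is exactly one day old, within the assumed daily period
print(check_timeliness(master, warehouse))
```

A synchronization-period or coherence check would follow the same pattern, comparing refresh timestamps or additional downstream copies against agreed thresholds.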

The real questions focus on whether data warehouse applications access master data entities, and on the parameters for ensuring that the collection of accesses is consistent and that results can be replicated. Next time: master data model requirements.


About the Author

David Loshin

President, Knowledge Integrity, Inc.

David Loshin, president of Knowledge Integrity, Inc., is a recognized thought leader and expert consultant in the areas of data quality, master data management and business intelligence. David is a prolific author regarding data management best practices, via the expert channel and numerous books, white papers, and web seminars on a variety of data management best practices. His book, Business Intelligence: The Savvy Manager’s Guide (June 2003) has been hailed as a resource allowing readers to “gain an understanding of business intelligence, business management disciplines, data warehousing and how all of the pieces work together.” His book, Master Data Management, has been endorsed by data management industry leaders. David is also the author of The Practitioner’s Guide to Data Quality Improvement.
