What really happens when a new entity is added to a master data environment? First of all, this happens only once it has been determined that the entity is not already known to exist within the system. Abstractly, a significant amount of the process involves adding the new entity into...
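As a rough sketch of that decision flow, consider the toy example below. The `MasterIndex` class, the exact-match rule standing in for real probabilistic matching, and the UUID-based identifiers are all my own simplifying assumptions, not a prescribed design:

```python
from dataclasses import dataclass, field
from typing import Optional
import uuid

@dataclass
class MasterIndex:
    """Hypothetical master index mapping unique identifiers to entity records."""
    records: dict = field(default_factory=dict)

def resolve(index: MasterIndex, candidate: dict) -> Optional[str]:
    """Toy matching rule: an exact match on name and birth date stands in
    for a real probabilistic identity resolution engine."""
    for uid, rec in index.records.items():
        if (rec["name"], rec["birth_date"]) == (candidate["name"], candidate["birth_date"]):
            return uid
    return None

def add_entity(index: MasterIndex, candidate: dict) -> str:
    """Add the entity only if identity resolution finds no existing match."""
    uid = resolve(index, candidate)
    if uid is not None:
        return uid                   # already known: reuse its identifier
    uid = str(uuid.uuid4())          # not known: mint a new unique identifier
    index.records[uid] = candidate
    return uid

index = MasterIndex()
first = add_entity(index, {"name": "Pat Smith", "birth_date": "1970-01-01"})
second = add_entity(index, {"name": "Pat Smith", "birth_date": "1970-01-01"})
assert first == second  # the second add resolves to the existing entity
```

The point is the ordering: resolution happens first, and a new identifier is minted only when no existing match is found.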
In my last post, I introduced a number of questions that might be raised during a master data integration project, and I suggested that synchronization was the underlying subtext at the core of each of those issues. It is worth considering an example application to illustrate those...
A general paradigm for a master data management solution incorporates three operational components: an identity resolution engine, a master index, and a master entity data repository. Conceptually, the identity resolution engine provides two core capabilities: the creation and management of unique identifiers associated with uniquely identified entities, and a matching capability...
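To make those two capabilities concrete, here is a minimal sketch; the class name, the `difflib`-based similarity scoring, and the 0.9 threshold are illustrative stand-ins for a real matching engine, not how any particular product does it:

```python
import difflib
import uuid
from typing import Optional

class IdentityResolutionEngine:
    """Sketch of the two core capabilities: identifier management and matching."""

    def __init__(self, threshold: float = 0.9):
        self.threshold = threshold
        self.known: dict = {}  # unique identifier -> canonical name

    def register(self, name: str) -> str:
        """Identifier management: mint and track a new unique identifier."""
        uid = str(uuid.uuid4())
        self.known[uid] = name
        return uid

    def match(self, name: str) -> Optional[str]:
        """Matching: return the identifier of the best-scoring known entity,
        or None when nothing clears the similarity threshold."""
        best_uid, best_score = None, 0.0
        for uid, known_name in self.known.items():
            score = difflib.SequenceMatcher(None, name.lower(), known_name.lower()).ratio()
            if score > best_score:
                best_uid, best_score = uid, score
        return best_uid if best_score >= self.threshold else None

engine = IdentityResolutionEngine()
uid = engine.register("Acme Corporation")
print(engine.match("Acme Corporation") == uid)  # True: clears the threshold
print(engine.match("Globex Ltd"))               # None: no plausible match
```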
If you have been following this thread for a while, you may have noticed a theme that I keep bringing up: data virtualization. I'm trying to address a potential gap in the integration plan: understanding the performance requirements for data access (especially when the application and database services are expected...
To a great extent, the data manipulation layer of our multi-tiered master data services mimics the capabilities of the application services discussed in the previous posting. However, the value of segregating the actual data manipulation from the application-facing API is that the latter can be developed within the consuming application’s...
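A small sketch of that segregation, with hypothetical names: a shared `MasterDataStore` does the raw data manipulation, while a CRM-flavored `CrmCustomerApi` exposes it in the consuming application's own vocabulary:

```python
class MasterDataStore:
    """Data manipulation layer: raw reads and writes against the master repository."""

    def __init__(self):
        self._rows: dict = {}

    def upsert(self, uid: str, attributes: dict) -> None:
        self._rows.setdefault(uid, {}).update(attributes)

    def fetch(self, uid: str) -> dict:
        return dict(self._rows.get(uid, {}))

class CrmCustomerApi:
    """Application-facing API, phrased in one consuming application's own
    vocabulary (here, an imagined CRM); it only delegates to the shared layer."""

    def __init__(self, store: MasterDataStore):
        self._store = store

    def save_customer(self, customer_id: str, full_name: str, email: str) -> None:
        self._store.upsert(customer_id, {"name": full_name, "email": email})

    def get_customer(self, customer_id: str) -> dict:
        return self._store.fetch(customer_id)

store = MasterDataStore()
crm = CrmCustomerApi(store)
crm.save_customer("c-001", "Pat Smith", "pat@example.com")
print(crm.get_customer("c-001"))  # {'name': 'Pat Smith', 'email': 'pat@example.com'}
```

The CRM-facing names can evolve with that one application while the shared manipulation layer stays stable for every other consumer.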
Last time we started to discuss the strategy for transitioning applications to master data services. At the top of our master data services stack, we have the external- or application-facing capabilities. But first, let’s review the lifecycle of data about entities, namely: creating a new entity record, reading...
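Assuming the lifecycle continues through the usual reading, updating, and retiring steps, a minimal sketch of those operations might look like this (the `MasterDataService` name and the soft-retire flag are my own assumptions):

```python
from dataclasses import dataclass
import uuid

@dataclass
class EntityRecord:
    uid: str
    attributes: dict
    active: bool = True

class MasterDataService:
    """Sketch of the entity record lifecycle: create, read, update, retire."""

    def __init__(self):
        self._records: dict = {}

    def create(self, attributes: dict) -> str:
        uid = str(uuid.uuid4())
        self._records[uid] = EntityRecord(uid, dict(attributes))
        return uid

    def read(self, uid: str) -> EntityRecord:
        return self._records[uid]

    def update(self, uid: str, changes: dict) -> None:
        self._records[uid].attributes.update(changes)

    def retire(self, uid: str) -> None:
        # master records are rarely deleted outright; flag them inactive instead
        self._records[uid].active = False

svc = MasterDataService()
uid = svc.create({"name": "Pat Smith"})
svc.update(uid, {"email": "pat@example.com"})
svc.retire(uid)
print(svc.read(uid).active)  # False
```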
In the last two series of posts we have been discussing the challenges of integrating applications with a maturing master data management (MDM) repository and index, and an approach intended to let existing applications incrementally adopt MDM. This approach involves developing a tiered architecture...
Sometimes you have to get small to win big. SAS Data Management breaks solution capabilities into smaller chunks – and deploys services as needed – to help customers reduce their total cost of ownership. SAS Master Data Management (MDM) is also a pioneer in "phased MDM." It's built on top of a data...
In the big data era, I hear a lot about new and dynamic data sources that are giving companies a wide range of opportunities – and anxiety. When industry wonks talk about the “speeds and feeds” inherent in big data, they are often talking about an avalanche of transactional or...
Last time we discussed two different models for syndicating master data. One model replicates copies of the master data and pushes them out to the consuming applications, while the other creates a virtual layer on top of the master data in its repository and funnels access through a...
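A toy contrast of the two models, with hypothetical class names: the replicated copy is stale between refreshes, while the virtual layer pays a per-call cost to stay current:

```python
from abc import ABC, abstractmethod

class MasterSource:
    """Stand-in for the master data repository."""
    def __init__(self):
        self.data: dict = {}

class SyndicationModel(ABC):
    @abstractmethod
    def lookup(self, uid: str) -> dict: ...

class ReplicatedCopy(SyndicationModel):
    """Model 1: replicate master data and push copies out to the consumer."""
    def __init__(self, source: MasterSource):
        self._local: dict = {}
        self.refresh(source)

    def refresh(self, source: MasterSource) -> None:
        # the periodic push; the consumer reads locally between refreshes
        self._local = {k: dict(v) for k, v in source.data.items()}

    def lookup(self, uid: str) -> dict:
        return self._local[uid]

class VirtualLayer(SyndicationModel):
    """Model 2: a virtual layer funnels every access back to the repository."""
    def __init__(self, source: MasterSource):
        self._source = source

    def lookup(self, uid: str) -> dict:
        return dict(self._source.data[uid])  # always current, at per-call cost

master = MasterSource()
master.data["c-001"] = {"name": "Pat Smith"}
copy, virtual = ReplicatedCopy(master), VirtualLayer(master)
master.data["c-001"]["name"] = "Patricia Smith"   # master changes after the push
print(copy.lookup("c-001")["name"])     # 'Pat Smith' until the next refresh
print(virtual.lookup("c-001")["name"])  # 'Patricia Smith' immediately
```

Run side by side, the two lookups make the synchronization tradeoff visible.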