The Data Roundtable
A community of data management experts

In my last post, I discussed the issue of temporal inconsistency in master data: the records in the master repository become inconsistent with the source systems because of a time-based lapse in synchronization. Periodic master data updates that pull data from systems without considering alignment with in-process
The numbers are daunting. More than 40 million Americans have their identities stolen each year. Credit card companies lose more than $200 billion annually due to fraud. Cybercrime-related losses exceed $3 million per claim for large companies. If you’re like me, those stats are enough to give pause. To fuel the concern,
Master data management (MDM) provides methods for unifying data about important entities (such as "customer" or "product") that are managed within independent systems. In most cases, there is some kind of customer data integration requirement for downstream reporting and analysis tied to a specific business objective – such as customer profiling for
I've spent much of my career managing the quality of data after it was moved from its sources to a central location, such as an enterprise data warehouse. Nowadays, not only do we have a lot more data – a lot of it is also in motion. One of the