
David Loshin
The challenge of synchronizing the master index

What really happens when a new entity is added to a master data environment? First of all, this only happens once it has been determined that the entity is not already known within the system. Abstractly, a significant amount of the process involves adding the new entity into

Dylan Jones
Mass inspections are a means to a positive end

One of the key principles of W. Edwards Deming in his drive for greater quality in the US manufacturing industry was: “Cease dependence on inspection to achieve quality. Eliminate the need for massive inspection by building quality into the product in the first place.” Quality purists will repeatedly quote this principle

Phil Simon
Visualizing the next WhatsApp

There's been no shortage of eye-opening mergers and acquisitions activity in the news lately. Tumblr might have been the Trojan horse of the more recent and expensive M&A activity; Oculus VR, Nest, and WhatsApp have made many Wall Street "experts" ask the bubble question. With all this activity, one has to wonder if

Jim Harris
A double take on sampling

My previous post made the point that it’s not a matter of whether it is good for you to use samples, but how good the sample you are using is. The comments on that post raised two different, and valid, perspectives about sampling. These viewpoints reflected two different use cases for data,

David Loshin
Managing the master index: batch vs. real-time

In my last post, I introduced a number of questions that might be raised during a master data integration project, and I suggested that the underlying subtext of synchronization lay at the core of each of those issues. It is worth considering an example of an application to illustrate those

Dylan Jones
Functions are the missing data migration gap

Data migration projects are really a lesson in gap management. We are effectively building a complex web of technology and understanding to bridge the gulf between the legacy world and the target world. Most of the gap management in data migration revolves around solving data quality and mapping issues. We

Phil Simon
The case for contemporary dataviz and big data

We live in an era in which it's not terribly difficult for companies to ape many of their competitors' products and services, especially digital ones. For relatively small amounts of money (compared to years past), an organization can more or less mimic another's raison d'être and even specific functionality. As for design,

Jim Harris
Survey says sampling still sensible

In my previous post, I discussed sampling error (i.e., when a randomly chosen sample doesn’t reflect the underlying population, aka margin of error) and sampling bias (i.e., when the sample isn’t randomly chosen at all), both of which big data advocates often claim can, and should, be overcome by using all the data. In this

David Loshin
Integrating master services into the application environment

A general paradigm for a master data management solution incorporates three operational components: an identity resolution engine, a master index, and a master entity data repository. Conceptually, the identity resolution engine satisfies two core capabilities: the creation and management of unique identifiers associated with uniquely identified entities, and a matching capability
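
A minimal sketch of how those three components might fit together is shown below. The class names, the exact-match rule on a normalized name, and the use of UUIDs are illustrative assumptions for this excerpt, not the design described in the full post.

```python
# Illustrative sketch only: three MDM components wired together.
import uuid


class MasterIndex:
    """Maps a match key (a normalized name, in this sketch) to a unique identifier."""
    def __init__(self):
        self._index = {}

    def lookup(self, match_key):
        return self._index.get(match_key)

    def register(self, match_key, entity_id):
        self._index[match_key] = entity_id


class EntityRepository:
    """Stores the consolidated master record for each unique identifier."""
    def __init__(self):
        self._records = {}

    def upsert(self, entity_id, attributes):
        self._records.setdefault(entity_id, {}).update(attributes)

    def get(self, entity_id):
        return self._records.get(entity_id)


class IdentityResolutionEngine:
    """Creates identifiers for new entities and matches incoming records
    against the master index (exact match on a normalized name, for brevity)."""
    def __init__(self, index, repository):
        self.index = index
        self.repository = repository

    def resolve(self, record):
        match_key = record["name"].strip().lower()
        entity_id = self.index.lookup(match_key)
        if entity_id is None:                       # no match: mint a new identifier
            entity_id = str(uuid.uuid4())
            self.index.register(match_key, entity_id)
        self.repository.upsert(entity_id, record)   # consolidate attributes
        return entity_id


engine = IdentityResolutionEngine(MasterIndex(), EntityRepository())
first = engine.resolve({"name": "Acme Corp", "city": "Boston"})
second = engine.resolve({"name": " acme corp", "city": "Boston"})  # resolves to the same identifier
assert first == second
```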

Dylan Jones
How to grow a data quality culture that takes action

A data quality culture is one of those elusive outcomes that can so often make or break your long-term data quality vision. Whilst many project leaders are capable of fostering collaboration and communication in their immediate team, some leaders find it hard to build on this to grow greater data
