David Loshin provides an alternate take on streaming data in the context of legacy systems.
David Loshin says entity resolution isn't a bandage to fix errors – it should be part of your data strategy.
In the extended enterprise, data integration challenges abound. David Loshin explains.
David Loshin says the hardest part of compliance is knowing if a data asset contains personal data, and ensuring you can protect it.
David Loshin describes some steps you can take to ensure that self-service data preparation improves collaboration.
David Loshin explains how to set up a data catalog that will help you get more value from a data lake.
David Loshin says simple approaches to identity resolution may not scale on a big data platform as data volumes increase.
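A common reason simple identity resolution fails to scale is that naive matching compares every record to every other record, which is quadratic in the number of records. A minimal sketch of the standard mitigation, blocking, which groups candidates by a shared key so comparisons happen only within blocks; the record fields and blocking key here are illustrative assumptions, not from the post:

```python
from itertools import combinations
from collections import defaultdict

records = [
    {"id": 1, "name": "Jon Smith",  "zip": "21045"},
    {"id": 2, "name": "John Smith", "zip": "21045"},
    {"id": 3, "name": "Ann Jones",  "zip": "10001"},
    {"id": 4, "name": "A. Jones",   "zip": "10001"},
]

def blocking_key(rec):
    # Illustrative key: first letter of surname plus ZIP code.
    surname = rec["name"].split()[-1]
    return (surname[0].upper(), rec["zip"])

# Group records into blocks; only records sharing a key are compared.
blocks = defaultdict(list)
for rec in records:
    blocks[blocking_key(rec)].append(rec)

candidate_pairs = [
    (a["id"], b["id"])
    for block in blocks.values()
    for a, b in combinations(block, 2)
]
print(candidate_pairs)  # far fewer pairs than the full n*(n-1)/2
```

With four records, full pairwise matching would generate six comparisons; blocking cuts that to two, and the gap widens rapidly as volumes grow.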
David Loshin explains four struggles of syndicating master data across the enterprise.

David Loshin explains why MDM is such a valuable tool in helping to detect fraud.
David Loshin extends his exploration of ethical issues surrounding automated systems and event stream processing to encompass data quality and risk considerations.
David Loshin describes three sets of policies required for ensuring compliance with data protection directives for health care.
Health care fraud prevention is a sticky topic. David Loshin discusses what's needed to balance prompt claims payments with fraud prevention efforts.
I've been working on a pilot project recently with a client to test out some new NoSQL database frameworks (graph databases in particular). Our goal is to see how a different storage model, representation and presentation can enhance the usability and ease of integration for master data indexes and entity
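One way to see the appeal of a graph model for a master data index is that entity resolution reduces to finding connected components over "same-as" links between source-system records. A stdlib-only sketch of that idea using a simple union-find; the record identifiers and match links are invented for illustration, not taken from the pilot project described above:

```python
from collections import defaultdict

# Source-system record identifiers (illustrative).
records = ["crm:1001", "erp:A17", "web:u42", "crm:1002"]

# "Same entity" links produced by an upstream matching step (assumed).
matches = [("crm:1001", "erp:A17"), ("erp:A17", "web:u42")]

parent = {r: r for r in records}

def find(x):
    # Path-compressing find for the union-find structure.
    while parent[x] != x:
        parent[x] = parent[parent[x]]
        x = parent[x]
    return x

def union(a, b):
    parent[find(a)] = find(b)

for a, b in matches:
    union(a, b)

# Group records by their root: each group is one master entity.
groups = defaultdict(list)
for r in records:
    groups[find(r)].append(r)
print(list(groups.values()))
```

A graph database makes the same structure first-class: records are nodes, match decisions are edges, and a master entity is simply the cluster you can traverse to.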
As the application stack supporting big data has matured, it has demonstrated the feasibility of ingesting, persisting and analyzing potentially massive data sets that originate both within and outside of conventional enterprise boundaries. But what does this mean from a data governance perspective?
In my last post we started looking at the issue of identifier proliferation, in which different business applications assigned their own unique identifiers to data representing the same entities. Even master data management (MDM) applications are not immune to this issue, particularly because of the inherent semantics associated with the
I was surprised to learn recently that despite the reams of laws and policies directing the protection of personally identifiable information (PII) across industries and government agencies, more than 50 million Medicare beneficiaries were issued cards with a Medicare Beneficiary Number that's based on their Social Security Number (SSN). That's
One aspect of high-quality information is consistency. We often think about consistency in terms of consistent values. A large portion of the effort expended on “data quality dimensions” essentially focuses on data value consistency. For example, when we describe accuracy, what we often mean is consistency with a defined source
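The excerpt frames accuracy as consistency with a defined source of reference values. A minimal sketch of that kind of check, validating a column against a reference domain; the state codes and customer rows are hypothetical:

```python
# Reference domain: the "defined source" of valid values (assumed).
VALID_STATE_CODES = {"MD", "VA", "DC", "NY"}

records = [
    {"customer": "Acme",     "state": "MD"},
    {"customer": "Bolt",     "state": "Maryland"},  # inconsistent with the domain
    {"customer": "Cardinal", "state": "VA"},
]

# Flag records whose value is not consistent with the reference source.
inconsistent = [r for r in records if r["state"] not in VALID_STATE_CODES]
print(inconsistent)
```

The point of the framing is that "Maryland" is not wrong in any absolute sense; it is inconsistent with the domain the organization has defined as authoritative.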
We often talk about full customer data visibility and the need for a “golden record” that provides a 360-degree view of the customer to enhance our customer-facing processes. The rationale is that by accumulating all the data about a customer (or, for that matter, any entity of interest) from multiple sources, you
In my prior posts about operational data governance, I've suggested the need to embed data validation as an integral component of any data integration application. In my last post, we looked at an example of using a data quality audit report to ensure fidelity of the data integration processes for
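The pattern the excerpt describes, embedding validation in the integration flow and summarizing the results in an audit report, can be sketched in a few lines. The rules and sample rows below are invented for illustration and not drawn from the posts themselves:

```python
from collections import Counter

# Validation rules applied in-line during integration (illustrative).
rules = {
    "missing_id": lambda row: not row.get("id"),
    "bad_amount": lambda row: row.get("amount", 0) < 0,
}

rows = [
    {"id": "A1", "amount": 250},
    {"id": "",   "amount": 100},
    {"id": "A3", "amount": -40},
]

# Tally rule failures as records flow through the load.
audit = Counter()
for row in rows:
    for name, failed in rules.items():
        if failed(row):
            audit[name] += 1

# The audit report summarizes failures for this integration run.
print(dict(audit))
```

Because the tally is produced by the same process that moves the data, the audit report directly attests to the fidelity of that run rather than being reconstructed after the fact.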