The “tarnished record” – Alternatives to gold for fraud analytics


We often talk about full customer data visibility and the need for a “golden record” that provides a 360-degree view of the customer to enhance our customer-facing processes. The rationale is that by accumulating all the data about a customer (or, for that matter, any entity of interest) from multiple sources, you can apply survivorship rules that pick the best (perhaps “cleanest”) values from among the available data attributes and construct a golden customer record. The quality of the golden record is expected to exceed that of any individual source record.
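To make the idea concrete, here is a minimal sketch of golden record construction. Everything in it is invented for illustration – the source systems, the customer values, and a single "most recent non-null value wins" rule standing in for a real survivorship policy:

```python
from datetime import date

# Hypothetical source records for one customer; every name, number and
# source system is invented for illustration.
records = [
    {"source": "crm",     "updated": date(2021, 3, 1),
     "name": "Jon Q. Smith", "phone": None,       "email": "jqs@example.com"},
    {"source": "billing", "updated": date(2022, 7, 9),
     "name": "John Smith",   "phone": "555-0142", "email": None},
    {"source": "web",     "updated": date(2020, 1, 15),
     "name": "J. Smith",     "phone": "555-0100", "email": "jsmith@example.com"},
]

def survive(attribute, records):
    """One simple survivorship rule: keep the non-null value from the
    most recently updated source record."""
    candidates = [r for r in records if r[attribute] is not None]
    if not candidates:
        return None
    return max(candidates, key=lambda r: r["updated"])[attribute]

golden = {attr: survive(attr, records) for attr in ("name", "phone", "email")}
print(golden)
# {'name': 'John Smith', 'phone': '555-0142', 'email': 'jqs@example.com'}
```

Note what happens: the variant names "Jon Q. Smith" and "J. Smith" are gone. For customer service that is exactly the point; for fraud analytics, as we'll see, it's the problem.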

Under the right circumstances and with appropriate controls, the processes associated with identity resolution, record linkage and the application of data value survivorship rules can arguably improve the information available for both operational and analytical purposes. For example, having current, accurate customer telephone information supports an operational aspect of customer support (such as automatically recognizing that a customer is calling from a telephone number associated with an account). It also supplements the analytical profiles and patterns linked to location analytics used for real-time retention efforts.

But when it comes to some analytics applications, there are drawbacks to creating a single golden entity record. Individuals committing fraud are likely to deliberately obfuscate their information as a way of hiding suspicious activities. So instead of relying on a single golden record, a fraud investigator will want to see all of the data associated with a person of interest in its original format. The patterns of variation in name, location and contact information that can be linked to a uniquely identifiable individual support the investigation process. They also allow the analyst to track transactions and behaviors using alias data that might be “washed away” in the data cleansing process.

I'm not saying that data cleansing and master data management should not be used. Actually, I'm suggesting the opposite, with a twist. That is, use identity resolution and master data management to link records about a uniquely identifiable entity together – but don't create a single golden record. Instead, configure your information management system to materialize what I call a “tarnished record.” The tarnished record is a presentation of all the entity’s data in its original, raw format. Having this type of record allows the analyst to consider and configure that data in a way that will further the investigation process.
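As a rough sketch (again with invented data, and assuming an upstream identity resolution step has already stamped each raw record with a resolved entity ID), the materialization itself is little more than a grouping that refuses to merge:

```python
# Hypothetical output of an identity-resolution step: each raw record
# carries a shared entity_id (the linkage), but nothing has been merged
# or cleansed. All values are invented for illustration.
linked_records = [
    {"entity_id": "E-1001", "source": "claims", "name": "Robert Allen", "address": "12 Oak St"},
    {"entity_id": "E-1001", "source": "vendor", "name": "Bob Allen",    "address": "12 Oak Street"},
    {"entity_id": "E-1001", "source": "web",    "name": "R. A. Allen",  "address": "PO Box 77"},
]

def tarnished_record(entity_id, records):
    """Materialize the tarnished record: every raw representation linked
    to the entity, preserved exactly as it arrived from its source."""
    return [r for r in records if r["entity_id"] == entity_id]

for variant in tarnished_record("E-1001", linked_records):
    print(variant["source"], "|", variant["name"], "|", variant["address"])
```

The linkage does the analytical heavy lifting; the refusal to pick a "winner" is what preserves the aliases and address variants an investigator needs.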

As an example, consider a healthcare kickback fraud in which a medical device provider funds the development of urgent care centers in return for a monopoly on supplying those centers with its products. To mask the activity, the device provider uses alternate representations of the company name in associated documentation. The fraud analyst will want to determine that the same individuals are involved in each fraudulent scenario and be able to demonstrate that the individuals are deliberately using name variations to avoid detection. A golden record doesn't provide that evidence trail. The tarnished record will.
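A toy sketch of how that evidence trail might surface: group the raw company-name variants by a crude matching key, so the key establishes they refer to one entity while the untouched variants remain visible as evidence. The names, abbreviation map and suffix list here are all invented, standing in for a real matching engine:

```python
import re
from collections import defaultdict

# Hypothetical company-name variants lifted from the associated
# documentation (invented for illustration).
raw_names = [
    "Acme Medical Devices, Inc.",
    "ACME Med. Devices",
    "Acme Medical Device Co",
]

ABBREVIATIONS = {"med": "medical", "device": "devices"}
CORPORATE_NOISE = {"inc", "co", "corp", "llc", "company"}

def match_key(name):
    """Build a crude matching key: lowercase tokens, expand known
    abbreviations, drop corporate suffixes."""
    tokens = re.findall(r"[a-z0-9]+", name.lower())
    tokens = [ABBREVIATIONS.get(t, t) for t in tokens if t not in CORPORATE_NOISE]
    return " ".join(tokens)

# Group raw variants by matching key: the key links them, while the
# unaltered variants themselves are the evidence trail.
evidence = defaultdict(list)
for name in raw_names:
    evidence[match_key(name)].append(name)

for key, variants in evidence.items():
    print(key, "->", variants)
# acme medical devices -> ['Acme Medical Devices, Inc.', 'ACME Med. Devices', 'Acme Medical Device Co']
```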

In other words, a combination of identity resolution and record linkage can improve the usability of data for analytical purposes, even without creating a single golden record. As more supplemental data is linked to the original varied representations, you may be able to construct a more detailed view of a uniquely identifiable person of interest. The result will be even better support for the concept of the 360-degree view.


Read 8 ways an enterprise data strategy enables big data analytics
