Operationalizing compliance with data protection policies


In my last post I discussed the recent creation of a health care industry public-private initiative, the Healthcare Fraud Prevention Partnership (HFPP), designed to help health care and law enforcement agencies identify and prevent health care fraud. I inferred from the description that one of its intents is to create a massive data repository collecting information about all players in the health care claims and payment process – individual providers, provider organizations, health care companies, beneficiaries of government health care programs, and members of participating private payer companies. Building this kind of data repository depends on numerous mutual assurances of data protection. Although the partnership's participants are united when it comes to preventing fraud, many of them operate in a rapidly changing and competitive market. Clearly, one health insurance company is not going to share its member data if there is a risk that a local market competitor could harvest that member data and start recruiting new members!

An interesting combination of data integration and data protection will be integral to making this partnership work, and it falls to a “trusted third party” (which I will refer to as the TTP) to act as the acquirer, broker and guarantor of the data.

Three sets of policies are needed to ensure compliance with the specified data protection directives while enabling data integration:

  • Data security. Aside from the typical perimeter security that is clearly required to prevent system breaches, there must be additional data security policies and procedures in place, such as role-based access, data encryption at rest and in motion, and data masking.
  • Predictable and trustworthy identity resolution services. Data would be delivered to the TTP, yet no partner’s data may be exposed to any other partner. That means the TTP must be able to ingest data from numerous sources, standardize the data representations, submit the records to an identity resolution engine, link matching records together, and then provide a means of matching query requests against this master index without exposing any proprietary information. A services approach would allow queries and analytical algorithms that produce predictive or prescriptive models to run behind a data protection firewall.
  • Effective data de-identification (and re-identification). This means the ability to transform protected data into a de-identified format. However, fraud detection and prevention applications rely on the identities of the individuals involved. So although names must be masked within the system, those same values must remain searchable and matchable (see “Predictable and trustworthy identity resolution services” above) to support the fraud detection patterns.
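The combination of the last two requirements – de-identification that still supports matching – is often implemented with keyed, deterministic tokenization: each name is standardized and then replaced by a keyed hash, so identical identities from different partners link to the same token without the underlying name ever being exposed. The sketch below illustrates this idea under stated assumptions; the key, function names and normalization rules are hypothetical, not part of any HFPP or SAS specification:

```python
import hmac
import hashlib
import unicodedata

# Hypothetical secret held only by the TTP; in practice this would live
# in a key-management system, never in source code.
TTP_SECRET_KEY = b"example-key-do-not-use-in-production"

def normalize(name: str) -> str:
    """Standardize a name representation before matching:
    strip accents, collapse whitespace, uppercase."""
    decomposed = unicodedata.normalize("NFKD", name)
    ascii_only = decomposed.encode("ascii", "ignore").decode("ascii")
    return " ".join(ascii_only.upper().split())

def pseudonymize(name: str) -> str:
    """Replace a name with a keyed, deterministic token (HMAC-SHA256).
    The same normalized name always yields the same token, so records
    remain matchable without exposing the underlying identity."""
    canonical = normalize(name)
    return hmac.new(TTP_SECRET_KEY, canonical.encode("utf-8"),
                    hashlib.sha256).hexdigest()

# Two payers submit the same individual under different formats;
# after normalization both map to one token, so the records link.
token_a = pseudonymize("  José  García ")
token_b = pseudonymize("JOSE GARCIA")
print(token_a == token_b)  # True: matchable without revealing the name
```

Because only the TTP holds the key, partners cannot reverse each other's tokens, yet the TTP can still re-identify an individual when a fraud investigation legitimately requires it.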

Without approved policies and procedures around data protection, I cannot imagine that a TTP can build any effective data repository and fraud prevention system. It is clear that governance and compliance are the actuating factors in making this kind of environment a reality.


About Author

David Loshin

President, Knowledge Integrity, Inc.

David Loshin, president of Knowledge Integrity, Inc., is a recognized thought leader and expert consultant in the areas of data quality, master data management and business intelligence. David writes prolifically about data management best practices via the expert channel at b-eye-network.com and in numerous books, white papers and web seminars. His book, Business Intelligence: The Savvy Manager’s Guide (June 2003), has been hailed as a resource allowing readers to “gain an understanding of business intelligence, business management disciplines, data warehousing and how all of the pieces work together.” His book, Master Data Management, has been endorsed by data management industry leaders, and his MDM insights can be reviewed at mdmbook.com. David is also the author of The Practitioner’s Guide to Data Quality Improvement. He can be reached at loshin@knowledge-integrity.com.
