Vetting Eligibility: From Health Care Recipients to Migrant Refugees

Immigration is just one area where analytics can help expedite eligibility decisions. Image by Flickr user Jeff Warren

A hot-button issue this election season was the need to determine people's eligibility for government programs such as the Affordable Care Act (ACA), Medicare and Medicaid, or for entry into the United States as a migrant refugee.

“Look, we’re facing the worst refugee crisis since the end of World War II, and I think the United States has to do more, and I would like to see us move from what is a good start with 10,000 to 65,000 and begin immediately to put into place the mechanisms for vetting the people that we would take in,” Hillary Clinton said.

The idea of helping one's neighbor is terrific; however, what data will be used to operationalize this vetting process for people who don't exist in US-owned data sources? How will we make determinations like "Welcome, Sally, you have been approved for entry" or "Sorry, Bob, but your papers are not in order"?

To find our answer to this difficult social dilemma, let’s take a look at our ability to vet eligibility in government-sponsored Health Care programs (ACA, Medicare and Medicaid). Government entities have extensive data on the citizens served by these programs, and 40 years of experience vetting applicants through, presumably, refined processes.

Or maybe not. On July 15, 2015, ABC News reported that a year-long undercover investigation by the Government Accountability Office (GAO) of the ACA exchanges revealed:

“[T]he agency created 12 fake identities to attempt to obtain healthcare premium subsidies through the website. Over the internet and on the telephone, the federal exchange approved 11 of the 12 fraudulent applications, according to the report.”

The GAO findings also noted "the government doled out $2,500 per month or $30,000 per year in credits for insurance policies for these fabricated people, [after] undercover investigators supplied fabricated documents, including proof of income and citizenship, even fake Social Security numbers which were accepted with no questions asked."

Similar to this story, many reports indicate an inability by federal and state governments to adequately mitigate fraud, waste and abuse in government-sponsored programs. As a result, program integrity losses have ballooned to more than $100 billion annually.

With 13.8 million new ACA applicants expected in Q4 of 2016 (up 1.1 million), 32 states adopting Medicaid expansion under the ACA, and potentially stricter immigration policies under a Trump administration, federal and state governments should strongly reconsider their current mechanisms for vetting.

What can be done to improve eligibility vetting?

To be successful with any eligibility issue, federal and state governments will need a data management infrastructure that provides access to data across programs, products and channels, as well as the analytical tools to detect fraud and eligibility issues hidden in this data.

This won’t require a database overhaul or a massive central data warehouse, but rather a data integration layer that can source from databases around the organization, business partner organizations, and external public or purchased data.
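To make the integration-layer idea concrete, here is a minimal sketch in Python. It is not SAS's implementation; the source systems, field names and schemas (`beneficiary_name`, `ssn9`, and so on) are all hypothetical, chosen only to show how thin adapters can map each source's native schema into one shared record shape without building a central warehouse.

```python
# Hypothetical integration layer: one adapter per source system maps
# that system's native columns into a single shared record shape.

def from_medicaid(row):
    # Assumed Medicaid schema: full name and SSN in dedicated columns.
    return {"name": row["beneficiary_name"], "ssn": row["ssn"], "source": "medicaid"}

def from_exchange(row):
    # Assumed ACA exchange schema: split name fields, different SSN column.
    return {"name": f'{row["first"]} {row["last"]}', "ssn": row["ssn9"], "source": "aca_exchange"}

def unified_view(medicaid_rows, exchange_rows):
    """Merge both sources into one stream of normalized records."""
    for row in medicaid_rows:
        yield from_medicaid(row)
    for row in exchange_rows:
        yield from_exchange(row)

# Usage: analytics code sees one schema, regardless of where data lives.
records = list(unified_view(
    [{"beneficiary_name": "Ann Lee", "ssn": "123-45-6789"}],
    [{"first": "Bob", "last": "Ray", "ssn9": "987-65-4321"}],
))
```

In a real deployment the adapters would wrap database queries, partner feeds, or purchased external data, but the principle is the same: normalize at the edge, analyze in one place.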

They will also need data quality functions that support entity resolution, as unscrupulous individuals often provide inaccurate, incomplete or inconsistent information to prevent records from matching across disparate systems. We need to know who's who across systems.
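A toy sketch of what entity resolution involves, using only Python's standard library: normalize the messy fields, then combine a fuzzy name score with an exact match on a stable attribute like date of birth. The names, threshold, and scoring rule here are illustrative assumptions, not a production matching algorithm (real systems use far richer features and trained models).

```python
from difflib import SequenceMatcher

def normalize(name):
    """Lowercase, strip punctuation, collapse whitespace."""
    kept = "".join(c for c in name.lower() if c.isalnum() or c.isspace())
    return " ".join(kept.split())

def similarity(a, b):
    """Fuzzy string similarity in [0, 1]."""
    return SequenceMatcher(None, normalize(a), normalize(b)).ratio()

def resolve(record, candidates, threshold=0.85):
    """Return candidates likely to be the same person as `record`.

    Requires a fuzzy name match above `threshold` plus an exact
    date-of-birth match; inconsistent spellings still resolve,
    while different people do not.
    """
    matches = []
    for cand in candidates:
        name_score = similarity(record["name"], cand["name"])
        if name_score >= threshold and record.get("dob") == cand.get("dob"):
            matches.append((cand, name_score))
    return matches

# Hypothetical records from two systems with inconsistent name entry.
medicaid_record = {"name": "Jonathon Q. Smith", "dob": "1970-04-02"}
exchange_records = [
    {"name": "Jonathon Smith", "dob": "1970-04-02"},
    {"name": "Jane Doe", "dob": "1982-11-09"},
]
matches = resolve(medicaid_record, exchange_records)
```

Here the deliberately inconsistent "Jonathon Q. Smith" still resolves to "Jonathon Smith", while "Jane Doe" is correctly excluded.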

In the digital age, analytics is needed to keep up with the growing sophistication and complexity of fraudsters and terrorists. The right data management, integration, and quality infrastructure must be supported by a robust business analytics foundation. To successfully analyze vast amounts of granular data, this data infrastructure must be able to:

  • Process large volumes of data quickly.
  • Handle the huge variety of data involved, including tables, documents, e-mail, web streams, videos and more.
  • Manage the velocity of data, which is increasing rapidly.

These data management capabilities are not only critical to everyday government business functions, but also to detecting and preventing nefarious individuals from gaining access to our programs or to the land we love.

The volume, variety, and velocity of data will keep growing, increasing the gap between big data and relevant data. For government organizations without analytical capabilities, this will create an information overload not seen before the digital age. This threatens to increase time to detection and action, and widen the cracks the wicked may exploit.

Eligibility is a big concern for attendees of the National Health Care Anti-Fraud Association Annual Training Conference. If you're there, come visit SAS at booth 213!

About Author

Ricky D. Sluder, CFE

Principal Solutions Consultant

Ricky D. Sluder, Certified Fraud Examiner, is a Principal Solution Architect in the Security Intelligence Practice at SAS. He has ~20 years of investigative experience in white-collar crime and in Medicare and Medicaid fraud, waste and abuse. In 2012, five major cases identified under his investigative and data-analytic operational model resulted in DOJ criminal prosecutions exceeding $846 million in fraud.
