Pennsylvania principal uses student projections to foster academic success for more students

Student projections can help foster growth, regardless of achievement level.

This student projections post kicks off a series highlighting education leaders who share the celebrations and challenges of using data to drive school improvement. These are real teachers, principals and superintendents working to foster academic growth for every student in their schools and districts. As we near the end of the first semester, I hope you find these stories inspiring, informative and helpful as we transition into the next phase of the school year. I’ll start with a focus on using predictive analytics.

A favorite part of my job is meeting education leaders in other states and hearing how they’ve been able to improve student outcomes. A recent trip to Pennsylvania provided insight into how one administrator has created a data culture in his school and used data as a school improvement resource.

Aaron Taylor is the principal at Waynesboro Middle School in Waynesboro, Pennsylvania. Mr. Taylor got his start as a learning support teacher in an elementary school and more recently won the Governor’s Award for his work as principal at an elementary school. A few years ago, Mr. Taylor’s superintendent asked him to lead a middle school that was struggling to raise student achievement. He was tasked with growing all students, a challenge he gladly accepted.

Mr. Taylor’s mission upon arriving at Waynesboro Middle School was to build a culture of data literacy and ensure data was used as a flashlight to guide decisions. An important piece of this plan was to use student projections to maximize academic opportunities for every student.

Student projection reports provide insight into a student’s potential in various courses. These reports project to academic milestones in select courses, including state assessments and college readiness indicators such as Advanced Placement, ACT and SAT. Put simply, projection reports show the likelihood that a student will reach proficiency on a specific assessment. This information can be used to help with course placement, interventions and enrichment for students.

At the beginning of each school year, Mr. Taylor uses projections to optimize scheduling. “We want to ensure that every student in our school is placed in the best course to promote academic growth,” noted Mr. Taylor. Predictive data provides an objective way for administrators to consider course placement that can potentially open college and career doors for students. This valuable information allows Mr. Taylor and other leaders at the school to place all students in the most appropriate, academically challenging course.

The use of projections did not stop with scheduling. By using projections throughout the school year, Mr. Taylor noticed that the highest-achieving students were not showing growth. He wanted to be sure every student, whether high or low achieving, was given an opportunity in the school day to push themselves academically.

Mr. Taylor created a WIN (What I Need) period in the school day. Without adding time to the school day, school leaders adjusted the schedule to accommodate an additional class period. During this period, students receive additional instruction in core subjects such as English Language Arts, Math and Science. Using projection data, WIN teachers can provide more individualized instruction. For high-achieving students, WIN is an opportunity for enrichment. For low-achieving students, WIN provides remediation and intervention to support progress in the course. Projection data allows WIN to foster progress for every student.

When asked if this had made a difference, Mr. Taylor replied, “No doubt. We have hit a home run by helping all of our students make progress.” When growth data was shared with the school following the implementation of WIN, high-achieving students showed growth for the first time in a few years.

Maya Angelou once said, “When we know better, we do better.” Projection data provides valuable information about our students and gives educators an opportunity to enrich, intervene and remediate as needed. Predictive data helps school leaders ensure that every student’s progress matters. How will you use this data?

Entity resolution reduces unknown unknowns in child well-being efforts

When protecting children, “entity resolution” can reduce the things agencies don’t know they don’t know. It’s a technological capability I wish I’d had when I led a child protective services agency.

I have been involved in hundreds of determinations that would alter a child’s life trajectory. I was comfortable with that heavy responsibility as I believed I was making the “most informed” decision possible, with all of the information we had available to us.  Some of you may feel the same way I did.

We now have the ability to improve our “most informed” decision, and it starts with data quality. Many child welfare leaders acknowledge that data quality is an ongoing issue and make efforts to de-duplicate and merge files at a surface level.  This is a start, but we can do much better.

Learning from recent analytic work by SAS in child welfare systems, we know that the #1 predictor of future maltreatment, including fatality, is a child’s generational abuse history.  We also know that case management systems have a robust amount of case history, but data quality limits the ability to surface all historical information needed to help workers make the most informed child safety decision possible. Currently, “they don’t know what they don’t know”.

Enter “entity resolution”, which not only de-duplicates and merges (an inch-deep, mile-wide approach), but also uses link analysis to gain a clearer picture of relationships between children and adults at different levels. Data scientists from the SAS Advanced Analytics Lab for State and Local Government have identified that the network of these relationships plays a critical role in identifying high-risk children.

Entity resolution will enable decision makers to connect all of the dots and save lives by doing the following two things (a rough sketch of both steps follows the list below):

Create unique person IDs, or Key Identifications (KIDs), using all possible data available. There is a high likelihood that an initial inspection of the data would reveal individuals in a case management system assigned to multiple person IDs. We want one ID per person.

Allow for a better understanding of the relationships between a child and others in the child’s event history. This provides information on potential risk factors by looking at three layers of relationships:

  • Previous reports and intakes for a selected child
  • Previous reports and intakes for those in the same cases as the selected child
  • Previous reports and intakes for those linked to the cases of the selected child but not including the selected child.
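To make those two steps concrete, here is a minimal sketch in Python built on a toy case-management extract. The field names, the normalization rule and the matching key (name plus date of birth) are illustrative assumptions, not a description of any real system or of the SAS approach:

```python
from collections import defaultdict

# Illustrative case-management extract; field names are hypothetical.
people = [
    {"record_id": 1, "name": "Jane Q. Doe", "dob": "2001-03-04"},
    {"record_id": 2, "name": "jane doe",    "dob": "2001-03-04"},  # same person, different spelling
    {"record_id": 3, "name": "John Smith",  "dob": "1998-07-19"},
]
reports = [  # (report_id, record_ids involved)
    ("R-100", [1, 3]),
    ("R-200", [2]),
]

def normalize(name):
    """Crude normalization: lowercase, strip punctuation, drop initials."""
    parts = [p.strip(".,") for p in name.lower().split()]
    return " ".join(p for p in parts if len(p) > 1)

# Step 1: assign one Key Identification (KID) per resolved person.
kid_of = {}      # record_id -> KID
kid_lookup = {}  # (normalized name, dob) -> KID
for p in people:
    key = (normalize(p["name"]), p["dob"])
    kid = kid_lookup.setdefault(key, f"KID-{len(kid_lookup) + 1}")
    kid_of[p["record_id"]] = kid

# Step 2: layered report history for a selected child.
reports_of = defaultdict(set)  # KID -> report_ids
cohort_of = defaultdict(set)   # report_id -> KIDs named on that report
for report_id, record_ids in reports:
    for rid in record_ids:
        kid = kid_of[rid]
        reports_of[kid].add(report_id)
        cohort_of[report_id].add(kid)

def layered_history(selected_kid):
    layer1 = set(reports_of[selected_kid])                          # child's own reports
    co_kids = {k for r in layer1 for k in cohort_of[r]} - {selected_kid}
    layer2 = {r for k in co_kids for r in reports_of[k]} - layer1   # reports of those in the same cases
    far_kids = {k for r in layer2 for k in cohort_of[r]} - co_kids - {selected_kid}
    layer3 = {r for k in far_kids for r in reports_of[k]} - layer1 - layer2
    return layer1, layer2, layer3

print(kid_of)                      # records 1 and 2 resolve to the same KID
print(layered_history(kid_of[1]))  # report history at each relationship layer
```

Real entity resolution uses far richer matching (addresses, aliases, fuzzy comparison, household links), but the shape of the output is the same: one identity per person, and a report history that extends beyond the child’s own case.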

Child tragedy reveals widespread family abuse

Below is a real-life case study demonstrating the impact and added value of entity resolution, and the critical information it provides the worker and/or supervisor. The first diagram shows the available case history (without entity resolution) of a case resulting in the fatality of a 4-month-old twin (P-M-T4), where the mother (P-M) and aunt (P-A) were both named as perpetrators. The death was alleged to be due to neglect and inadequate supervision, and the cause of death was reported as Sudden Unexpected Infant Death.

A view of a child fatality, without entity resolution.

Following entity resolution across social networks, the following history became available demonstrating historic generational abuse going back to the child’s grandparents over a 10-year history:

A more complete view, with entity resolution, reveals much more about the family's history.

What’s important, and distressing, to understand is that each of the solid arrows represents at least one allegation of abuse by that person against the individual at the end of the arrow. For instance, at the time of the twin’s fatality, there were additional open cases that included the grandfather (GF), five siblings of the mother (D1, D2, S1-S3), a child of one other aunt (D1-D), and three children of the perpetrator aunt (P-A-D1, P-A-D2, P-A-S1), who was married to one of the uncles (S3). All of the children in the investigation were listed as alleged maltreatment victims in the fatality report.

The deceased child did not have an earlier maltreatment allegation report.  However, there were two earlier reports in the spring of year 10 in which the mother (P-M) was listed as a perpetrator and her older three children (P-M-T1, P-M-T2, P-M-D1) as victims. In addition, the grandmother (GM) and the aunt (P-A) were listed as perpetrators and children of the aunt (P-A-D1, P-A-D2, P-A-S1) as victims in these reports. The child of the other aunt (D1-D) was also a victim. Meanwhile, the perpetrator aunt (P-A) was a victim of maltreatment perpetrated by her parents who are not found in the child welfare system (“?”).

Tracking these families back to years 1 to 5, there were several counts of alleged maltreatment involving multiple members of these families. The grandmother (GM) and grandfather (GF) were listed as perpetrators several times for maltreating their six children, including the perpetrator mother (P-M). Therefore, the mother (P-M) appeared as a victim earlier in her life. Similarly, the perpetrator aunt (P-A) was abused multiple times, together with her sibling (P-A-S), by her family between years 1 and 5. Therefore, both the aunt (P-A) and her husband (S3) were abused in the past, and they eventually became perpetrators against their own children.

After counting all roles, at the time of the fatality event the total number of report histories connected to the deceased child was 127. There were also two perpetrators who had themselves been victims (the mother and the aunt) and many counts of intergenerational maltreatment. As explained earlier, SAS research has found that intergenerational abuse is the most heavily weighted risk factor. Report histories and the perpetrator-as-victim count were the two strongest predictors in the early-fatality model, which would have been applicable to the deceased child.

Entity resolution gives clearer view of risk

Without entity resolution across the levels of relationships, the above information currently cannot be provided to workers and supervisors making life and death decisions.  This type of solution has a massive impact on risk scoring and should be seen as foundational for any child welfare organization that is exploring or currently operationalizing predictive analytic risk scoring.

Let me demonstrate why this is so critical using the case study above. Without entity resolution, the case above would not have been identified as having an intergenerational abuse history and would have scored in the 82.7 risk percentile. With entity resolution, the case identifies intergenerational abuse and the risk score increases dramatically, allowing workers to understand the true scope of the child’s history, as seen below:

Entity resolution reveals a clearer picture, and a dramatic increase in risk.

Like I said, workers and supervisors may not know what they don’t know. But that doesn’t have to be the case. With entity resolution, child welfare agencies can ensure they have all relevant history on individuals in their case management system so they can make the most informed “most informed” decision they’ve ever made.

Data integration can alleviate citizen frustrations with government

Has this been you? Have you been on the receiving end? Data integration can help government agencies avoid these calls. Image by Flickr user Ian Muir

Data integration helps a successful business make things simple and quick for customers, and keeps them coming back. Even though a company may have data silos, data held within one area is made available to others in order to help the customer. In most local, county and state governments, that is not the case. Mandates and history have allowed various agencies to become independent and operate under their own set of rules, yet each is serving the exact same customer.

Have you ever asked questions like:

  1. Why does government operate so slowly?
  2. Why, or how, are they so far off in their financial estimates?
  3. How could that person have gotten away with taking bribes?
  4. Why do I have to fill out the same documents at four different agencies to get a single permit?

When I first went to work in local government, I just knew I could help make things better. That was not the case.  I was shocked at the often-contradictory local, state and federal mandates that severely decreased efficiencies.  Citizens are demanding government be more transparent and accountable, and in many ways, government is responding. However, without data integration to knock down information silos, departments and agencies will continue to operate independently, with their own rules and direction.

For example, let’s take a developer who is pursuing local and state permits for a manufacturing facility. Once the permits are in place, the developer will hire local construction workers and build the facility; then the manufacturer moves in and hires local workers. To get this process started, however, the developer will be forced to interact with several local, county and state agencies for various reasons via multiple phone calls, text messages, emails, onsite visits, etc. This is just to obtain the necessary permits to start construction. This arduous process delays economic prosperity for both the community and its citizens. An unconstructed facility generates no property tax and denies citizens employment opportunities, which in turn stresses the community’s social infrastructure.

Much of what a developer submits is duplicative, but each agency -- because of mandates and history -- requires separate submission of data into their own system, be it manual or electronic. This redundant requirement is time-consuming for the developer, adds layers of bureaucracy and increases opportunities for delays and mistakes. This extends the development timeline, drives up project costs, increases the risk of corruption, clouds transparency and ultimately delays job creation, which is the heart of a community’s existence.

Today’s technology allows databases to be interconnected, yet independent, all while limiting access to only the needed information (a rough sketch of this idea follows the list below). In this example, interconnected databases would:

  1. Bring a developer’s project online more quickly by providing a single point of submission, and allow each agency to see where the project is in the process, what delays are occurring and why, concerns raised by other agencies, and mitigation efforts to address those concerns.
  2. Increase transparency, as all agency notes are visible, reducing the opportunity for “back room” deals.
  3. Ensure alignment with all rules and regulations.
  4. Foster job creation for citizens much more quickly.
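As a rough illustration of the “interconnected, yet independent” idea, here is a minimal sketch of a single point of submission with per-agency, field-level access. The agencies, fields and access rules are hypothetical, chosen only to show the pattern:

```python
from dataclasses import dataclass, field

# Hypothetical shared permit record: one submission, many agency views.
@dataclass
class PermitApplication:
    project_id: str
    fields: dict                                        # everything the developer submits, once
    agency_notes: dict = field(default_factory=dict)    # status and concerns, visible to all agencies

# Each agency sees only the fields it is entitled to; the rest stays siloed.
AGENCY_ACCESS = {
    "zoning":       {"parcel_id", "site_plan", "owner"},
    "environment":  {"parcel_id", "stormwater_plan"},
    "fire_marshal": {"site_plan", "occupancy"},
}

def agency_view(app: PermitApplication, agency: str) -> dict:
    allowed = AGENCY_ACCESS.get(agency, set())
    return {k: v for k, v in app.fields.items() if k in allowed}

app = PermitApplication(
    project_id="P-2016-001",
    fields={
        "parcel_id": "081-223-45",
        "owner": "Acme Development LLC",
        "site_plan": "siteplan_v3.pdf",
        "stormwater_plan": "swp_v1.pdf",
        "occupancy": "F-1 industrial",
    },
)

# The developer submits once; each agency pulls only what it needs,
# and every agency can see the shared notes on status and concerns.
app.agency_notes["environment"] = "Stormwater plan under review; response due 11/30."
print(agency_view(app, "zoning"))
print(app.agency_notes)
```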

The result?  A more vibrant economy and increased tax collections, all without increasing individual taxes.

There are many more examples of how interconnecting databases can create efficiencies:

  1. Tax fraud detection increases collections by identifying tax evaders.
  2. Welfare fraud detection eliminates payments to those abusing the system, freeing up dollars for those in need.
  3. Real-time data improves day-to-day government decision-making, so no more using historical data to make decisions for today.
  4. Workforce and economic development helps increase citizens’ job skills and recruit/broaden job opportunities while diversifying the economy, which in turn drives long-term economic sustainability.

  5. Better criminal justice collaboration protects the public by keeping criminals off the street and helps law enforcement make better decisions.

Simply put, data integration increases transparency while providing savings through efficiencies that can be passed on to citizens through better roads, schools and economic development efforts. All of this leads to maintaining or lowering our annual tax bills and ensuring our communities prosper for years to come.

So, when citizens decry stalling by local officials and start movements to institute another mandate to force openness and efficiency, maybe a second look is warranted. What may be needed is not another mandate but for government agencies to work closer together by integrating their data.

Accountability and teacher preparation: How states have led the way

As teacher preparation programs send educators out into the workforce, how can we make sure they're effective in the classroom? Image by Flickr user Visha Angelova

Teacher preparation programs have received some pretty harsh criticism in recent years. For example…

“If there was any piece of legislation that I could pass it would be to blow up colleges of education.” –Reid Lyon, National Institutes of Health

“By almost any standard, many if not most of the nation’s 1,450 schools, colleges, and departments of education are doing a mediocre job of preparing teachers for the realities of the 21st-century classroom. America’s university-based teacher preparation programs need revolutionary change, not evolutionary thinking.” –Arne Duncan, Secretary of Education

In response to such criticisms, the US Department of Education (ED) undertook a lengthy and arduous revision of regulations in the Higher Education Act pertaining to the accountability structures in teacher preparation.

Released in 2016, the final regulations require that states report annually on a variety of indicators for all teacher preparation programs (TPPs), including traditional programs in colleges and universities, alternative programs such as Teach for America, and transitional programs aimed at filling vacancies within schools. The final document contains almost 700 pages of regulations, which the ED press office represented with the broad set of indicators listed below:

  • Placement and retention rates of graduates in their first three years of teaching, including placement and retention in high-need schools;
  • Feedback from graduates and their employers on the effectiveness of program preparation;
  • Student learning outcomes measured by novice teachers' student growth, teacher evaluation results, and/or another state-determined measure that is relevant to students' outcomes, including academic performance, and meaningfully differentiates amongst teachers; and
  • Other program characteristics, including assurances that the program has specialized accreditation or graduates candidates with content and pedagogical knowledge, and quality clinical preparation, who have met rigorous exit requirements.

Previous state accountability structures for teacher preparation relied mainly on input measures such as enrollment, graduation rates, and course curriculum. These new regulations will require a seismic shift for most states, moving from input-based measures (primarily captured by the teacher prep program) to output-based measures (based on performance in the field). Let’s look at the approaches of Tennessee and Louisiana, which have a long history of providing this type of analysis on their TPPs.

Tennessee Teacher Preparation Report Card

Since 2007, Tennessee has produced a Report Card on the Effectiveness of Teacher Training Programs as a joint effort among the Tennessee Department of Education, Tennessee State Board of Education (SBE), and the Tennessee Higher Education Commission.  State law required that these agencies report on the effectiveness of TPPs as defined by three measures:

  • Placement and retention rates of program completers;
  • Pass rates on Praxis teacher certification exams; and
  • Tennessee Value-Added Assessment System (TVAAS) scores of program completers.

In addition to these measures, the state has added other indicators such as demographic information and longitudinal analysis throughout various revisions of the report.

Tennessee was one of the first states to link the performance of graduates in the classroom, as measured by TVAAS, with the program where they were trained. TVAAS is the state-wide student growth measure. The state differentiated performance by program completers of traditional and alternative programs as well as by subject area and grade span. This allowed programs to evaluate their impact on student learning once their graduates were in the field. The state worked with programs to refine the analysis each year to provide more detailed and useful information back to the programs for continuous improvement.

Always looking to improve the depth of information, the state is redesigning reports to include performance levels and more detailed information about each program.

Louisiana Teacher Preparation Data Dashboard and Fact Book

Louisiana is another state with a longstanding tradition of providing publicly available information on the effectiveness of TPPs.  The Louisiana Board of Regents produced the Louisiana Teacher Preparation Fact Book for almost a decade.  Now known as the Louisiana Teacher Preparation Data Dashboard, this report provides program information on enrollment, demographics of program enrollees and completers, placement and retention rates of program graduates, and student impact of completers as measured by value-added and other evaluation metrics.

Both Louisiana and Tennessee have the benefit of longstanding, longitudinal data systems linking teachers and students, as well as connecting data from K-12 to teacher preparation programs. Many states do not have such data systems or have systems still in development.

States must also grapple with the measures used to evaluate program effectiveness.  Both Louisiana and Tennessee have robust value-added measures that provide meaningful differentiation among completers and programs. The complexity of these measures provides a level of confidence that more simplistic measures do not.

For states beginning this work, the implementation of the finalized ED regulations may require the development of a new data system, the collection of new elements, and work across agencies and organizations that may have never worked together before.  Over the next few blogs, we will delve into the area of teacher preparation accountability systems to learn more about what leading states have done in this area, pose questions to consider for states as they build these systems, and take a look at some research on what makes teacher preparation effective.

Procurement fraud: The economic crime no one seems to talk about

Procurement fraud ain't first, but it sure ain't last. Shake and bake! Image by Flickr user Zach Catanzareti Photo

Anyone know what the number two form of economic crime is, in terms of losses? Believe it or not, it’s procurement fraud.

I grew up in a small town south of “Big D” and in my neck of the woods, having two first names is, well, normal. So, when Will Ferrell’s character, Ricky Bobby (my nickname-sake), uttered those fateful words in Talladega Nights, “If you ain’t first, you’re last,” I just thought it was funny. But when I applied those words of wisdom to what I do professionally (Fraud Fighter), I realized something.

Asset misappropriation, or theft, is the number one form of economic crime on the planet. We hear about it every day in some form of media, but how often do we hear about number two, procurement fraud?

According to PwC’s 2014 Global Economic Crime Survey, 29% of all organizations are impacted by this economic crime every year. That’s nearly a third of every government agency and private or publicly traded business on earth. Yep, folks, that’s a heck of a lot!

Calculating the global losses to procurement fraud has proven very difficult, with little in the way of published facts and figures for this highly unreported crime. However, if we look at just the US federal government, the Department of Justice reported, “Settlements and judgments in cases alleging false claims for payment under government contracts totaled $1.1 billion in fiscal year 2015.”

I know what you are thinking… only $1.1 billion in 2015… doesn’t the federal government admit to program integrity losses greater than $100 billion each year? Yep, but the $1.1 billion figure is for settlements and judgments, meaning DOJ civil or criminal charges were pursued. If my Criminal Justice degree serves me correctly, when 100 people commit a crime, one of them will be prosecuted. So, doing that math, it’s feasible that procurement fraud could be as costly as our federal program integrity losses.

What can be done to address procurement fraud?

My colleagues Jen Dunham and Jon Lemon outline a really good way to tackle this problem in the SAS Insights article, How to detect and prevent procurement fraud - The case for a hybrid analytical approach.

Business rules are a good place to start. If bidders show up on a debarred list, don’t give them a contract. If too many invoices come in on the same day, check them out. Simple enough. However, business rules typically only catch simple schemes and data entry errors.
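Here is a minimal sketch of those two rules, assuming simple bidder and invoice extracts. The field names, vendors and the same-day threshold are illustrative only:

```python
from collections import Counter

# Hypothetical extracts; field names and thresholds are illustrative only.
debarred = {"ACME SUPPLY CO", "GLOBAL PARTS LLC"}

bidders = [{"bid_id": "B-1", "vendor": "ACME SUPPLY CO"},
           {"bid_id": "B-2", "vendor": "NORTHSIDE BUILDERS"}]

invoices = [{"invoice_id": "I-10", "vendor": "NORTHSIDE BUILDERS", "date": "2016-11-14"},
            {"invoice_id": "I-11", "vendor": "NORTHSIDE BUILDERS", "date": "2016-11-14"},
            {"invoice_id": "I-12", "vendor": "NORTHSIDE BUILDERS", "date": "2016-11-14"},
            {"invoice_id": "I-13", "vendor": "RIVER CITY PAVING",  "date": "2016-11-14"}]

SAME_DAY_LIMIT = 2  # flag vendors with more invoices than this on one day

# Rule 1: bidders on the debarred list never get a contract.
flagged_bids = [b for b in bidders if b["vendor"].upper() in debarred]

# Rule 2: too many invoices from one vendor on the same day get a second look.
counts = Counter((inv["vendor"], inv["date"]) for inv in invoices)
flagged_invoices = [key for key, n in counts.items() if n > SAME_DAY_LIMIT]

print(flagged_bids)      # the ACME bid
print(flagged_invoices)  # NORTHSIDE BUILDERS on 2016-11-14
```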

Anomaly detection looks for behaviors that are unusual or unexpected (a small sketch of one such approach follows the list below).

  • Historical anomaly detection looks at changes in behavior over time. If the system sees a sudden, drastic shift from historical patterns – with nothing to explain it – this would be flagged and factored into the overall fraud risk score.
  • Peer grouping or clustering compares one’s behavior to the norm for a similar peer group and identifies behaviors that are drastically different from what would be expected for that group or type of procurement.
  • Profiling defines the typical attributes of good guys and bad guys. When it sees a pattern that matches that of known fraudsters, the system recognizes and flags it accordingly.
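As a rough illustration of the peer-grouping idea, here is a small sketch that scores each procurement officer’s monthly award total against the norm for a peer group. The data, the grouping and the z-score threshold are illustrative assumptions, not a description of how any particular product does it:

```python
from statistics import mean, stdev

# Hypothetical monthly award totals, grouped by a simple peer label.
records = [
    {"officer": "A", "peer_group": "facilities", "monthly_awards": 120_000},
    {"officer": "B", "peer_group": "facilities", "monthly_awards": 110_000},
    {"officer": "C", "peer_group": "facilities", "monthly_awards": 125_000},
    {"officer": "D", "peer_group": "facilities", "monthly_awards": 115_000},
    {"officer": "E", "peer_group": "facilities", "monthly_awards": 118_000},
    {"officer": "F", "peer_group": "facilities", "monthly_awards": 122_000},
    {"officer": "G", "peer_group": "facilities", "monthly_awards": 310_000},  # unusual
    {"officer": "H", "peer_group": "IT",         "monthly_awards":  95_000},
    {"officer": "I", "peer_group": "IT",         "monthly_awards":  90_000},
    {"officer": "J", "peer_group": "IT",         "monthly_awards": 101_000},
]

Z_THRESHOLD = 2.0  # how far from the peer norm counts as "drastically different"

# Group totals by peer group, then score each officer against the group norm.
by_group = {}
for r in records:
    by_group.setdefault(r["peer_group"], []).append(r["monthly_awards"])

flags = []
for r in records:
    values = by_group[r["peer_group"]]
    mu, sigma = mean(values), stdev(values)
    z = (r["monthly_awards"] - mu) / sigma if sigma else 0.0
    if abs(z) > Z_THRESHOLD:  # feeds an overall fraud risk score, not a verdict
        flags.append((r["officer"], round(z, 1)))

print(flags)  # with this toy data, only officer G stands out from the facilities peer group
```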

Text mining identifies patterns and anomalies from unstructured data, such as reports and social media. For example, if a procurement officer who makes $65,000 a year posts pictures of extravagant purchases to social media, you might want to check it out.

With advanced analytics, you can build models that identify attributes or patterns that are highly correlated with known fraud, even for complex and emerging schemes. Analytics answers questions that manual or ad-hoc methods miss. Does this look like the typical habit of bid riggers or those known for counterfeit parts? Does this series of invoices, stair-stepping up and down in dollar value, indicate a vendor trying to find the threshold of scrutiny?

Since much procurement fraud involves collusion, associative linking is invaluable. Link analysis finds relationships among entities based on static attributes (such as phone numbers, addresses or bank accounts) or transactional attributes (business relationships, referrals, etc.). A relationship might be innocuous, but even for valid business you want to be able to show you have done due diligence vetting relationships.
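Here is a minimal sketch of associative linking on static attributes, using the open-source networkx library for the graph. The entities, shared phone numbers and bank accounts are made up for illustration:

```python
import networkx as nx  # assumes networkx is available; any graph library would work

# Hypothetical entities with static attributes worth linking on.
entities = [
    {"id": "VENDOR: Apex Supplies",   "phone": "555-0101", "bank_acct": "111-222"},
    {"id": "VENDOR: Summit Services", "phone": "555-0199", "bank_acct": "111-222"},  # same account
    {"id": "EMPLOYEE: J. Harper",     "phone": "555-0101", "bank_acct": "999-888"},  # same phone as a vendor
    {"id": "VENDOR: Delta Logistics", "phone": "555-0777", "bank_acct": "444-333"},
]

# Link any two entities that share a phone number or bank account.
G = nx.Graph()
for e in entities:
    G.add_node(e["id"])
for i, a in enumerate(entities):
    for b in entities[i + 1:]:
        shared = [attr for attr in ("phone", "bank_acct") if a[attr] == b[attr]]
        if shared:
            G.add_edge(a["id"], b["id"], shared_on=shared)

# Connected components surface clusters to vet for possible collusion.
for cluster in nx.connected_components(G):
    if len(cluster) > 1:
        print(sorted(cluster))
# A link alone is not proof of fraud, but it shows the vetting was done.
```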

Independently, each method is good at detecting a certain type of fraud, but when used in combination, you can see so much more.

Procurement fraud may not be first but we can’t treat it like it’s last. Time to shake and bake the problem with analytics. (Just see the movie.)

The value of third-party data in tax fraud detection

In fighting tax fraud, agencies shouldn't pay money for the wrong third-party data. Image by Flickr user TheMOX

There are several ways to buy data, and even more companies who are willing to sell it. By annual subscription or by the drink, third-party data vendors promise they can solve your identity theft and non-compliance problems. It’s as simple as signing a contract, and letting the data tap begin to flow. Right?

For tax agency directors who are considering going down this road – and even those already on the path to third-party data righteousness – hearing the experiences of a data-agnostic tax analytics practitioner like SAS can be helpful. SAS engages third-party data vendors to fulfill project requirements, purchasing only the data needed to achieve the business goal.

Data vendors bring vast amounts of public and private data to bear on a problem, like identity theft. They also help the tax agency further validate the taxpayer by sending questionable filers to a knowledge-based authentication (KBA) quiz. This is fundamentally a data matching exercise which answers a singular but very important question, “Is this person who they say they are?”

In contrast, in the fight against refund fraud, analytics practitioners like SAS pick up where identity authentication leaves off. Fundamentally, analytics enables a “preponderance of evidence” standard in compliance by performing statistical analysis of taxpayer behavior. Analytics detects non-compliance of all types and can be used for refund fraud detection as well as audit selection. Analytics can answer the following questions for all taxpayers – not only those requesting a refund:

  • Does the behavior of this taxpayer seem normal and compliant with the law?
  • Is the behavior consistent with what government and third-party data is saying about the taxpayer?
  • Is this taxpayer behaving in the same way as other, similar taxpayers? Are they behaving like known fraudsters?
  • Is there a significant change in the taxpayer’s behavior over time?
  • Is this taxpayer’s behavior linked to others?  Are they part of a larger crime ring?

To draw an analogy, if identity authentication using third-party data is the big white fence you build around your property to keep out strangers, then analytics is the nanny-cam you put in your child’s room. You’re confident in that strong fence you bought – as you should be. But you’re smart enough to know not to use it as the sole source of security for protecting your child. It should be one layer in a multi-tiered defense.

More and more vendors are offering one-stop shopping for third-party data. However, some third-party data elements fail to provide the expected insight and value, while other internal data sources unexpectedly provide tremendous lift in investigations.

Instead of engaging wholesale and exclusively with one particular data vendor, a tax agency is better served by working with an analytics vendor to first clearly define the business requirements. Only then can an agency identify specific data elements such as credit scores, real property assets, and sex offender lists that add measurable value. A vendor can then work in tandem with the state to determine where that data can be obtained at the lowest cost. Additionally, a vendor can help the state evaluate and inventory existing state contracts with data providers to ensure the state never double pays for data to which they already have access. After that point, it can be determined what additional data elements might need to be purchased that are not already available.

Third-party data providers tend to have certain strengths and weaknesses. One may have stronger business tax data than individual income tax data, or have sparser data from a particular state. There are firms that specialize only in niche areas like incarceration or transfer pricing data. And as a result, their data might have a stronger validity level for those data elements than another firm that offers one stop shopping and “does it all.”

So is third-party data worth all the hype? Absolutely, if used efficiently in areas where value can be measured. However, caveat emptor applies here and tax agency directors should lean heavily on others who have gone down this road before to gain the lessons learned the easy way. And next time you put your children to bed, ask yourself, “Is that big fence keeping them completely secure, or should we be doing more?”

Vetting Eligibility: From Health Care Recipients to Migrant Refugees

Immigration is just one area where analytics can help expedite eligibility decisions. Image by Flickr user Jeff Warren

A hot button issue this election season was the need to determine the eligibility of people for various government programs like the Affordable Care Act (ACA), Medicare and Medicaid, or entry into the United States as a migrant refugee.

“Look, we’re facing the worst refugee crisis since the end of World War II, and I think the United States has to do more, and I would like to see us move from what is a good start with 10,000 to 65,000 and begin immediately to put into place the mechanisms for vetting the people that we would take in,” Hillary Clinton said.

The idea of helping one’s neighbor is terrific; however, what data will be used to operationalize this vetting process for people who don’t exist in US-owned data sources?  How will we make determinations like “Welcome, Sally, you have been approved for entry.” Or, “Sorry, Bob, but your papers are not in order.”

To find our answer to this difficult social dilemma, let’s take a look at our ability to vet eligibility in government-sponsored Health Care programs (ACA, Medicare and Medicaid). Government entities have extensive data on the citizens served by these programs, and 40 years of experience vetting applicants through, presumably, refined processes.

Or maybe not. On July 15, 2015, ABC News reported that a year-long undercover investigation by the Government Accountability Office (GAO) of the ACA exchanges revealed:

“[T]he agency created 12 fake identities to attempt to obtain healthcare premium subsidies through the HealthCare.gov website. Over the internet and on the telephone, the federal exchange approved 11 of the 12 fraudulent applications, according to the report.”

The GAO findings also noted “the government doled out $2,500 per month or $30,000 per year in credits for insurance policies for these fabricated people, [after] undercover investigators supplied fabricated documents, including proof of income and citizenship, even fake Social Security numbers which were accepted with no questions asked.”

Similar to this story, many reports indicate an inability by federal and state governments to adequately mitigate fraud, waste and abuse in government-sponsored programs. As a result, program integrity losses have ballooned to over $100 billion annually.

With 13.8 million new ACA applicants expected in Q4 of 2016 (up 1.1 million), 32 states adopting Medicaid expansion under the ACA, and potentially stricter immigration policies under a Trump administration, federal and state governments should strongly reconsider their current mechanisms for vetting.

What can be done to improve eligibility vetting?

To be successful with any eligibility issue, federal and state governments  will need a data management infrastructure that provides access to data across programs, products and channels, as well as the analytical tools to detect fraud and eligibility issues hidden in this data.

This won’t require a database overhaul or a massive central data warehouse, but rather a data integration layer that can source from databases around the organization, business partner organizations, and external public or purchased data.

They will also need data quality functions that support entity resolution, as unscrupulous individuals often provide inaccurate, incomplete or inconsistent information to prevent records matching across disparate systems. We need to know who’s who across systems.
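As a rough illustration of what that matching involves, here is a small sketch that uses fuzzy string comparison (Python’s standard-library difflib) to decide whether two records from disparate systems likely describe the same person. The records, weights and threshold are illustrative assumptions:

```python
from difflib import SequenceMatcher

# Hypothetical applicant records from two disparate systems.
system_a = {"name": "William Carter", "dob": "1979-05-02", "address": "12 Elm St, Apt 4"}
system_b = {"name": "Wiliam Carter",  "dob": "1979-05-02", "address": "12 Elm Street Apt 4"}

def similarity(x: str, y: str) -> float:
    """Return a 0-to-1 score of how alike two strings are, ignoring case."""
    return SequenceMatcher(None, x.lower(), y.lower()).ratio()

def likely_same_person(a: dict, b: dict, threshold: float = 0.85) -> bool:
    """Exact date-of-birth match plus fuzzy name/address agreement.

    Real entity resolution weighs many more fields and handles missing or
    deliberately inconsistent data; this only shows the shape of the comparison.
    """
    if a["dob"] != b["dob"]:
        return False
    score = 0.6 * similarity(a["name"], b["name"]) + 0.4 * similarity(a["address"], b["address"])
    return score >= threshold

print(likely_same_person(system_a, system_b))  # the misspelled name still resolves to one person
```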

In the digital age, analytics is needed to keep up with the growing sophistication and complexity of fraudsters and terrorists. The right data management, integration, and quality infrastructure must be supported by a robust business analytics foundation. To successfully analyze vast amounts of granular data, this data infrastructure must be able to:

  • Process large volumes of data quickly.
  • Handle the huge variety of data involved, including tables, documents, e-mail, web streams, videos and more.
  • Manage the velocity of data, which is increasing rapidly.

These data management capabilities are not only critical to everyday government business functions, but also to detecting and preventing nefarious individuals from gaining access to our programs or to the land we love.

The volume, variety, and velocity of data will keep growing, increasing the gap between big data and relevant data. For government organizations without analytical capabilities, this will create an information overload not seen before the digital age. This threatens to increase time to detection and action, and widen the cracks the wicked may exploit.

Eligibility is a big concern for attendees of the National Health Care Anti-Fraud Association Annual Training Conference. If you're there, come visit SAS at booth 213!

3 reasons why International Fraud Awareness Week matters to government

International Fraud Awareness Week is here!

"Comparing my week to International Fraud Awareness Week, Shaun? Please jump in the water to discuss further..." Image by Flickr user Luigi Mengato

I know, I know… Fraud Week is not quite as exciting as Shark Week.  It doesn’t appeal to your taste buds like Restaurant Week. Nor does it have the quirky feel of Brain Awareness Week (nope… I’m not making that one up!!).

Nevertheless, Fraud Week is an important event – one that is worthy of your attention as a leader in government.  Why, you ask?

Fraud Week goes well beyond bringing attention to the fraudulent behaviors we see in our society.  After all, who isn’t at least vaguely familiar with tax fraud?  With healthcare fraud?  With benefits fraud?

I propose that International Fraud Awareness Week serves three essential purposes for government leaders:

  1. IFAW promotes the ideals of integrity and ethical behavior in all of our daily activities.
  2. Fraud Week exposes those people and organizations who take advantage of others by committing any number of fraudulent acts.
  3. IFAW recognizes the men and women -- especially those in government -- who demonstrate their belief in honor and personal integrity by actively working to promote ethical behavior and to hold accountable those who choose to act unethically.

Perhaps now you understand why Fraud Week is worthy of your time and attention.

This week, I’d like to pay particular attention to some of the women and men who think strategically about fraud in government.  They have made great contributions to the cause of fraud prevention.  And, they have some very innovative, practical ideas for you to ponder.

Courtney Kay-Decker leads the Iowa Department of Revenue.  Read the story of how her agency is stopping tax fraud while maintaining excellent taxpayer service.

Jerome Bryssinck is a renowned fraud leader who focuses on government and healthcare. Take a few minutes to read his thoughts on how government leaders can combat customs fraud using big data and analytics.

To recognize this week's National Health Care Anti-Fraud Association (NHCAA) Annual Training Conference, John Stultz writes about how state government leaders can learn from recent large-scale fraud takedowns in Medicare and Medicaid programs. (If you're at NHCAA, go visit SAS at booth 213.)

We've got more ideas and articles that we'll publish throughout the week of November 14.  Stay tuned for even more great thinkers and great ideas!!

Opioid schemes fueled by staged car accidents, insurance fraud

Even auto insurance fraud can be used to score opioids illegally. Image by Flickr user Images Money

Criminal enterprises are tapping into the lucrative opioid business through creative schemes that are less likely to be identified as opioid abuse, misuse or diversion. One of the latest schemes? Auto insurance fraud. First, some background…

While extensive progress has been made in establishing, improving and mandating prescription drug monitoring programs across the United States, criminal schemes are ever-evolving and becoming more sophisticated. Despite countless efforts to distribute naloxone (an opioid antagonist), educate at-risk communities, and improve training and education on the dangers of prescription opioid addiction, the United States still faces rising unintentional overdoses involving opioids.

Years of policy changes, reforms and new funding have done little to curb this national crisis, which affects citizens from all demographics. Prescription Drug Monitoring Programs (PDMPs), which have been successfully implemented in all but one state, provide medical providers insight into a patient’s controlled substance history across the state and, in some cases, even neighboring states. These investments are critical to this widespread, long-term fight against addiction, but they are far from the only approach needed to start saving our afflicted.

Integrating relevant data sources allows for analytics to be used to spot trends and anomalies that can help us combat this epidemic. However, that integration is often prohibited by policy or law. Recent reports from the Massachusetts Department of Health show promise in overcoming these challenges, while still adhering to regulations and respecting privacy laws. Our ability to improve support, interventions and policy changes depends on a deeper understanding of the data, so we can empower those on the front lines of the battle.

Nothing illustrates the sometimes bizarre connections between data sources better than a recent story from my SAS colleague, Michelle Bergeron, a former insurance fraud investigative analyst. I was intrigued to discover that Michelle had experience with the opioid crisis through her work investigating insurance fraud. Not only are criminals exploiting the medical system through diversion techniques (doctor shopping or pill mills), but they are also extending their acquisition of these highly profitable drugs through typical insurance fraud schemes. I’ll let Michelle take it from here:

In 2010, one of my investigators noticed an increasing trend of staged accidents in the Louisville, Kentucky area. A common strategy to detect organized fraud activity is to analyze the attorneys and clinics involved across accidents, looking for commonalities. We immediately recognized that two primary clinics and one attorney were involved in the vast majority of the accidents.

Once we knew which companies were involved, we pulled the medical bills and invoices submitted for the accident claims. There was an unusual component to the majority of the medical bills that made us dig into the data. Almost every person involved in the “accidents” was written prescriptions for Vicodin, Ambien and Soma. That’s an unusual combination for a minor auto accident. Additionally, almost all of the patients treated at the clinics received exactly the same pharmacological combination. The treatment patterns, combined with the fact that over 90% of the policies were three months old or less at the time of the accident, provided ample evidence to open a major case investigation.

Local field investigators took statements from the claimants and got details on their injuries and treatments.  During extensive interviews one of the participants broke down and admitted they were part of an organized staged accident scheme.  Participants were paid a flat fee of approximately $500 and the doctor and lawyer would submit bills on their behalf.  They would go to the pain management office and pick up their scripts shortly after being seen by the medical clinic.  They were allowed to resell the prescriptions on the street or keep them for personal use.

It was a simple way for low income individuals to make some extra cash and to support their personal prescription drug use.  How do we stop people from procuring unnecessary medication or selling it illegally?  This type of fraud is nothing new in the insurance industry and it still occurs today in many areas of the country.  Fortunately, investigators and analysts have great detection tools, such as SAS, that allow them to recognize patterns and anomalies in their data.  That allows them to gather evidence and turn the cases over to law enforcement for prosecution and help keep prescription drugs off the streets.
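Stepping out of Michelle’s account for a moment, here is a minimal sketch of the kind of rule an analyst might use to surface this pattern in claims data. The field names, the drug combination and the 90-day policy-age cutoff are illustrative, not the actual detection logic used in her investigation:

```python
from datetime import date

# Hypothetical auto claim records tied to clinic billing data.
claims = [
    {"claim_id": "C-1", "clinic": "Clinic A", "attorney": "Firm X",
     "policy_start": date(2010, 6, 1), "accident_date": date(2010, 7, 15),
     "rx": {"vicodin", "ambien", "soma"}},
    {"claim_id": "C-2", "clinic": "Clinic B", "attorney": "Firm Y",
     "policy_start": date(2008, 1, 10), "accident_date": date(2010, 7, 20),
     "rx": {"ibuprofen"}},
]

SUSPECT_COMBO = {"vicodin", "ambien", "soma"}
NEW_POLICY_DAYS = 90  # "three months old or less" at the time of the accident

def risk_indicators(claim):
    indicators = []
    if SUSPECT_COMBO <= claim["rx"]:  # the full combination was prescribed
        indicators.append("suspect drug combination")
    if (claim["accident_date"] - claim["policy_start"]).days <= NEW_POLICY_DAYS:
        indicators.append("policy under 90 days old")
    return indicators

for c in claims:
    print(c["claim_id"], risk_indicators(c))
# Shared clinics and attorneys across many such flagged claims are what turn
# individual indicators into an organized-scheme investigation.
```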

It’s both fascinating and disturbing to realize the creativity of the schemes criminal enterprises will use to obtain opioids. Reselling opioids on the street yields much higher margins than their illegal (and more dangerous) counterpart - heroin.

Unencumbered by red tape, data access issues or institutionalized processes, bad actors continue to find ways to avoid detection and maximize profit. While this asymmetry may seem daunting, many hard-working fraud investigators, analysts and state government leaders are onto these schemes. They are striving to identify not only the criminals, but also the life-saving measures desperately needed to reverse the cycle of addiction and end this devastating epidemic.

Opioids are just one of the many topics being discussed at the National Health Care Anti-Fraud Association Annual Training Conference this week. Come visit SAS at booth 213!

What can agencies learn from massive Medicaid fraud busts?

A huge Medicaid fraud bust by the US Dept of Justice can be instructive for states. Image by Flickr user Eric Garcetti

On June 22nd, the U.S. Department of Justice announced the largest Medicaid fraud bust in history. The National Health Care Fraud Takedown included 301 defendants charged with $900 million in false billings, among them 61 medical professionals and 29 doctors, across 36 states.

In another case, investigators in New York uncovered more than $86 million in suspected fraudulent physical and occupational therapy claims to Medicare and Medicaid. Five individuals were charged for allegedly filling a network of Brooklyn clinics with patients by paying bribes and kickbacks. The clinics were controlled by the accused, and once there, patients were subjected to medically unnecessary therapy.

These massive fraud examples are often perpetrated by sophisticated, organized schemes. There is tremendous potential to crack down on health care fraud across the U.S. if federal and state agencies work together and share information about these complex fraud takedowns.

The great thing is, this could be done without sharing data, which is often prevented by law, policy or insufficient technology. There’s nothing stopping agencies from sharing the data elements that were useful in uncovering fraud.

For instance, in the New York example, some of the abnormal things that could prompt an alert when comparing a provider to its peer group include (a small sketch of this peer comparison follows the list):

  • Average payment per beneficiary in the top 90th percentile.
  • High number of unusually large payments within a provider or clinic’s patient population.
  • Average clinical severity of beneficiaries in the top 90th percentile; for example, a provider billing for a high number of urgent care procedures.
  • Average treatment time in the top 90th percentile, indicating possible padding of hours.
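Here is a minimal sketch of flagging providers above the 90th percentile of their peer group on measures like these, using pandas. The columns, values and peer grouping are illustrative only:

```python
import pandas as pd  # assumes pandas is available

# Hypothetical per-provider claim summaries; columns and values are illustrative.
df = pd.DataFrame({
    "provider":     ["P1", "P2", "P3", "P4", "P5", "P6", "P7"],
    "peer_group":   ["PT", "PT", "PT", "PT", "OT", "OT", "OT"],
    "pay_per_bene": [410, 395, 430, 1250, 405, 388, 402],  # avg payment per beneficiary
    "avg_minutes":  [38, 41, 36, 95, 40, 37, 42],          # avg treatment time per visit
})

measures = ["pay_per_bene", "avg_minutes"]

# Flag any measure that sits above the 90th percentile of the provider's peer group.
flags = pd.DataFrame(index=df.index)
for m in measures:
    cutoff = df.groupby("peer_group")[m].transform(lambda s: s.quantile(0.9))
    flags[m] = df[m] > cutoff

# Count suspicious indicators per provider; a preponderance of them, not any
# single alert, is what should push a provider up the review queue.
df["alerts"] = flags.sum(axis=1)
print(df.sort_values("alerts", ascending=False)[["provider", "peer_group", "alerts"]])
```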

Comparing behaviors across the provider population can detect who is possibly doing something outside of the norms of their peers.  A preponderance of suspicious events would help an analyst prioritize risk and focus their analysis on “the big picture”, as opposed to a single alert.

But investigators have to know that those abnormalities are good indicators of possible fraud. That’s the sort of intelligence that can be shared between agencies, without bumping into legal or policy restrictions.

I dive deeper into this topic in the recently published GovExec Route Fifty article Massive Medicaid Fraud Busts Offer Blueprint for States.

Are you a fraud investigator? What useful data elements have you uncovered that your peers should know about? Let’s use International Fraud Awareness Week as an impetus to share best practices and step up our fraud-busting strategies.

Also, if you're at the National Health Care Anti-Fraud Association Annual Training Conference, come see SAS at booth 213!
