School, teacher, student data: Where do we grow from here?

Over the past few months, many US states and districts have received data about student growth and teacher effectiveness. Some educators experience the excitement of outstanding scores and, most importantly, the success of their students’ growth.  Some quietly plug along, satisfied to be meeting growth targets and deciding that if it isn’t broken, there is no need to fix it.  Still others are frustrated, left with unexpected and disappointing results.  Regardless of the results, everyone should take a long, hard look at the data and together ask the same question:  Where do we grow from here?

First, it’s critical to understand the difference between growth and achievement. Achievement is what a test score measures at a single point in time. Growth is determined by comparing students against themselves, looking at the progress they have made relative to their own prior performance across all tested grades and subjects. Knowing and understanding what best defines expected growth, however, is just the beginning.  Leaders must be more reflective and proactive as they evaluate data to make decisions about how to grow from here.
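For those who like to see the distinction in code, here is a deliberately simplified SAS sketch, not the EVAAS methodology itself: achievement is the current test score, while growth compares that score to what the student's own prior performance would have predicted. The data set and variable names (student_scores, prior_score, current_score) are hypothetical.

/* Toy illustration only, not the EVAAS model. The student_scores data set
   and its variables (student_id, prior_score, current_score) are hypothetical. */
proc reg data=student_scores outest=fit noprint;
  model current_score = prior_score;   /* expected score given prior performance */
run;
quit;

data growth;
  if _n_ = 1 then set fit(keep=intercept prior_score
                          rename=(prior_score=slope));
  set student_scores;
  achievement = current_score;                  /* what the test score alone tells us */
  expected    = intercept + slope*prior_score;  /* prediction from the student's own history */
  growth      = current_score - expected;       /* positive = more progress than expected */
run;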

Start with a high-level view of school data

As part of that reflection, SAS® Education Value-Added Assessment (EVAAS®) for K-12 reports provide teachers with an excellent starting point for contemplating the improvement process.  Using EVAAS information, teachers should begin with a high-level view, where they can use broad scatterplots to examine multiple variables.  At this level, teachers can easily see growth compared with overall proficiency, as well as growth and proficiency compared with the entire state.  Educators can also see the growth of various groupings within a school, such as students with special needs or students of different socioeconomic status.
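The scatterplot view itself is easy to picture in code. The SAS sketch below is illustrative only, it is not an EVAAS report, and the summary data set and variable names (school_summary, proficiency_pct, growth_index, subgroup) are hypothetical.

/* Illustrative only, not an EVAAS report. school_summary and its variables
   (proficiency_pct, growth_index, subgroup) are hypothetical. */
proc sgplot data=school_summary;
  scatter x=proficiency_pct y=growth_index / group=subgroup;
  refline 0 / axis=y;   /* boundary between above- and below-expected growth */
  xaxis label="Percent proficient";
  yaxis label="Growth relative to expectation";
run;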

This high-level view yields certain insights. At a school-wide level, are students growing?  Are various subgroups of the school growing and/or changing?  The answers may at first seem to be a simple “yes” or “no”; however, the leader in all of us demands that we then ask, “Why is this happening?” and “What does it mean?”  More specifically, we can ask, “How can I use this data to guide school-wide professional development goals?”

For example, if I am leading a middle school and I look at the math value-added data for grades 6, 7, and 8, what do I see at first glance?  I may see that my 6th grade math is not meeting expected growth, while my 7th grade is exceeding it and my 8th grade is meeting it.

The differing outcomes provoke new questions. Were there curriculum changes in 6th grade that necessitate more teacher support for implementation?  What is 7th grade doing well to grow students more than was anticipated?  Can we/should we perform instructional rounds to observe this grade level? How can 8th grade continue to grow and improve? With a quick glance at each grade level and subject, we can begin to home in on where we are and how we grow from here.

Descend to a teacher-level view

Diagnostic reports for individual teachers reveal how each teacher is growing the lowest-, the middle- and the highest-achieving students.  From here, education leaders can evaluate each teacher’s unique needs and customize his/her professional development, with a goal of continuous improvement. In so doing, we promote and foster professional growth and improve the likelihood of increased student growth.  Why send teachers to professional development that is not specific to their needs?  If we target specific areas for improvement based on the diagnostic data, we better support our teachers and, ultimately, our students.  Furthermore, using teacher diagnostic data can help us build a better master schedule.  With the right students in the right seats in the right classroom, we can see even more effective teaching and improved student growth.

Finish with student-level information

The final piece is student information.  What can we learn about our students? Predictive analytics helps put the right students in the best courses for their future success.  For example, EVAAS data can help determine if the appropriate students are enrolled in advanced math courses that are gateways to their college readiness. By looking at student diagnostic data, paired with teacher diagnostic data, we can determine the best fits to promote student growth.

More than a number

We have often heard teachers remark, “I am more than a number.”  Surely, that is something with which we can all agree, and now we have the tools to make it so.  As we continue to consider teacher effectiveness, student growth and EVAAS, let us use the newly released data in a positive way.  Fortunately, EVAAS provides us with more than a single measure of a single test to help us bring growth over time -- growth of school districts, individual schools, each teacher, and each student -- into its sharpest focus yet.  With the tension and the excitement of the data release (and news media focus) now largely behind us, we must seize the moment and turn this challenge into a rare, limitless opportunity.

Where will we grow from here?

Are APCDs just what the doctor ordered?

With all the changes the Affordable Care Act brings, including new care and payment models, providers’ need for data is increasing.  While some large health systems are able to learn much about a patient’s full course of treatment by integrating their systems, the majority of health care is provided by smaller, less integrated systems. Regardless of size, all systems can benefit from a state claims database, commonly referred to as an All-Payer Claims Database (APCD), now available in 11 states.

Take, for example, something as straightforward as a knee replacement. It requires many visits, and treatment can vary.  The full course of treatment could include: the patient’s primary care visit with the referral to the surgeon, the knee surgeon visit, tests to determine that a replacement is needed, pre-op testing, surgery, follow-up doctor’s visits, prescriptions, post-operative physical therapy, any applicable readmissions, and so on.

What about someone who has a chronic condition like diabetes that spans years of care? How complex could this be to track if many of these facilities and providers aren’t linked?

The state of New Hampshire is giving physicians, providers and other parties the full view of a patient’s path with its Accountable Care Project, a program out of the NH Citizens Health Initiative. Previously, the Initiative sent out a 700-page PDF that had to be searched manually. Now, it provides participating organizations with web-based interactive reporting and ad hoc search capabilities to access the information they need from the state’s APCD and other sources. Providers glean insights into patient care, outcomes and costs that were not possible before. From the UNH website:

Sharon Beaty, the CEO of Mid-State Health Center in Bristol and Plymouth, says the Accountable Care Project gives her organization a much broader perspective on the quality of care it is providing. Because the data includes all claims for patients (even if they received some of their care elsewhere), she can examine trends in use that she would not have been able to see otherwise. For instance, she can look at the percentage of diabetics who received retinal exams—a measure of whether they are receiving proper care. Or she can determine if patients are following up on their providers’ recommendations to get mammograms.

She can also compare the average monthly costs of caring for similar groups of patients at her facility versus elsewhere in the state. “Before we couldn’t tell what value we were providing the system because we had no data to gauge it from,” she says. “Now we do.”

Many of these databases contain health insurance claims information from both public and private insurers.  Because it is claims information, providers would be able to see all the visits a patient had for their knee replacement or their diabetes care, whether or not the patient was seen by a facility or a physician in their own organization.
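With a claims database in place, that full-path view becomes a straightforward query. The SAS sketch below is illustrative only; the library, table and column names are hypothetical and do not reflect any actual APCD schema.

/* Illustrative only; apcd.claims and its columns are hypothetical. Lists every
   claim for one patient, across all payers and provider organizations,
   in order of service. */
proc sql;
  select service_date, provider_org, provider_npi,
         procedure_code, paid_amount
  from apcd.claims
  where patient_id = '12345'
  order by service_date;
quit;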

Over my 25 years in health care, I have heard many health systems, hospitals, and physicians affirm the potential value of claims data.  Not only does this information provide a more complete view of a care path, but its impact would increase exponentially if states offered web-based reporting with query capabilities to providers. Having the state create the infrastructure, improve the data and develop reporting capabilities once, rather than each provider network creating its own, would save substantial health care dollars.  In addition, this model would offer these capabilities to many small organizations that would never be able to build them on their own.

Hopefully, great examples like NH’s Accountable Care Project will spread throughout the states, showing the real value these databases can bring.  What is your state doing, if anything, to improve care and reduce costs by integrating claims data?


Calling all High School STEM educators! Teach Your Students 21st Century Computer Science Skills

STEM skills are essential for many of the fastest-growing and most lucrative occupations. And SAS programmers are in high demand in all fields.

A number of reports have documented a critical talent shortage, especially for graduates with advanced degrees in math, computer science or computer engineering. (See Running on Empty, Report to the President, and various reports from the National Science Foundation).

If you’re a math or computer science teacher, SAS wants to partner with you to help solve this problem. We’re providing high school teachers with training and materials to teach SAS – for free. We are literally giving away computer science course offerings, training and materials, and I want to make sure our education community is aware.

Participating teachers will join a growing network of educators who are providing students with an advantage for their college and career aspirations. By participating in either of the two summer workshops described below, teachers will be qualified to teach a full course in SAS programming, a highly sought-after skill that students can apply in college courses and in many job situations.

 

SAS® Programming Institute for High School Educators (Apply Now)

This train-the-teacher model teaches teachers how to prepare data for analysis and write SAS programs to solve problems. Ultimately, they will be equipped to teach the following high school courses:

SAS Programming 1 for High School – this course teaches students basic SAS programming concepts and tasks, including accessing and manipulating data; producing basic list, summary, and statistical reports; creating SAS data sets; combining SAS data sets; creating basic graphs; and querying data using the SQL procedure.

SAS Programming 2 for High School – this course covers comparisons of manipulation techniques and resource cost benefits designed to help student programmers choose the most appropriate technique for their data situation. This course also teaches students how to process SAS data using Structured Query Language (SQL) and how to use the components of the SAS macro facility to design, write, and debug macro systems that are reusable and dynamic. Emphasis is placed on understanding how programs with macro code are processed.
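To give prospective teachers a feel for what their students will write, here is a tiny made-up example of the kinds of tasks the first course covers: creating a SAS data set, deriving a new variable, and querying it with the SQL procedure. The data and names are invented for illustration.

/* Made-up classroom example: create a data set, derive a variable,
   and query it with PROC SQL. */
data class_scores;
  input name $ quiz1 quiz2 quiz3;
  average = mean(of quiz1-quiz3);
  datalines;
Ana 85 90 78
Ben 72 88 95
Cam 91 84 80
;
run;

proc sql;
  select name, average
  from class_scores
  where average >= 80
  order by average desc;
quit;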

Institute Dates: June 23-27, 2014
Location: SAS World Headquarters, Cary, NC

Through this five-day program, teachers will receive:

  • All teaching materials, exercises, data sets and assessments needed to teach a 15-week block-schedule course or a year-long traditional-period-length course.
  • Use of SAS software at no cost for classrooms.
  • Support from SAS software trainers and curriculum designers.
  • Opportunities to foster relationships with other SAS programming instructors nationwide.

Prerequisites:

  • Teacher or school offers at least one other programming language in addition to SAS (C++, Java, Python, Visual Basic)
  • Students have taken one other programming class before SAS (C++, Java, Python, Visual Basic)

AP Statistics using SAS® (Apply Now)

AP Statistics teachers will learn how to reinforce the concepts needed for the AP exam using SAS® Enterprise Guide®, a point-and-click interface to the power of SAS. This technology will illustrate and expand upon the statistical concepts taught through hands-on, interactive exercises.

Workshop Dates: August 4-6, 2014
Location: SAS World Headquarters, Cary, NC

By attending this three-day workshop, teachers will receive:

  • All teaching materials, exercises, data sets and assessments used in the workshop.
  • Use of SAS software at no cost for classrooms.
  • Support from SAS software trainers and curriculum designers.
  • Opportunities to network with other AP stats instructors.

You can’t commit to three or five days for these workshops? Check out other ways to take advantage of SAS’ education philanthropy here.


"March madness" of student course enrollment gets assist from value-added assessment

As teachers head into the madness of student course registration, the madness of college basketball reinforces a critical point: Data is crucial to making the picks that lead to a winning bracket, and it is just as crucial to student growth. Value-added assessment has proven reliable in determining which students are ready for their "one shining moment".

This is one of my favorite times of the year.  I love the excitement of the NCAA tournament, the thrilling finishes, the Cinderella stories… I enjoy the bracket challenges, despite my lack of success. My son tells me to “Think, Mom!” because he knows I pick my teams based on emotion.  I admit, I pay no attention to team data. My son, on the other hand, is all about the numbers.  He knows the stats, the teams’ strengths and weaknesses and the positional match-ups that decide games. His bracket defeats mine every year!

The brackets reminded me of the 8th grade registration packet my son recently had to fill out.  He circled Algebra I on his course guide. It was no surprise as he loves math, and it is one of his best subjects.

As a parent, it was easy for me to sign off on his choice, but I wondered how we, as teachers, select courses for our students.  Teachers know their students well, and many of these recommendations have been successful, but do we sometimes let our emotions and subjectivity guide our recommendations?  Do we worry we may have held back a student whose potential we did not recognize?

Today, reliable analytics such as EVAAS can help us be proactive about course enrollment decisions that promote student growth.

A few years ago, Wake Forest Rolesville Middle School decided to use value-added assessment to determine which students should be enrolled in Algebra I as 8th graders.  Like many schools and districts, its policies around enrollment relied on math placement as early as 5th grade, and on teacher recommendation.

In the first year, WFRMS tripled enrollment in 8th grade Algebra I, from 50 students to 150, and 100% of those students passed the course. Do that in your bracket pool and Warren Buffett gives you $1 billion!

The effort is ongoing in five large North Carolina districts, where Algebra I enrollments have doubled and tripled, and proficiency rates are above 95%.

According to a report from the U.S. Department of Education, “Taking advanced math courses in high school was more strongly associated with successful completion of college than any other factor, including high school grade point average and socioeconomic status” (Adelman, 1999).

As we begin to plan for next school year with our students, let's use the data.  Data provides insight that our human biases can miss. Our students deserve every opportunity to be challenged and to be supported as they work to grow academically. We chose to enter our profession to change the lives of our students.  Now, more than ever, we have data to help drive the art of teaching.

I could still use someone to explain the Mercer-Duke game to me, however.


Student growth measures can be the bridge to new assessments


As I embark on 2014, I reflect upon the many competing, yet interdependent, tensions discussed in education circles in 2013. In conferences, classrooms and statehouses, adults who care about kids debated the best ways to implement:

  • New academic standards (Common Core State Standards or other College and Career Ready Standards)
  • New curriculum and technology-rich instructional resources
  • New assessments aligned to the new standards
  • New educator evaluation systems based in-part on academic achievement and/or growth on the new assessments
  • New school accountability systems based in-part on academic achievement and/or growth on the new assessments

In fact, eight of Education Week’s Top 10 State K-12 Blog Posts in 2013 fell into the above categories. The tension I hear most frequently as a former teacher is the connection of new teacher evaluation systems to new Common Core-aligned assessments, as described well by Andrew Ujifusa: Common Core and Evaluations: Are Teachers 'Going Crazy'?

Since coming to SAS, I now understand how this tension and seemingly valid concern is rooted in a common confusion between achievement and growth. As I have come to better understand growth models, I believe some of them can be the bridge we need to gauge student performance across changes in standards and assessments.

Student achievement cannot be reliably compared from an old test to a new test. Educators and Ed Policy folk all expect that student achievement (test scores) will drop at the onset of more rigorous assessments that measure students against more rigorous standards. If teachers feel their evaluations will be tied to these lower test scores (achievement), they have reason for concern.

However, many evaluation systems rely more heavily on student growth measures than on achievement. Student growth can be reliably compared from an old test to a new one. Some student growth measures already incorporate a wide range of assessments across grades and subjects that change over time. They look at students’ position in a statewide distribution from year to year to make apples-to-apples comparisons. While proficiency rates may drop, teachers and schools can still show high growth. For example, high growth could be shown when a group of students moves from the 50th percentile on the old test to the 55th percentile on the new test. Reliable growth/value-added measures can serve as this bridge.
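Here is a rough SAS sketch of the idea, not the actual EVAAS/TVAAS methodology, with a hypothetical data set (scores, containing student_id, year and score): convert each year's scores to percentiles within that year's statewide distribution, so positions can be compared even when the test itself changes.

/* Rough sketch only, not the EVAAS/TVAAS model. The scores data set and its
   variables (student_id, year, score) are hypothetical. PROC RANK converts
   each year's scores to a 0-99 percentile group within that year's
   distribution, so a student's position can be compared across a change in
   the test itself. */
proc sort data=scores;
  by year;
run;

proc rank data=scores groups=100 out=ranked;
  by year;
  var score;
  ranks pctile;
run;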

The sky is not falling. This is not the first time states have changed standards or assessments. Student growth has been measured for over 20 years in Tennessee across many of these changes. See how in the blog post, Transitioning value-added and growth models to new assessments. Tennessee’s Commissioner of Education, Kevin Huffman, clearly established this point at a recent Annual Policy Forum put on by the Council of Chief State School Officers:

“Everyone assumes that with new tests, achievement will be low, and then value-added will be low. That’s not exactly how it will play out. We now need to communicate that effectively….A few years ago, we used to have 90% of kids proficient on state assessments, but then performed very low on NAEP. We already raised the bar in 2010 (by establishing more rigorous standards and assessments) and we went from 90% to 30% proficiency overnight….There was initial noise and frustration, but then it stopped and everyone survived. By the time I got to TN in 2011, people weren’t even talking about the lower test scores, they were talking about- how do we now get better?”

And indeed Tennessee ‘got better.’ Public TVAAS reporting provides scatterplots of growth and achievement data that show achievement and growth rates improving since 2010. In November 2013, Tennessee showed the largest academic growth on the 2013 National Assessment of Educational Progress (NAEP) of any state, making Tennessee the fastest-improving state in the nation. Leaders in Tennessee know that raising the bar and revealing initially lower achievement is a necessary step in the improvement process. It’s the right thing to do for kids. States that are about to embark on this work in 2013-14 should keep in mind that this has been done before, lessons can be learned, and the sky is not falling so long as we have reliable growth measures to serve as our bridge across assessments.

Taking local governments from Performance Management to Strategy Management

Performance management systems are becoming more important to local governments across the country. This is true for several reasons.

  • Citizens are calling for a more accurate accounting of how their tax monies are being spent.
  • Local government revenues have not been growing as much as in the past and, in some cases, are declining.
  • Citizens are demanding better service delivery at lower costs.
  • There is a general demand for more transparency in the activities and decisions of local government.

The result has been increased interest by local government managers in better understanding how their organizations and employees are performing.

Performance management systems are organized around developing Key Performance Indicators (KPIs) related to various public services: for example, the number of tons of solid waste collected, tons collected per collection route, the number of Part I Crimes solved, the percentage of crimes committed that are solved, and so on. In addition, performance management systems track KPIs over time to identify trends in the various service areas and to look at improvement over time with additional resources or new programs.  These systems allow local governments to see how the organization is performing over time and make better decisions about the allocation of resources.
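As a simple SAS illustration, with a hypothetical data set and variable names rather than any particular city's system, a single KPI such as tons collected per route can be rolled up by quarter so the trend is easy to review:

/* Illustrative only; solid_waste and its variables are hypothetical.
   Rolls one KPI, tons collected per route, up by quarter to expose trends. */
proc sql;
  select year, quarter,
         sum(tons_collected) / count(distinct route_id) as tons_per_route
  from solid_waste
  group by year, quarter
  order by year, quarter;
quit;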

But these systems can be difficult to sustain across an organization. The key to creating and maintaining them, and applying them to all services, is to have the right technology. Data management and analytics can integrate data from all application systems (Computer Aided Dispatch, Records Management Systems, Work Order Systems, Financial Systems, PC-based data, etc.), track these data sources over time, identify trends in performance, and predict patterns of future performance.

The next step is for local governments to translate performance management into strategy management. This requires the organization to relate the data gained from enterprise-wide performance management to the overall strategies of the community.  For example, if a community has an overall public safety strategy to become the safest city in the state with a population over 100k, do all of the public safety KPIs, goals and objectives fit that strategy, and how are they performing? The only way to map the KPIs, goals and objectives to the overall strategies is through the use of technology designed to identify and track all of the data and to relate the appropriate data to the appropriate strategy.

The progression from performance management to strategy management moves a community closer to reaching its goals, not just collecting data that verifies the water is still running and the lights are still on.


New teaching challenges may show your personal best is yet to come

As I crossed the finish line, I could feel the tears welling up.

“Don’t do it," I thought. "Athletes don’t cry."

Somehow, I managed to pull myself together, but instead of my usual post-race celebration of high fives and cheering on other runners, I walked to the race result board without making eye contact with anyone.  My instinct was confirmed--my worst time ever.

I wanted to be proud that I at least finished the race after significant time away from running, but I wasn’t.  I had not met my goal.  And then I began to rationalize--everything.  Let's see. I had not been running in months.  With a new job, busy children, and everyday chores, there was no time to train.  I am rapidly approaching forty, have put on some weight and often skip breakfast.  All right, I admit it; the old gray mare, she ain’t what she used to be! Well, no wonder I had a less than desirable time. I should be glad that I tried at all.... Wait!

Suddenly, I realized I was doing the unthinkable--justifying mediocrity with excuses.  Rather than owning up to reality, I was willing to accept my unacceptable time and chalk it up to a plethora of unreasonable reasons.  At that moment I made a choice.  I wrote my time down in my journal with a note that read, “This was your time today.  Now what?”  I set a new goal.

In the past, I knew that I achieved success by asking questions, researching training plans that worked for me, and being reflective.  I started doing all of those things again, but still, results were elusive.  It was so much more difficult to stick with the plan because of all the changes I have experienced both personally and professionally over the past couple of years.  I had to find a way, so I did something completely out of my comfort zone.  I sought help and asked to start running with a group.

Mind you, I am a people person to the core, except in running.  Running is my time, and group running puts every insecurity I have out there for other more skilled runners to see.  But, working alone, I was not seeing success in my training plan during this time of transition, so, reluctantly at first, I ran with runners.  Some were faster.  Some were more competitive.  From each, I learned something new and important about recovery and re-commitment to the sport.  I found comfort and confidence in knowing that others struggled as I did.  Most importantly, I found myself being reflective, learning from others, and appreciating help that I needed.  I found accountability.  I found support.  I found the runner I had buried beneath a pile of excuses.

As I sat down last evening to catch up on my educational research and readings for the week, I started thinking about how many changes to curriculum, pedagogy, and evaluation are occurring for educators.  Just then, one of those “light bulb” moments we talk about as teachers happened.  Just as I had experienced great transition over the past couple of years, so has the teaching profession I love.  The transition into Common Core, new assessments, and more accountable evaluations has taken center stage.  Through blogs, media, conferences and conversations, teachers are working diligently to develop new training plans for teaching to the Common Core State Standards.

As teacher evaluations are increasing teacher accountability, educators are feeling that same fight-back-the-tears feeling I experienced at the race finish line.  Amazing teachers who have always been effective are struggling in ways they may never have known.  I thought about the lessons learned from my recent running breakdown.  Things change.  Transition is difficult.  Finding your groove through all of this is challenging, exhausting, and at times, heartbreaking.  It becomes too easy to make excuses and avoid accountability that may never again be what it once was.

We fear that we are ill-equipped to run the race of new standards and new measures of teacher effectiveness.   We do what comes naturally to us; we begin to rationalize and justify.  We may feel we are running in the wrong direction.  We may feel that we taught the best possible lesson, only to find that assessment results prove otherwise.  We may doubt our effectiveness.  We may curse the new assessments, and tears well up.  Instead of thinking about all the reasons why we can’t, shouldn’t we accept the reality that change is here and resolve to do something positive?

Perhaps we should think about how to be more reflective and learn from the struggles we may face through the first year of implementation.  Perhaps we need to come out of our comfort zone, seek help, and work with colleagues and instructional coaches to build a better training plan.  Looking at real data, whether we like it or not, helps us to know where to start to rebuild and grow.

The race is on. After all, isn’t learning, reflecting and growing what we expect most from our students?

Lost in translation: Interpreter business rife with fraud

Another day, another scam defrauding insurers and governments.  For purposes of full disclosure, the case I'm highlighting today comes from Washington's Labor and Industries (L&I), the agency where I formerly worked and headed up fraud prevention efforts, and the investigation dates back to my tenure.  During my time there, I saw so much fraud and abuse from interpreter agencies that I once said I would give a reward to anyone who could find an honest one.  That's hyperbole, but it reflects the exposure that workers' compensation insurers, Medicaid and Medicare face from providers, particularly provider types with lower credentialing requirements and huge risk for fraud.

The latest case from L&I comes in at $600,000 in value.  The owners of Hispanic Voices, the firm providing services (sometimes) and billing (all the time!), used a mix of schemes.  These ranged from phantom billing for services never rendered, to billing two or three times as long as real services took, to billing for services provided by non-certified interpreters (aka grab your cousin and start interpreting).

Other cases I've seen showed all sorts of bad behavior.  Billing for interpretation services from the same interpreter in multiple locations at the same moment (very impressive!).  Charges for significant travel time to provide a service that never occurred.  Some of the more interesting schemes involved recruiting non-English-speaking workers in smaller communities to file false workers' compensation claims, with the same primary care providers and interpreters billing and everyone splitting the ill-gotten gains.

Okay, enough bashing of interpreters.  There are plenty of other high-risk provider categories and treatments out there - durable medical equipment, physical therapy, chiropractors, spinal cord stimulators and the like.  While rates are lower in some specialties, known fraud cases run the gamut of virtually every type of procedure and provider.  So, what to do about it?

Beyond increasing requirements for licensing, background checks and the like, the real opportunity is to ensure that data sets are analyzed appropriately.  Bouncing billing time for services rendered by the interpreters off the addresses of the patient and doctor, as well as the services provided by the doctor, begins to show a range of what may be likely on an individual billing.  Going further to compare the number of licensed interpreters against billed hours, which can regularly show more than 24 hours worked in a day, helps as well.  Good analysis of patient populations, need and risk continues to expand the picture.  For government agencies, matching against other data sets - like revenue/tax base, ownership and employee reporting - quickly begins to show the gaps that reflect the fraud.
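The "more than 24 hours in a day" check, for instance, is a one-query flag. This SAS sketch uses hypothetical table and column names, and the result is a lead for review rather than proof of fraud:

/* Sketch only; interp_claims and its columns are hypothetical. Flags
   interpreters whose billed hours exceed 24 in a single day, a physical
   impossibility that warrants investigation. */
proc sql;
  select interpreter_id, service_date,
         sum(billed_hours) as total_hours
  from interp_claims
  group by interpreter_id, service_date
  having calculated total_hours > 24
  order by total_hours desc;
quit;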

These approaches work regardless of the type of provider.  However, beginning with the areas of biggest risk can help significantly.  One state, working with SAS, began to segment Medicaid billings and started with high-risk areas.  As an example, non-emergency medical transportation was a significant cost.  When that area was properly analyzed for risk and outliers, with predictive models built from past cases, the state was able to save tens of millions of dollars annually.  From just one type of service!

The best approach is to use multiple data sets, multiple approaches and ensure that fraud detection is continuous monitoring, not just ad hoc runs.  The latter produce successes, but fail to guard against future risks, serving more like fraud Whack-a-Mole than a true plan.


Bold efforts, hard work pay off for Tennessee education, and my daughter

With the others, I filed into the school gymnasium, my super zoom camera lens at the ready and a nervous smile on my face. Across the room, I caught a glimpse of my unsmiling daughter, and my apprehension grew about how this awards day program would play out for her.

My daughter does not love school, and the past few years have been an academic struggle for both of us--the tears, exhaustion, frustration, laborious tutoring sessions and below average grades. Several times we both felt ready to give up.  As a mom, I hurt for her and I have propped her up with all the nurture and encouragement I can muster.  As a veteran teacher, I found myself falling short.  I enlisted help from her teachers, her principal and an excellent tutor.  Together we worked diligently.  It has been a continuous and challenging journey to say the least.  I sat there waiting for the program to start.  Wondering.  Pondering.  Reflecting. Hoping.

Then, I did what all working moms do--checked my phone one last time before the program.  An email--“NAEP (National Assessment of Educational Progress) student growth results”--caught my eye. The anxiety and uncertainty I had about my daughter's progress multiplied. This was the Nation’s Report Card! I was most curious to see Tennessee’s results because educators there had chosen to take a remarkable risk and implement the Common Core State Standards a year ahead of other states.

I had talked with teachers in Tennessee about the countless struggles they had worked through over the past few years.  They told me of the angst they experienced as they grappled with their professional desire to exceed effectiveness expectations, newly enacted assessments and evaluations, and the difficult task of implementing new, more rigorous standards.  Teachers told me they were often exhausted, frustrated, overwhelmed, and sometimes ready to give up.  At the same time, caustic blogs bombarded Common Core, TVAAS, evaluations, and new initiatives.  Critics questioned whether drastic transition in both curriculum and accountability was going to improve education for Tennessee students.

I pictured those teachers and education leaders throughout Tennessee waiting apprehensively before scanning the NAEP results. Wondering. Pondering. Reflecting. Hoping. Had the struggle been worth it? Then I saw it. U.S. Secretary of Education Arne Duncan had specifically mentioned Tennessee as the state showing the most NAEP student growth in the nation.  He noted that Tennessee has shown continuous growth over the past three years on the TCAP (Tennessee Comprehensive Assessment Program).

After all the challenges of implementation and of policy changes, Tennessee has taken a giant leap forward in student growth.  Using tools like TVAAS has helped teachers and administrators more effectively use the data available to focus not only on proficiency but also on growth.  Teachers have worked together not only to implement new standards but also to help each other grow professionally.  The Tennessee Department of Education has worked to provide support and accountability and to ensure student growth and effective teaching.  Their methods and their progress have both been validated.

Although I live in North Carolina, I cannot help but beam with pride for Tennessee. You put students first, made bold choices and charged ahead of the nation. And you have shown us all that student growth is possible through change, and in spite of it. Thank you, Tennessee. Awesome job! Was it, and will it continue to be, a challenging journey?  Most definitely.  Was the anguish you experienced along the way worth it? Absolutely.

As the awards program in the school gymnasium ended, I glanced across the floor at my daughter. She was smiling about school for the first time in a very long time.  After so many challenges, obstacles, tutors and endless homework help, she made her first B in Math and she almost made the Honor Roll.  Oh yes, she also received an award.  Okay, it was for perfect attendance, but it's a start.  Awesome job!


To share or not to share? The tightrope of fighting government fraud

Data.  Google uses ours every day, and most people aren't concerned.  When our government is looking over our shoulders, however, tensions rise quickly.  On one end lie the recent scandals with the National Security Agency (NSA), which is apparently spying on you, me, and Angela Merkel.  On the other lies case after case of failing to watch the gates and letting billions of dollars of fraud, waste and inappropriate payments go out the door.

The total size of the problem is really unknown.  However, there are a number of studies that try to put things into perspective.  According to the federal government, the 3-year rate of erroneous payments nationally sits at 7.1%, down from 9.4% just a couple of years ago.  The goal for 2013 is 6.4%.  However, put into dollars, that is $19 billion going out the door that shouldn't be.  Unemployment is even worse, with the latest annual study from the U.S. Department of Labor pegging the national rate at 10.8%, although it claims an unrealistically low 2.85% is actual fraud.  Unlike the Medicaid numbers, this study doesn't hide the results for individual states, which ranged from a low of 3.7% in Vermont to a high of nearly 22.8% in Pennsylvania.  Shockingly, a study of the SNAP (food stamps) program puts error rates in the low single digits for every state, and at just 2.77% nationally for 2012.

Not surprisingly, most of the states I've talked to dispute those federal studies, particularly the detailed numbers in the unemployment report.  Earlier in the year, one state pushed back at legislative efforts to address fraud in the Medicaid program, stating that there was "no proof" that they "even have a problem".  Maybe the study in Medicaid should go down to the state level as well?  Almost all of these studies are definitely understating the problem.  Truly knowing the rate of fraud and abuse is difficult, but most experts in the field and economists who study it peg it at 10% or higher.  Malcolm Sparrow of the John F. Kennedy School of Government at Harvard has written and spoken about these issues at length.  He has testified before Congress about the weaknesses of evaluative methodologies in use in health care, particularly in the Medicaid and Medicare programs, and noted that fraud could be 20 or 30%.

Clearly, a balance must be struck between constant monitoring of the populace at large for a threat that may or may not exist, and failing to mind the gates.  Somewhere in the middle is an approach based on risk.  Individuals receiving significant funds from government programs, regardless of which programs, present a risk for fraud and abuse.  If you, or I, aren't collecting unemployment, or food stamps, or welfare, or on Medicaid, we aren't in the expenditure column, and shouldn't have all our private information pored over.  On the other hand, if you are choosing to receive government benefits, or are a business or medical provider billing significantly to provide services in those programs, an exposure exists.  Some basic level of analysis for fraud and risk should be expected.

In fact, the better the analytics running such a program, the less intrusive it is on the consumer or business.  Banks operate the same way - the vast majority of our credit card transactions go through very easily, and we don't receive annual audits, because of good proactive detection operations and link analysis to connect people, locations and payments, along with looking at past patterns.  When something looks like an anomaly, we receive a phone call to verify a transaction.  Government can and should operate the same way - reduce the number of intrusive audits and investigations, handle most transactions seamlessly, use a light touch with early interventions, and then hit the real risks hard.

One of the critical paths to doing this well is sharing data across programs.  Seeing the view of an individual from the property they own, the car they just licensed and the business they own, and comparing that to whether they should be receiving food stamps, is critical.  Viewing provider networks to see when false billings for medical equipment are happening, or when a grocery store is trafficking in food stamp cards, protects against much larger exposures.  This also protects the rest of us from being victims of identity theft for fraud of government benefits or false tax refund filings, which now generate 43% of all identity theft complaints in the U.S.
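A minimal SAS sketch of that kind of cross-program match, assuming hypothetical tables for benefit recipients and vehicle registrations; a match is a risk flag for further review, not a finding of fraud:

/* Sketch only; benefits and vehicle_registrations are hypothetical tables.
   Matches benefit recipients to recently licensed high-value vehicles as a
   risk flag for review, not as proof of wrongdoing. */
proc sql;
  select b.recipient_id, b.program,
         v.license_date, v.vehicle_value
  from benefits as b
       inner join vehicle_registrations as v
       on b.recipient_id = v.owner_id
  where v.vehicle_value > 40000
    and v.license_date >= '01JAN2013'd;
quit;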

Some positive steps are happening.  Earlier this year, the IRS announced a change that would allow matching of federal tax filing data with any program that receives even partial federal funding.  That benefits broad programs run by the states - everything from welfare to unemployment to Medicaid.  A change in federal rules went from prohibiting state Medicaid Fraud Control Units (MFCUs) from utilizing any form of data mining to detect fraud to specifically allowing it, with federal matching funds to cover 75% of the cost of doing so.  The fact that a prohibition ever existed on doing the right thing, as opposed to relying on manual referrals from Medicaid staff or the public, is so backwards that only government could have thought it up!

However, despite increasing support from the federal government and easing of rules, at the state level, privacy concerns and very strict interpretations of laws continue to hamper efforts to bring data together.  Some states have made more strides - Washington State uses an integrated business identifier, and employs sharing across a number of different agencies.  California has an integrated task force that not only shares data, but does joint enforcement actions.  A law passed there this year lets the data sharing go much further.

The key is for managers of agencies across the nation to see the value in minding the store, and push hard to follow the laws, protect data from breaches and public exposure while at the same time improving sharing to gain a broader and more accurate view of recipients of services.  Long-term, this can not only protect against fraud, waste and abuse, but also lead to providing the right mix of services to gain the best outcomes from programs at the lowest cost.  After all, that's why these programs exist in the first place, right?
