An analytic approach to improving mental health access, quality and costs

Part 1: The challenge and the opportunity

Mental illness continues to profoundly affect the nation’s population and, for the most part, remains greatly under-analyzed. This is the first entry in a series about the mental health problem in the US and how an analytic approach can improve care for the mentally ill and reduce the associated costs.

In 2012, an estimated 20% of adults aged 18 or older in the US (approximately 44 million people) were living with a mental illness.1 This figure does not include substance abuse, which would drive the total higher.

The numbers do not differ much for children. A recent study from the CDC indicates that somewhere between 13% and 20% of all children living in the US had experienced a mental disorder in the previous year. Within this population, only 38% of adults and less than 20% of children and adolescents are treated for their mental illness.2

Those who go untreated for mental illness bear a cost to themselves and, potentially, to the rest of society. They remain at high risk for suicide, alcohol and drug abuse, and violent or self-destructive behavior, and they are more likely to become homeless or incarcerated.

The financial impact is evident. As of 2008, an estimated $60 billion was spent on mental health, compared to $35 billion in 1996.3 In addition, US employers lose almost $22 billion every year in reduced productivity due to mental illness,4 a direct result of an estimated 217 million work days lost or partially lost to mental illness each year.5

The social impact extends further, to family, friends, work, school and public services. And while much has been said in the news about violent crime and the mentally ill, only about 3-5% of violent crime is actually attributable to someone with severe mental illness. In fact, people with severe mental illness are almost 10 times more likely to be victims of violent crime than the rest of the population.6

While successful initiatives have been launched to create awareness and address mental illness in a more comprehensive manner, much more can be done to support and effect positive change across the entire mental health spectrum. Integrating readily available data from clinical, claims and other government, commercial and social systems can lay a foundation for deriving valuable insight and targeting mental health initiatives surrounding:

  • Access to Care
  • Patient Coordination & Quality of care
  • Cost Containment

Applying advanced data management, analytics and visualization technology to this data could lead to significant improvements in care and payment delivery across the mental health areas listed above.
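
To make the integration idea concrete, here is a minimal sketch of the kind of data linkage this foundation enables. The file names, column names and the simple “diagnosed but untreated” metric are purely illustrative assumptions on my part, not a reference to any real system; the only domain fact relied on is that ICD-10 codes beginning with “F” denote mental and behavioral disorders.

```python
import pandas as pd

# Hypothetical extracts -- file names and columns are illustrative only.
claims = pd.read_csv("behavioral_health_claims.csv")   # member_id, service_date, procedure_code
clinical = pd.read_csv("ehr_diagnoses.csv")            # member_id, diagnosis_date, icd_code

# Members with a mental health diagnosis recorded in the clinical system
# (ICD-10 F-codes cover mental and behavioral disorders).
mh_dx = clinical[clinical["icd_code"].str.startswith("F", na=False)]

# Members with at least one behavioral health treatment claim.
treated = set(claims["member_id"])

diagnosed = mh_dx.drop_duplicates("member_id").copy()
diagnosed["treated"] = diagnosed["member_id"].isin(treated)

# A simple access-to-care gap metric: share of diagnosed members with no treatment claim.
gap_rate = 1.0 - diagnosed["treated"].mean()
print(f"Diagnosed but untreated: {gap_rate:.1%}")
```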

Ultimately, an analytic approach can lead to better outcomes not just in mental health, but in physical health and social services as well. I’ll delve into access to care in Part 2.

References
1. http://www.nimh.nih.gov/statistics/1ANYDIS_ADULT.shtml
2. http://www.mentalhealth.gov/basics/myths-facts/index.html
3. http://www.aha.org/research/reports/tw/12jan-tw-behavhealth.pdf
4. http://www.aha.org/research/reports/tw/12jan-tw-behavhealth.pdf
5. http://www.aha.org/research/reports/tw/12jan-tw-behavhealth.pdf
6. http://www.mentalhealth.gov/basics/myths-facts/index.html

Analytics making a difference: NYC finds Good Shepherd Services program reduces homelessness, jail stays

A recent project with a supportive housing provider in New York City showed how analytics leads to insights that can change, even save, lives.

The New York City Center for Innovation through Data Intelligence (CIDI) is the analytics research arm of the City’s Deputy Mayor for Health and Human Services (HHS). The Center’s goals are to improve coordination and quality of HHS services, inform citywide policy, analyze cross-agency policy issues and conduct independent research. CIDI partners with other agencies and organizations to tackle questions that can only be answered through data and analytics.

“Our partnership with CIDI has shown us the power of merging our program data with robust health and human service data.  We always knew that our Chelsea Foyer program worked, but now we have the results to back it up.”  - Sr. Paulette LoMonaco, Executive Director, Good Shepherd Services

Young people who are on the verge of aging out of foster care, are homeless, or are struggling with mental illness face unique challenges when it comes to successfully transitioning to adulthood. They may therefore require specialized forms of support to ensure they go on to lead more productive, focused and successful lives.

One promising intervention for these at-risk youth is supportive housing: programs that combine housing with targeted services. However, experts have struggled to grasp the full impact of such programs because of limited and inconsistent data on a wide range of outcomes, the difficulty of identifying appropriate comparison groups, and other methodological challenges.

The Chelsea Foyer at the Christopher, developed by Good Shepherd Services, is an innovative, trauma-informed supportive housing program in New York City that serves 40 young adults between the ages of 18 and 25 who are aging out of foster care, homeless or at risk of becoming homeless. Residents at the Foyer can live there for up to two years and access an array of youth development services, including workshops on life skills, finance and employment, to prepare them for independence.

Good Shepherd Services is a recognized leader in the development of innovative programs that make a difference in the lives of children, youth and families. Each year, Good Shepherd Services works with more than 26,000 New Yorkers through over 80 programs in the Bronx, Brooklyn and Manhattan.

Working together, Good Shepherd Services and CIDI conducted research comparing the outcomes of Chelsea Foyer participants with those of a comparison group of individuals who were eligible for supportive housing but never placed in such a program.

Controlling for other factors, CIDI found that Foyer participants were 36% less likely to have a stay in the single adult shelter system and 55% less likely to go to jail over a two-year period than members of the comparison group.
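
CIDI’s actual methodology isn’t reproduced here; the sketch below simply shows the general shape of an adjusted group comparison like the one described. The file name, the participation flag and the covariates are entirely hypothetical placeholders.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical analysis file: one row per young adult, combining Foyer
# participants and comparison-group members. All column names are illustrative.
df = pd.read_csv("youth_outcomes.csv")
# expected columns: foyer (0/1), shelter_stay (0/1), jail_stay (0/1),
#                   age, female (0/1), prior_shelter (0/1)

# Adjusted comparison: logistic regression of the outcome on program
# participation, controlling for observable differences between the groups.
model = smf.logit(
    "shelter_stay ~ foyer + age + female + prior_shelter",
    data=df,
).fit()

# Odds ratios below 1.0 for 'foyer' indicate lower odds of a shelter stay
# for participants, holding the other covariates constant.
print(np.exp(model.params))
```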

The preliminary results from this evaluation have promising programmatic and methodological implications. The lower rates of homeless shelter and jail stays for Foyer participants relative to their comparison group peers suggest tremendous benefits to young adults in supportive housing. This study also exemplifies how administrative data can be used to track participant outcomes, even for smaller scale programs.

CIDI is a member of the University of Pennsylvania’s Actionable Intelligence for Social Policy network, a group of public sector and academic organizations that research ways to “improve the quality of education, health and human service agencies’ policies and practices through the use of integrated data systems.”

CIDI has flexed its research muscles well beyond the supportive housing project. Other projects include:

  • A longitudinal, comparative study of cross-over youth in child welfare and juvenile justice, in partnership with two other cities. The study found that youth who entered foster care for the first time at age 13 were 9.0 times more likely to have juvenile justice involvement than children placed as infants. This finding helped strengthen the Administration for Children’s Services’ $23 million investment in specialized teen preventive services.
  • Data support for the Young Men’s Initiative that improved understanding of services aimed at decreasing racial disparities for young men of color.
  • Maps of service need and utilization in vulnerable neighborhoods, used to inform program and policy planning for cross-agency coordination efforts.

CIDI is not just making a difference in its own organization; it is helping other agencies make a difference and positively impact countless lives with the insights it provides.

This is the first in a series of posts profiling individuals or teams in state or local government that are using analytics to make a difference in their agencies and beyond.


Advocating for a robust value-added implementation

Recently, the American Statistical Association (ASA) released a statement about value-added modeling. This statement was widely covered in the national press, some of which positioned the statement as a significant blow to value-added modeling. However, the ASA statement did not “slam” value-added modeling; rather, the statement’s authors advocated statistical rigor, responsible implementation, and expertise in providing the models. We at SAS agree with these principles, and the EVAAS models largely adopt ASA’s recommendations.

In response to such press coverage, the ASA clarified its intent in publishing the statement and what the statement actually recommended in one of its community blogs. The majority of the ASA’s blog is reposted here:

Last week, the ASA Board of Directors adopted an “ASA Statement on Value-Added Models for Educational Assessment.” What the statement says, and why the ASA makes such statements, are the topics of today’s ASA at 175 blog.

As noted in the ASA’s press release on the statement, use of value-added models (VAMs) has become more prevalent, perhaps because these models are viewed as more objective or authoritative than other types of information. VAMs attempt to measure the value a teacher adds to student-achievement growth by analyzing changes in standardized test scores. VAMs are sometimes used in high-stakes decisions such as determining compensation, evaluating and ranking teachers, hiring or dismissing teachers, awarding tenure and closing schools.

The ASA position statement makes the following recommendations:

  • Estimates from VAMs should always be accompanied by measures of precision and a discussion of the assumptions and possible limitations of the model.
  • VAMs should be viewed within the context of quality improvement, which distinguishes aspects of quality that can be attributed to the system from those that can be attributed to individual teachers, teacher preparation programs or schools.
  • The ASA endorses wise use of data, statistical models and designed experiments for improving the quality of education.
  • VAMs are complex statistical models, and high-level statistical expertise is needed to develop the models and interpret their results.

The story has already been picked up in several places.

These articles, and others that will appear, reflect the controversial nature of this issue and the way position statements such as this one are interpreted through the lens of a writer’s own stance on the matter. One education writer wrote us an extremely cranky note about how “spineless” the VAMs statement was. He used several more adjectives to describe his displeasure.

The ASA is not in the business of determining educational policy, but we are very much in the business of promoting sound statistical practice. Our descriptive statement about the ASA notes that we promote “sound statistical practice to inform public policy and improve human welfare.” This statement on VAMs urges people to think carefully about the uses of these models and to engage with statistical experts, because such models require expertise to use correctly. Especially when the stakes are high, it is sensible to ensure decisions are made based on proper data and analysis. That’s what we as statisticians bring to the table.

We also agree with ASA that value-added reporting can provoke intense emotions and that some of the press coverage reveals those authors’ own feelings – or their lack of expertise in interpreting technical issues. Value-added models can be complex, and they should be complex in order to address many of the concerns that educators have about using student testing data. My colleague Nadja Young discusses that in more detail here. However, these political opinions should not supplant what is known from two decades of high-quality research performed by statisticians and economists, such as:

  • Teaching matters. The differences in teaching effectiveness have a highly significant effect on the rate of student academic progress.[i]
  • Teaching matters a lot because ineffective teaching cannot be compensated for in future years. Teacher effects were found to be cumulative and additive with very little evidence of compensatory effects.[ii]
  • Students’ background does not matter in terms of their progress. In robust value-added models, students can make significant progress regardless of their race or ethnicity.[iii]
  • Good teaching is not just about an increase in test scores. Teachers’ value-added effectiveness is correlated with student success in other areas, such as college attendance, income, etc.[iv]
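
For readers who have never seen one, the toy example below shows the basic shape of a value-added estimate: predict each student’s current score from prior achievement, and attribute the otherwise-unexplained growth to the classroom. This is a deliberately simplified covariate-adjustment regression with hypothetical column names (score_prior, score_current, teacher_id), not the EVAAS methodology, which uses far more sophisticated longitudinal, multivariate models.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical student-level file: one row per student, with current and
# prior test scores and the current teacher. Column names are illustrative.
df = pd.read_csv("student_scores.csv")   # student_id, teacher_id, score_prior, score_current

# Covariate-adjustment regression: predict the current score from the prior
# score; the teacher indicator coefficients are crude "value-added" estimates
# of how far each teacher's students land above or below expectation.
model = smf.ols("score_current ~ score_prior + C(teacher_id)", data=df).fit()

teacher_effects = model.params.filter(like="C(teacher_id)")
print(teacher_effects.sort_values(ascending=False).head())
```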

There is a legitimate debate about the appropriate role of value-added analysis in educational policies, which is evident in the myriad ways that states and districts have used this reporting. It’s important to have these discussions with full understanding of research and quality implementations. ASA’s statement is a step in this direction, not a step away from the use of value-added modeling.


[i] Staiger, D., & Rockoff, J. (2010). Searching for effective teachers with imperfect information. Journal of Economic Perspectives, 24(3), 97-118.

[ii] Sanders, W. L., & Rivers, J. C. (1996). Cumulative and residual effects of teachers on future student academic achievement. Knoxville: University of Tennessee Value-Added Research and Assessment Center.

[iii] Lockwood, J. R., & McCaffrey, D. F. (2007). Controlling for individual heterogeneity in longitudinal models, with applications to student achievement. Electronic Journal of Statistics, 1, 244.

[iv] Chetty, R., Friedman, J. N., & Rockoff, J. E. (2011). The long-term impacts of teachers: Teacher value-added and student outcomes in adulthood (No. w17699). National Bureau of Economic Research.


Beyond health data: Alternative data sources could give unprecedented view of patient health, costs

The healthcare big data revolution has only just begun. Current efforts percolating around the country primarily focus on aggregating clinical electronic health records (EHRs) and administrative healthcare claims. These healthcare big data initiatives are gaining traction and could produce exciting enhancements to the effectiveness and efficiency of the US healthcare system. They can provide much-needed transparency about the cost and quality of care, enhance coordination of care efforts, support better outcomes and mitigate the overall cost of care. However, they are only the building blocks for a much larger and more progressive data universe.

Our data-rich world offers an exciting opportunity to leverage sources of information from all aspects of life: genomic and nanotech data, social media (Facebook, Twitter), smartphone apps, personal medical devices, and data on shopping, travel, weather, politics, employment and much more. Linking this data and converging it with current clinical EHR and administrative claims initiatives could yield a truly unique 360-degree view of patients, unlike anything previously seen.

Data from other high-impact areas, such as fully sequenced DNA and socio-economic factors, could make a real difference. In many cases this is the data, largely untapped in the healthcare world, that affects our own health but is rarely factored into the coordination of care. Understanding how our health is affected, either positively or negatively, by the world around us should be of importance to us all.

Undoubtedly, a great many challenges lie in the path of achieving this level of data integration. There are obvious political and personal privacy issues, and resolving them is essential to anything this comprehensive becoming a reality.

However, for the purpose of this blog, I will focus on the technical challenges associated with such an effort.   Successful insight into healthcare cost and quality depends on the correct use of technology to extract, load, cleanse, enrich, analyze and report on these data.

In future blogs I will explore a specific use case: me. I will use my own personal health data for research and analysis, drawing on a variety of sources including clinical EHRs, administrative claims, social media, travel, weather, mobile phone applications and more. The data will be integrated to create a 360-degree view of my personal health history over a specific time period. I will discuss the process of integrating, matching, cleansing and enriching the data, and then analyze the final integrated data set using data visualization, exploration and reporting technology.
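
As a preview of what that integration step might look like, here is a rough sketch that folds several per-source extracts into a single, date-ordered personal health timeline. The file names and columns are hypothetical placeholders of my own; the real sources and the matching and cleansing logic will be covered in the later posts.

```python
import pandas as pd

# Hypothetical per-source extracts; file names and columns are placeholders only.
sources = {
    "ehr":     ("ehr_encounters.csv", "encounter_date", "description"),
    "claims":  ("claims.csv",         "service_date",   "procedure"),
    "travel":  ("trips.csv",          "departure_date", "destination"),
    "weather": ("local_weather.csv",  "date",           "conditions"),
    "mobile":  ("app_activity.csv",   "date",           "summary"),
}

frames = []
for name, (path, date_col, detail_col) in sources.items():
    df = pd.read_csv(path, parse_dates=[date_col])
    frames.append(pd.DataFrame({
        "date":   df[date_col],
        "source": name,
        "detail": df[detail_col].astype(str),
    }))

# One timeline: every event from every source, normalized and sorted by date.
timeline = pd.concat(frames, ignore_index=True).sort_values("date")
print(timeline.head(20))
```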

The results will help identify trends in the coordination, quality and cost of care, along with the factors that influenced them. In doing so, we can determine whether there are adjustments we could make in our own lives that would enhance our health and give us more control over our healthcare.



South Carolina teacher evaluation system supporting professional growth

Today it is common knowledge that a classroom teacher is the single largest in-school influence on student academic growth[1]. So when South Carolina received ESEA flexibility in July 2012, the State Department of Education immediately began an initiative empowering teachers to increase their own effectiveness. Known as the Educator Evaluation System Pilot, it included value-added measures for 23 School Improvement Grant (SIG) schools in 2012. In 2013, the pilot expanded to include new observation rubrics and value-added measures for 46 schools across the state.

These teacher evaluation tools provided educators with more meaningful feedback, not only to measure their performance but also to improve upon it. Oftentimes value-added data is presented as a single number. How can that help teachers grow professionally and improve their practice? South Carolina went beyond a sole value-added rating to give teachers a comprehensive reporting system that includes reflective, diagnostic and predictive tools for teacher and school improvement. This type of data “toolkit” is critical to supporting teachers’ professional growth through differentiated professional development, meaningful evaluation conferences and goal setting.

Mona Lise Dickson is the Principal at Lady’s Island Middle School in the Beaufort County School District. She said that once her teachers were given time to understand the new evaluation system, they invested themselves in it and began to improve.

“EVAAS indicated to us that teachers were having a hard time with their lowest achievers, and that some gifted kids were losing ground once they arrived in middle school, so we focused professional development on the skills needed to reach those kids. Additionally, if we have two teachers, one that is value-added Level 5 and the other a Level 2, we can pair them. The collaboration improves the Level 2 teacher. EVAAS helps teachers look at data as a lens for student growth. Teachers can look at students compared against themselves, so they can really see how much impact they’ve had.”

This concentration on student growth resulted in increased student achievement on the state PASS test. The percentage of students at the “Met” or “Exemplary” performance levels increased by 10.6% in 6th grade English Language Arts and by 15.7% in 6th grade Science. According to Principal Dickson, this improvement resulted from school leaders and teachers taking ownership over their impact on student growth and sharing it in a culture of trust.

“We’ve had Master Teachers come in to explain to their colleagues how to use EVAAS. Many teachers embrace the opportunity to grow themselves and their students. They are open to changing the way they teach. They see the growth and internalize it. One teacher said they were teaching in the dark until EVAAS helped show them their weak areas.”                                         

Identification of those weak areas informs professional development. When principals spend scarce resources on professional development, they want to make sure they are giving each individual teacher what he or she needs most. EVAAS diagnostic reporting helped Principal Brenda Romines do just that. At Bell Street Middle School in Laurens 56 School District, Principal Romines tailors professional development and backs up the observation ratings she gives teachers with student growth evidence.

“When I have conferences with teachers, I share not just how they’ve done the past year, but over a progression of years. I can show them where they are effective with certain populations of students. For instance, if a teacher shows growth with high-achieving kids, but less growth with low-achieving ones, we can look into what factors may contribute to that. I can also use that information to place teachers where they can have the greatest impact on student learning. This isn’t about dismissing teachers, it’s about how we can improve instructional practice. What does the analysis tell teachers about their classroom? What can they do to give their students a year’s worth of growth? EVAAS helps them see areas of strength and weaknesses with students and make a professional development plan that promotes growth of students and themselves.”

Data literacy has become a systemic part of not only evaluation, but also school improvement, at Bell Street Middle School. Educators pore over value-added and other data sources in quarterly meetings. Principal Romines noted how educators even share these data in conferences with students so that each child can understand how they are growing.

“I ask my teachers to invest their heart and soul into the use of data. We have individual conferences to determine what a teacher can do differently, and we’re constantly going back to the data to refine what we look at as a school and teacher. The data help me have difficult conversations. Sometimes we have weaker teachers and we try to pair them with a stronger one. We don’t want a child to sit in two weak teachers’ classrooms. These are tough conversations, but the kids are worth it.”

With exceptional growth as a leading indicator, improved student achievement came next at Bell Street Middle School. In just one year, from 2012-13, the percentage of students achieving “Met” or “Exemplary” performance levels on the state PASS tests (across all grades and subjects) increased by over 7%.

These are two exemplary schools, but one may wonder how the larger group of pilot districts performed as compared to the State. Student growth is reported by five effectiveness categories. In English Language Arts, Math and Science, there were 13.4% fewer pilot schools in the lowest effectiveness category in 2013 compared to 2012. It is reassuring that the intensive supports provided to these pilot schools did appear to improve student outcomes in the first year and that participating educators recognize value-added reporting as a useful part of the evaluation process.

Part 2 of this blog series will take a deeper look at how teachers can improve their value-added performance, or impact on student growth, by using these data to place greater focus on individual student needs.



[1] Marzano, R. J. (2010). Developing expert teachers. In R. J. Marzano (Ed.), On excellence in teaching (pp. 213-245). Bloomington, IN: Solution Tree.


Post a Comment

Projecting prison resources in North Carolina with analytics

In 1990, the North Carolina General Assembly created the Sentencing and Policy Advisory Commission to evaluate sentencing laws and policies and recommend any modifications necessary to achieve policy goals. As part of the mandate, the General Assembly required the Sentencing Commission to develop a correctional population simulation model. The model would be used to analyze any proposed change in the sentencing laws to estimate the impact on the inmate population.

Previously, the state enacted changes to sentencing laws without hard data about what correctional resources would be necessary to support those changes. Without enough resources to support the policies, prisons became overcrowded, offenders were released after serving only a fraction of their sentences and the public lost confidence in the criminal justice system. Lesson learned.

At the General Assembly’s behest, the Sentencing Commission began employing a model that used empirical information (conviction and sentence data) from the previous year to project future resource needs. The model could be used to project the overall prison population and the impact of proposed policy changes on that population. These projections became the guideposts for correctional funding and resources.

When the General Assembly or the Secretary of the Department of Public Safety considers changes to the sentencing laws and policies, the Sentencing Commission is responsible for projecting the impact of the proposals. The Commission is required to apply the model to every bill and resolution introduced that proposes any change in criminal law. For instance, changes that increase penalties for violating existing laws or criminalize additional behaviors can lead to a net increase in the number of persons incarcerated and the duration of their incarcerations. Every legislative session, the Commission produces numerous impact projections. The General Assembly uses these projections to determine whether to pass a bill and appropriate more funds for prison resources, or to make adjustments to the bill.

The Sentencing Commission also uses the correctional population simulation model to provide prison population projections. These projections, updated annually, give the General Assembly and the Department of Public Safety an estimate of what the prison population will be for the next ten years assuming there are no policy changes. This allows the State to plan for future criminal justice needs.
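
To illustrate the general idea of a correctional population simulation (and only the general idea; the Commission’s actual model, built with SAS Simulation Studio, is far more detailed), here is a toy stock-and-flow sketch. The admission counts, sentence-length distribution and starting population are made-up placeholders, not North Carolina figures.

```python
import numpy as np

rng = np.random.default_rng(42)

# Placeholder inputs -- the real model uses prior-year conviction and
# sentence data broken out by offense class, prior record level, etc.
ANNUAL_ADMISSIONS = 20_000
STARTING_POPULATION = 37_000

def draw_terms(n):
    """Draw sentence lengths in years from a stand-in distribution."""
    return rng.gamma(shape=2.0, scale=1.5, size=n)

# Remaining time to serve for people already incarcerated at the start.
remaining = list(draw_terms(STARTING_POPULATION))

for year in range(1, 11):
    # Everyone currently incarcerated serves another year; those whose
    # remaining time runs out during the year are released.
    remaining = [t - 1.0 for t in remaining if t > 1.0]

    # New admissions arrive spread evenly through the year with fresh terms;
    # only the portion of each term left at year end adds to the stock.
    terms = draw_terms(ANNUAL_ADMISSIONS)
    served_by_year_end = rng.uniform(0.0, 1.0, size=ANNUAL_ADMISSIONS)
    remaining.extend(t - s for t, s in zip(terms, served_by_year_end) if t > s)

    print(f"Year {year}: projected year-end prison population ~ {len(remaining):,}")
```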

In 2011, the General Assembly passed the Justice Reinvestment Act, which made a number of substantive changes to the sentencing laws in North Carolina. It made the process more complex by adding some new sentencing options and changing the way the courts used some existing options. The Sentencing Commission needed a model that was more transparent, and would allow staff to make adjustments to take into account the variety and complexity of the changes.

The Commission, already using SAS for managing and analyzing data, turned to SAS for help with designing a new correctional population simulation model. The SAS Advanced Analytics Lab and SAS Operations Research Center of Excellence adapted the software for the new model. The programming was visible, allowing Commission staff to make adjustments as needed, aided by SAS’ ongoing support.

To date, the Commission has used the SAS model to produce two sets of prison population projections, published in February 2013 and February 2014. Initial results indicate that the projections are within the Commission’s historical accuracy range of 2%. Not only are the projections reliable, but SAS has also automated much of the process. In the future, the Commission hopes to adapt the model to project resource needs for other justice populations, particularly juvenile justice.

The work of the Sentencing Commission reinforces the importance of making decisions based on data. Reliable projections help policymakers understand the resource demands associated with their proposals. Having adequate resources to support criminal justice policies and to maintain public safety is key to bolstering the public’s confidence in the criminal justice system.

To learn more, please check out the Sentencing Commission’s 2013 SAS Global Forum presentation, Projecting Prison Populations with SAS Simulation Studio.



Teacher effectiveness culture shifts in Lubbock ISD schools – Part 3: The Superintendent

This is part 3 of a blog series on how Lubbock Independent School District (Lubbock ISD) uses SAS® EVAAS to improve teaching and learning by promoting self-reflection and aiding instructional and administrative decision-making.

This is done in a district that, in the past decade, has experienced dramatic increases in the percentage of Hispanic and African-American students, and the percentage of students living in poverty. These shifts precipitated a focus on meeting the unique needs of all students, whether low, middle, or high achieving. In 2009, Lubbock ISD joined a group of 10 Texas districts, which has grown to 23, using EVAAS for value-added reporting.

We've heard from the teachers and principals. Our final perspective is from a Lubbock ISD superintendent who used EVAAS in her doctoral research.

For a district administrator, analytics enable independent teaching effectiveness research

A 25-year veteran educator in LISD, Dr. Kathy Rollo is currently the Associate Superintendent for Lubbock’s elementary schools. Her doctoral research, using EVAAS, expands the knowledge base on how and why teachers improve their practice, as measured by their value-added teacher effectiveness measures, and on what role the campus principal played in that documented growth. Studying eight teachers with consecutive growth in their value-added estimates from 2010-2012, Dr. Rollo examined the following research questions:

  1. What changes did teachers who improved in their value-added effectiveness scores make in their instructional practice?
  2. What professional learning facilitated the changes in instructional practice made by the teachers?
  3. What role did principals have in the professional growth of the teachers who improved in their value-added effectiveness?
  4. In what ways did teachers who improved in their value-added effectiveness use the data to inform instruction?

Dr. Rollo’s findings are extensive, and her dissertation warrants a thorough read. There isn’t space to dig into all of the details here, but some concluding findings are below:

“The teachers attributed learning to the use of humor in the classroom and building their own capacity to use multiple resources, including their own creativity, with their professional growth.  With regard to professional development, teachers appreciated training that was both practical and included active learning.  Ongoing collaboration with colleagues, both formally in professional learning communities and informally through casual dialog, was also considered important to their progress. The teachers valued trust and recognition from their principals and the fact that principals were visible in their classrooms on a regular basis.”

It is noteworthy that the teacher with the highest amount of consecutive improvement in EVAAS measures “understood the data and used it to guide instructional practice.  She was also diligent about reflecting upon every lesson and determining what had worked and what had not worked in order to improve student learning in the classroom.” Dr. Rollo deemed it “most important that districts and schools ensure that principals and teachers know how to use value-added data to drive instructional decisions in order to obtain maximum growth in effectiveness.”

An example to follow

As Texas begins to incorporate student growth data into educator evaluation systems, Lubbock ISD educators provide a wealth of experience to share in how to accomplish this positive culture shift. These data can become a part of instructional and administrative decision-making as well as facilitate research for continued school improvement. We know that teaching is more than a job. It’s a life devotion that deserves ever-improving sources of support and insights to meet the needs of today’s dynamic learners.


Teacher effectiveness culture shifts in Lubbock ISD schools – Part 2: The Principals

This is part 2 of a blog series on how Lubbock Independent School District (Lubbock ISD) uses SAS® EVAAS to improve teaching and learning by promoting self-reflection and aiding instructional and administrative decision-making.

This is done in a district that, in the past decade, has experienced dramatic increases in the percentage of Hispanic and African-American students, and the percentage of students living in poverty. These shifts precipitated a focus on meeting the unique needs of all students, whether low, middle, or high achieving. In 2009, Lubbock ISD joined a group of 10 Texas districts, which has grown to 23, using EVAAS for value-added reporting.

On Friday, we heard from the teachers. Today, it’s the principals’ turn.

For principals, value-added data help to ‘work smarter not harder’

LISD has many school improvement initiatives, within which EVAAS is just one piece of a very large puzzle. One offering receiving a lot of attention is the pre-Advanced Placement equivalent courses in middle and high schools. With support from a grant, LISD trained teachers to more effectively teach students in advanced courses and significantly increased student enrollment in these courses. However, principals still needed to decide which teachers should teach these advanced courses and which students were ready to take them. Enter Heidi Dye, Principal at Hutchinson Middle School and a 15-year classroom veteran. Principal Dye uses teacher value-added and diagnostic reporting to identify teachers’ strengths and manage to them. She can determine not only which teachers are most highly effective in the tested pre-AP subjects, but also which teachers drive more-than-expected growth with high-achieving students.

“I’ve made changes in teaching assignments based, not solely, but in part on our value-added results. It really does tell me which teachers are doing what we are asking them to do in the classroom…Some teachers are better at one thing than another and when we sit down and have personal conferences, we talk about those specifics. When you’ve got the numbers in front of you it’s a whole lot easier to have those crucial conversations...It gives me the backing that I need to make those decisions…As a faculty we have common goals in mind for our students and it isn’t about us, it’s about the students. So when we have a tool like [EVAAS] to use for the betterment of our students, then we’re going to take that and run with it. I think you have to work smarter, rather than harder a lot of the time.”

It appears that this strategy of data-driven teacher and student course placement has paid off at Hutchinson Middle School. The state of Texas evaluates schools in cohorts of like achievement and demographic makeup, and in a cohort of 40 schools that look like Hutchinson, it was ranked number one. But for administrators outside of LISD who may be unfamiliar with value-added data, a culture shift may be required to trust the data for administrative decisions. Principal Dye’s advice for those new to EVAAS in Texas is to:

“Use it. Don’t just disregard it. Take it apart and figure out what it means. Once you do that, it’s not nebulous. That’s for sure. I trust it as a data source because it’s never come back and really surprised me greatly. Every time I’ve looked at the data, I’ve thought to myself ‘Yeah, this is about right’.”

The educators at Hutchinson Middle School illustrate how a high-achieving school can ensure high student growth. Some may wonder whether low-achieving schools can also show high growth. Amy Stephens is the current Principal at Wright Elementary School, but spent five years as Principal of the recently consolidated Bozeman Elementary School. Bozeman was a low-achieving school with high student mobility, 98% free and reduced-price lunch, and about a 50/50 makeup of African-American and Hispanic students. Bozeman used to focus most intently on classroom management due to high teacher turnover, said Principal Stephens.

“If you had good classroom management, you were viewed to have good instruction. Looking at student engagement and student learning was really not what was happening. The information in EVAAS was vital to us turning Bozeman around. We had a lot of ground to make up with our kiddos, and a lot of times our state test scores did not reflect the huge gains that we knew we had made. EVAAS data helped us look at teachers and have some small celebrations and, in some cases, some big celebrations. That really helped boost my teacher morale and keep them engaged. I believe, firmly, that it helped us not lose as many teachers. It helped us keep it positive and keep them going in the right direction. I speak for my specific campuses in saying that our teacher evaluation process up until this point in time has been a ‘check in a box.’ Everybody thinks they need to be an ‘Exceeds’ in that box whether that ‘Exceeds’ translates to student learning or not. I think having this is very eye-opening and will cause more reflection.  I am excited about having this piece in the future evaluation process - not any kind of a ‘gotcha’ but to start changing that culture to a growth model and what are we doing to ensure all students are learning.”

Please come back tomorrow to get the superintendent’s views. Thanks for reading.


Teacher effectiveness culture shifts in Lubbock ISD schools – Part 1: The Teachers

Improving teacher effectiveness is no simple task. Whether a part of a formal evaluation system or for formative feedback, looking at student growth data can be a valuable part of the development process for teachers and administrators. Lubbock Independent School District (Lubbock ISD) uses SAS® EVAAS to improve teaching and learning by promoting self-reflection and aiding instructional and administrative decision-making.

This is the first of three blogs inspired by feedback from teachers and school and district administrators, highlighting how EVAAS is helping Lubbock ISD educators do a better job at the all-important work of teaching Texas’ children.

Keeping a close eye on student growth data is particularly important if a school district is adapting to rapidly changing student populations. Lubbock ISD has experienced large demographic shifts over the past decade. The population of Hispanic and African-American students has increased to a majority, and the Anglo population has decreased to 30.6%. There has been an 11% increase in students living in poverty, bringing the total to 65.3%.

These shifts precipitated a focus on meeting the unique needs of all students, whether low, middle, or high achieving. In 2009, Lubbock ISD joined a group of 10 Texas districts, which has grown to 23, using EVAAS for value-added reporting.

This first post will focus on the work of two remarkable Texas teachers.

For teachers, small group focus yields big growth for struggling students

Robin Fulbright spent 22 years teaching 5th grade math at Murfee Elementary School before joining the professional development department at Lubbock ISD’s central office. Ms. Fulbright’s former principal called her “the most amazing teacher that you’ll ever talk to. She is some kind of awesome and she has incredible, incredible value-added scores.” Ms. Fulbright attributes these results to strategic small group creation and instruction.

“My teammates and I were fairly meticulous in gathering data from EVAAS, looking at STAAR (State of Texas Assessments of Academic Readiness) test scores from the previous year, and having conversations with each child’s parents and previous teachers…We would wrap all that together early in September and would use that to make some small group plans. We shared our children equally to provide extra support in reading, math, and science, our three priority subjects. We divided up our small group instructional time. We would have different shared instruction happening in all three classrooms. We would then use our formative assessment data and our six week tests to move children who needed more support into those small groups. Children who were showing mastery and growth would move out. It was very flexible all year long.”

The Texas Education Agency is now piloting a version of EVAAS called TxVAAS as a part of the Texas Teacher Evaluation and Support System (T-TESS). Since Ms. Fulbright has voluntarily used EVAAS to inform her practice for four years, she has encouragement to share with those who may be unfamiliar with value-added information.

“I (historically) had the highest STAAR scores in the district for my grade and subject, while teaching advanced Level 3 students. But last year, my value-added scores were not as high as previous years; not as strong as I would have liked them to be. My students’ STAAR test scores remained awesome though. I would not have known my children didn’t make the projected growth as high as I would have expected or liked without the value-added reports.  (EVAAS) caused me to pause and reflect on what I could have done differently or would do in the future. I trust the process. Did I really pull out my small groups like I should have? Could I have used that time more effectively? Did I neglect my higher achieving group and worry more about my lower-end group?  That reflection is the key to growing and improving.”

Michelle Watts is a 13-year English Language Arts teacher at Hutchinson Middle School, a high-performing, culturally diverse arts magnet school. Four years ago, Ms. Watts “was completely blown away” with her EVAAS data. She thought she had been more effective in growing her struggling learners. In reality, she demonstrated greater effectiveness with middle and high achieving students. This information influenced instructional changes the next school year. She added after-school tutoring to serve those kids who needed more practice but also made changes to her in-class delivery.

“I use different wording. Sometimes I simplify what I’m saying or speak more slowly. These are small technical adjustments that can make a difference in the way that I approach those students. Because it’s not that they can’t learn it, it may be that I’m just going too fast for them or using words they’re unfamiliar with.”

Ms. Watts hones her craft in planning for incoming groups of students who have similar academic profiles to the students reflected in her most recent EVAAS reports. She repeats the practices that were successful with certain groups of students and can improve on others that may have missed the mark in growing particular subgroups.

“What EVAAS did for us was provide that starting point, a jumping off place to begin my work with students…[It also] validated what we were doing so we could move forward and get better at it. I know I’m not the best teacher on the planet, but I know I’m good at what I do, I love what I do, and I love the kids. But when we got this tool I could say, ‘I worked my tail off and this proves it.’ That makes me feel like I’m making a difference.”

Come back Monday to read how two Lubbock principals are using EVAAS to put teachers where they and their students can have the most success.


Top rated value-added school: Extreme test prep or well-rounded experience? A student’s perspective

A recent Charlotte Observer article provided a thoughtful investigation of growth and achievement in North Carolina’s Charlotte-Mecklenburg Schools (CMS). The article juxtaposed two very different, yet highly effective, schools. The first, Ranson Middle School, is a low-achieving school with 84% poverty that demonstrated the highest academic growth of any similar school.

The second school, and the focus of this blog, is Ardrey Kell High School, a high-achieving school with low poverty and the highest student growth in CMS. (Note: Public reporting on this academic growth in NC can be found at http://ncdpi.sas.com.)

I have offered my thoughts in a past blog about teachers narrowing their curriculum, teaching to the test, or using “kill and drill” test-prep techniques. This time, however, I wanted a student’s perspective. So I interviewed a fabulous former student of mine, Kelsey Williamson, who recently graduated from Ardrey Kell HS and is a freshman at North Carolina State University. I wanted to explore, through a student’s eyes, a high-performing school in a state where value-added measures (VAMs) are used as a part of school accountability and teacher evaluation. Did she feel additional pressures as an unintended consequence at AKHS?

Nadja: What is so special about AKHS that contributed to their impressive student growth and achievement?

Kelsey: The teachers and the staff. The rigor of classes is no joke. I didn’t know this until I came to college and started talking to my friends about what their AP classes were like. Many of my college friends said their teachers told them “not to bother” taking AP exams since “no one will do well.” At Ardrey Kell, however, many of us left the AP tests feeling confident and our scores showed it!

Nadja: How did it feel to be a student in such a rigorous, high-performing school?

Kelsey: I loved Ardrey Kell and I was aware that AK was preparing (and even over-preparing) me for college. Ardrey Kell gave me more opportunities to push myself and thrive.

Nadja: Did you experience a lot of “kill and drill” test prep?

Kelsey: I did not feel like AK was intensely focused on test scores. For example, every day in my Biology class we did some sort of lab, visual, illustration, in-class project, you name it! Some examples included making little booklets and models or acting out processes in skits. What I didn’t realize was how much I was learning without having to go home and memorize pages of notes. I was learning without hours upon hours of studying. Not only that, what I learned has stuck with me. As a Biology major, not a day goes by when I don’t need to reference something I learned in my AP Biology class. I was able to get credit for freshman Biology and have been adequately prepared for both next-level courses: Anatomy and Genetics.

Nadja: What did teachers do differently to push all students (at all achievement levels- low, middle, high) to grow so much?

Kelsey: I do believe that the key difference is empowering students by setting reasonable expectations. I hear so many horror stories about teachers from other schools who don’t teach and then give insanely hard tests that bring students great frustration. Those students feel like success is out of their control. When teachers teach well and give fair (not easy, but fair) tests and assignments, students are more empowered to put forth effort. They know that the effort will be shown in their grade, and that is a wonderful feeling. Every teacher offers some type of after or before school help for students who need it. Never once have I felt like I had no way to find the answers or help with confusion.

Nadja: Aside from teachers and instruction, what did you appreciate about Ardrey Kell programs and leadership?

Kelsey: The A day/B day schedule is genius. Being both similar to college courses and allowing more time for learning and studying, I felt like all curriculum was better understood because of this schedule format. This allowed for year-long AP classes where I successfully earned 37 college credits. I cannot praise Ardrey Kell without mentioning Principal Switzer. Teachers respect him immensely and students love him too. This man genuinely cares about students and it shows in how well he knows most everyone. If Principal Switzer was coming my way in the halls, I knew I was going to receive a “Hi Kelsey! How are you?” A student suggestion box in the library allows students to have a voice in school improvement and monthly “Coffee with Switzer” events let select students sit down with him and talk further. He attends many sporting events, and works closely with all student-led organizations giving them lots of approvals to do different activities, events, etc. What an honor it is to be able to write such praise about my high school, and truly mean it!

I was pleased to hear this kind of feedback from Kelsey: not only can a school be high achieving, but it can also push all students to grow (a common misconception I’ve written about here). Furthermore, to make that growth happen, a school can still be well-rounded and foster a love of learning, sports, arts and clubs. As the Ranson MS administrator stated in the highlighted article, “Tests are never the primary goal. That comes when graduates succeed in college or the workforce.” I’m proud to see Kelsey doing just that!

