ESSA – accountability, indicators and analytics to drive informed decision making


Last December, the Every Student Succeeds Act (ESSA) was signed into law to ensure opportunity for all students in the United States. As part of this federal legislation, states now have the flexibility to design their own accountability systems within parameters outlined in ESSA. These accountability systems include academic and non-academic indicators. By using analytics and dashboards to monitor these indicators, states can discern how well students, schools and districts are performing, identify where changes are needed, and course-correct ineffective programs.

To understand more about the new accountability systems and what SAS is doing to help education agencies and states with the analysis of their selected indicators, I interviewed Emily Baranello, Vice President of the SAS Education Practice; Susan Gates, SAS Special Advisor on Education; and Nadja Young, Senior Education Manager, SAS State and Local Government Practice.

As part of ESSA, states must include more indicators in their accountability systems than were required under No Child Left Behind or U.S. Department of Education regulations. Can you explain?

Susan Gates: Each state must include at least three “academic” indicators, such as test scores, English language proficiency, graduation rates, and student growth, and at least one “non-academic” indicator, such as access to AP courses, teacher qualifications, and school climate. These new systems can be aligned directly with each state’s ultimate education goals for its students, such as ensuring college- and career-readiness, allowing interventions as early as possible to keep students progressing.

Nadja, Susan mentioned student growth. Can you explain why it is often discussed as an effective academic indicator?

Nadja Young: A focus on growth acknowledges that, academically, students don’t all start the year at the same place. While students’ test scores (a common academic indicator) measure what students know at a given point in time, student growth measures how much they have learned over a period of time, such as a year of schooling. Not all students can become high achieving – or even proficient – by the end of the year.

But regardless of their entering achievement level, all students can grow. At the very least, they can maintain their achievement level relative to their peers and not fall further behind. That is what we must expect of our schools. By including a sound value-added or growth measure in accountability systems, policymakers put the emphasis on something educators can influence. While teachers and schools cannot control the entering achievement of their students, they can control how much learning, or academic growth, takes place within a school year. This makes growth a fairer measure of school quality for accountability purposes.

What other indicators are states considering for their new ESSA accountability systems?

Emily Baranello: ESSA gives states the flexibility to broaden the definition of school performance into areas outside of the classroom. Examples include new non-academic indicators such as school climate, school safety, adverse punishment, or grit. The Connecticut Department of Education, with whom we partnered, wanted to provide a more comprehensive, holistic picture of student and school performance over time. We worked with them on their new “Next Generation” accountability system, which includes 12 indicators. In addition to the more traditional academic indicators, their new system also incorporates academic growth over time, as well as additional key indicators such as chronic absenteeism, physical fitness and arts access. These indicators can be disaggregated by subgroups of students, providing a broader and clearer understanding of how those students are progressing so that they can be better supported on the path to success.
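To make the idea of disaggregating an indicator by subgroup concrete, here is a minimal sketch in Python. The record layout, subgroup labels, and indicator field are all hypothetical illustrations, not the structure of any actual state data system:

```python
# Hypothetical sketch: computing an indicator rate (here, chronic absenteeism)
# broken out by student subgroup. All field names and data are illustrative.
from collections import defaultdict

def rate_by_subgroup(records):
    """Return {subgroup: share of students flagged chronically absent}."""
    counts = defaultdict(lambda: [0, 0])  # subgroup -> [flagged, total]
    for r in records:
        tally = counts[r["subgroup"]]
        tally[1] += 1
        if r["chronically_absent"]:
            tally[0] += 1
    return {group: flagged / total for group, (flagged, total) in counts.items()}

# Illustrative records for two subgroups
records = [
    {"subgroup": "homeless", "chronically_absent": True},
    {"subgroup": "homeless", "chronically_absent": True},
    {"subgroup": "homeless", "chronically_absent": False},
    {"subgroup": "all_students", "chronically_absent": True},
    {"subgroup": "all_students", "chronically_absent": False},
    {"subgroup": "all_students", "chronically_absent": False},
    {"subgroup": "all_students", "chronically_absent": False},
]

print(rate_by_subgroup(records))
```

The same pattern applies to any indicator that can be expressed per student, which is what lets a dashboard surface gaps between subgroups rather than only school-wide averages.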

Susan Gates: Emily mentioned chronic absenteeism, which has become increasingly important. A new analysis by the U.S. Department of Education’s Office for Civil Rights found that more than 6.5 million students – about 13 percent of those in grades K-12 – missed 15 or more days of school in the 2013-14 school year. Chronic absenteeism disproportionately affects specific subgroups, particularly homeless and foster youth, as well as minority students and students with disabilities. With ESSA’s new emphasis on understanding how students in these numerous subgroups are progressing, absenteeism is a key indicator among others to follow.

Emily Baranello: And to add, chronic absenteeism – typically defined as missing 10 percent or more of school days – is definitely something that states are looking to include. This data can be easily integrated and reported on almost immediately, allowing for interventions with at-risk students.
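Applying the 10-percent threshold is straightforward once attendance data is integrated. The sketch below is a hypothetical illustration of that rule, not SAS's or any state's actual implementation; the student records are made up:

```python
# Hypothetical sketch: flagging students as chronically absent using the
# common "missed 10% or more of school days" definition mentioned above.
CHRONIC_THRESHOLD = 0.10  # 10% of enrolled school days

def is_chronically_absent(days_absent: int, days_enrolled: int) -> bool:
    """True if the student missed at least 10% of the days they were enrolled."""
    if days_enrolled <= 0:
        return False  # no enrollment data; cannot evaluate
    return days_absent / days_enrolled >= CHRONIC_THRESHOLD

# Example: a 180-day school year
students = [
    {"id": "S1", "days_absent": 20, "days_enrolled": 180},  # ~11% missed
    {"id": "S2", "days_absent": 9,  "days_enrolled": 180},  # 5% missed
]

flagged = [s["id"] for s in students
           if is_chronically_absent(s["days_absent"], s["days_enrolled"])]
print(flagged)  # -> ['S1']
```

Note that the rate is computed against days enrolled rather than a fixed calendar, which matters for the high-mobility students discussed later in the interview.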

Nadja, Susan mentioned indicators for specific subgroups. Can you tell us more about that?

Nadja Young: With ESSA, states are still required to report on their school quality indicators at the student subgroup level, including three new subgroups: homeless, foster, and military-connected. Reporting on these historically vulnerable subgroups will give states and districts a clearer picture of how – or whether – the needs of these students are being met, and how to mitigate the unique challenges of measuring their academic growth. Homeless, foster, and military-connected student subgroups include a higher proportion of high-mobility students, missing test scores, and smaller student sample sizes than many other subgroups – all of which can hinder the ability to measure their academic growth if using simplistic models. Ways to address these challenges can be found in this Education Week article.

Emily, with all of the data on indicators, how can states use school indicator data for decision making?

Emily Baranello: The biggest challenge for states is determining what data is available today and, ultimately, what data they would like to capture in future years. Data that is readily available can be analyzed alongside all the academic and non-academic indicators a state wants to include in its system. If data is not being collected today, a plan for collecting and sharing it will need to be put in place.

Once states have identified what they want to include – currently and in the future – how can the indicator data be analyzed in a way that is instructive to educators, parents and students?

Emily Baranello: Once states have incorporated all relevant data – academic and non-academic indicators, including disaggregated data on student subgroups – they not only can track that information over time but also analyze that information to understand patterns and trends. By using interactive dashboards, users can easily monitor these indicators to better understand student and school progress. These dashboards empower users to visually interact with data and discover insights. As such, they’re able to make more accurate, data-informed decisions and course correct as needed.

Going back to our work with Connecticut, feel free to check out their data portal, EdSight. This new portal successfully integrates information from more than 30 different sources. Information is available on key performance measures that make up the Next Generation Accountability System, as well as dozens of other topics, including school finance, special education, staffing levels and school enrollment.

Susan, what is the timing for all of this work?

Susan Gates: At this point, the timeline laid out in ESSA remains in place. These accountability systems need to be up and running for the 2017-18 school year. As states grapple with this work, it will be interesting to see if additional time will be granted. Stay tuned!

I enjoyed learning more about the flexibility states have in selecting indicators for their accountability systems, and I can see how the reporting and analyses of these indicators will help improve academic outcomes for all students.

To learn more, check out this report about how SAS can help states with ESSA accountability. Also, learn more about what SAS does for P-12 education.


About Author

Georgia Mariani

Principal Product Marketing Manager

Georgia Mariani has spent nearly a quarter-century exploring and sharing how analytics can improve outcomes. As a Principal Industry Marketing Manager at analytics leader SAS, supporting the education industry, she passionately showcases customers using analytics to tackle important education issues and help students succeed. Georgia received her M.S. in Mathematics with a concentration in Statistics from the University of New Orleans.
