Improving teacher effectiveness is no simple task. Whether as part of a formal evaluation system or for formative feedback, looking at student growth data can be a valuable part of the development process for teachers and administrators. Lubbock Independent School District (Lubbock ISD) uses SAS® EVAAS to improve teaching and
A recent Charlotte Observer article provided a thoughtful investigation of growth and achievement in North Carolina’s Charlotte Mecklenburg Schools. The article juxtaposed two very different, yet highly effective, schools. The first, Ranson Middle School, is a low-achieving school with 84% poverty that demonstrated the highest academic growth of any similar
STEM skills are essential for many of the fastest-growing and most lucrative occupations, and SAS programmers are in high demand in all fields. A number of reports have documented a critical talent shortage, especially for graduates with advanced degrees in math, computer science or computer engineering. (See Running on Empty, Report to
As I embark on 2014, I reflect upon the many competing, yet interdependent, tensions discussed in education circles in 2013. In conferences, classrooms and statehouses, adults who care about kids debated the best ways to implement: New academic standards (Common Core State Standards or other College and Career Ready Standards)
As student growth or value-added measures become more prevalent in educator evaluation systems, many question how those ratings actually help teachers improve their practice. For example, “How does a level 3 teacher become a level 4 or 5?” Robust and reliable value-added data serve as a great starting point for teachers
Students with missing test scores are often highly mobile students and are more likely to be low-achieving students. It is important to include these students in any growth/value-added model to avoid selection bias, which could provide misleading growth estimates to districts, schools and teachers that serve higher populations of these
Welcome to Part 3 of the value-added Myth Busters blog series. I have heard a variation of this many times. “Why shouldn’t educators just use a simple gains approach or a pre- and post-test? They can trust simpler methodologies because they can replicate and understand them more easily.” Simple growth measures
Welcome to Part 2 of the value-added Myth Busters blog series…have you heard this one before? Educators serving high-achieving students are often concerned that their students’ entering achievement level makes it more difficult for them to show growth. “How can my students show growth if they are already earning high
In the past five years, value-added models have been increasingly adopted by states to support various teaching effectiveness policies. As educators make the paradigm shift from looking at only achievement data to incorporating growth data, many misconceptions have developed. Compounding this issue is the fact that not all value-added and
As the holidays approach, we’ll all have some down-time to catch up on personal and professional reading, hopefully cozied up by a fire with a cup of hot chocolate in hand. While most books regarding data-driven decision making and value-added analyses can be pretty heavy, I’d like to suggest two