In reading and listening to our recent series of stories with The Cleveland Plain Dealer about how Ohio is using a statistical measure called value-added to measure whether teachers provide a year’s worth of learning to their students, many of you have asked, “Just how is value-added calculated?”
The Plain Dealer’s Patrick O’Donnell explains.
The goal is to find whether a student made a year’s worth of academic growth in a school year, but there are many different methods you could use. Ohio’s system is intentionally complicated.
The simplest way would be to take a student’s test scores on the Ohio Achievement Assessments one year and see if scores were higher or lower the next. But that’s too simple, say experts on the subject.
This series about value-added, a new measure Ohio is using to determine whether teachers provide a year’s worth of learning to their students, is the result of a partnership between The Cleveland Plain Dealer and StateImpact Ohio. StateImpact reporters Molly Bloom and Ida Lieszkovszky worked with Plain Dealer reporter Patrick O’Donnell and Plain Dealer data analysis editor Rich Exner to produce these stories.
- Overview: Using Data To Evaluate Teachers
- Pay vs. Value-Added Performance
- Secrets Of Two “Most Effective” Teachers
- Value-Added’s Poverty Factor
- How is Value-Added Calculated?
- Audio: Measuring Performance Through Growth
- Audio: Push for Performance Pay
- Video: Guide to Ohio’s New Way of Evaluating Teachers
SAS Inc., the company Ohio hires to calculate value-added, uses several regression models that combine multiple variables in its Education Value-Added Assessment System (EVAAS). SAS officials say a complicated formula is needed to account for enough variables to be fair.
The models essentially take all students’ scores and place them on a curve, or, equivalently, rank them in percentiles. A student’s test scores the following year are ranked and placed on a curve the same way.
In its simplest terms, a student meets value-added targets by appearing at about the same place on the curve, or scoring at about the same percentile, from year to year. Students who land lower on the curve, at a lower percentile, do not meet value-added; students who land higher score above it.
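The year-to-year percentile comparison described above can be sketched in a few lines of code. This is a deliberately simplified illustration of the general idea, not SAS’s actual EVAAS model, which uses more complex regression methods and has not been released in full; the function names and the two-point tolerance band are assumptions made for the example.

```python
def percentile_rank(score, cohort_scores):
    """Share of the statewide cohort scoring below this student,
    expressed as a percentage (a simplified percentile rank)."""
    below = sum(1 for s in cohort_scores if s < score)
    return 100.0 * below / len(cohort_scores)

def growth_category(score_y1, cohort_y1, score_y2, cohort_y2, tolerance=2.0):
    """Compare a student's percentile in two consecutive years.
    Holding roughly the same percentile counts as a year's growth;
    the tolerance band here is an illustrative assumption."""
    p1 = percentile_rank(score_y1, cohort_y1)
    p2 = percentile_rank(score_y2, cohort_y2)
    if p2 > p1 + tolerance:
        return "above expected growth"
    if p2 < p1 - tolerance:
        return "below expected growth"
    return "met expected growth"

# A student at the 40th percentile in year one who rises to the
# 80th percentile in year two scores above expected growth.
result = growth_category(420, [400, 410, 420, 430, 440],
                         445, [400, 415, 425, 435, 445])
```

The real EVAAS calculation pools multiple years and subjects of test data rather than comparing two single scores, which is part of why the full model is far more complicated than this sketch.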
Those scores are combined to create value-added ratings for teachers and for schools.
Starting in 2010-11, Ohio began linking teachers with their students. Teachers are supposed to receive “credit” for their students’ value-added scores in proportion to how much time they spend teaching each student. For example, two teachers who co-taught a single class might each receive credit for half of each student’s value-added score.
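The proportional-credit rule in the co-teaching example can be expressed as a simple weighted split. This is an illustrative sketch of the stated principle, not Ohio’s official roster-linking procedure; the function name and data layout are assumptions.

```python
def allocate_credit(student_value_added, teacher_shares):
    """Split one student's value-added score among teachers in
    proportion to instructional time. `teacher_shares` maps each
    teacher to their fraction of instruction (fractions sum to 1)."""
    return {teacher: student_value_added * share
            for teacher, share in teacher_shares.items()}

# Two co-teachers who split a class evenly each receive
# credit for half of the student's value-added score.
credit = allocate_credit(1.0, {"Teacher A": 0.5, "Teacher B": 0.5})
```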
SAS has published technical overview papers describing the model used to calculate Ohio’s value-added scores, but declined to release the exact model and data-processing steps used to calculate the scores. The Ohio Department of Education doesn’t have a full copy of the model.
Concerns about value-added being a “black box” were also discussed in these Plain Dealer articles in 2011, along with other issues critics raise about it. One showed that value-added was playing a growing role in evaluating Ohio schools and teachers; the other reported that Ohio teachers would soon be graded on students’ academic growth.
With StateImpact Ohio reporter Molly Bloom