How is Value-Added Calculated?

An algebraic representation of the value-added model used in evaluating Ohio teachers. (Credit: SAS Inc.)

In reading and listening to our recent series of stories with The Cleveland Plain Dealer about how Ohio is using a statistical measure called value-added to gauge whether teachers provide a year’s worth of learning to their students, many of you have asked, “Just how is value-added calculated?”

The Plain Dealer’s Patrick O’Donnell explains.

The goal is to determine whether a student made a year’s worth of academic growth in a school year, but there are many different methods you could use. Ohio’s system is intentionally complicated.

The simplest way would be to take a student’s test scores on the Ohio Achievement Assessments one year and see if scores were higher or lower the next. But that’s too simple, say experts on the subject.


This series about value-added, a new way that Ohio is using to measure whether teachers provide a year’s worth of learning to their students, is the result of a partnership between The Cleveland Plain Dealer and StateImpact Ohio. StateImpact reporters Molly Bloom and Ida Lieszkovszky worked with Plain Dealer reporter Patrick O’Donnell and Plain Dealer data analysis editor Rich Exner to produce these stories.

That approach doesn’t allow multiple years of testing to be used to set expectations for a student, adjust for changes in the tests from year to year, account for missing years of test scores or for the uncertainty that comes with small numbers of students, or measure students’ performance against their peers.

SAS Inc., the company Ohio hires to calculate value-added, uses a few regression models that combine multiple variables in its Education Value-Added Assessment System (EVAAS). SAS officials say it takes a complicated formula to account for enough variables to be fair.
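To give a rough sense of what “regression models that combine multiple variables” can mean in practice, here is a minimal sketch in Python. It fits a simple least-squares model that uses two prior years of scores to set an expected current-year score, then treats the gap between actual and expected as growth. This is only an illustration of the general idea under assumed, made-up data; it is not the EVAAS model SAS actually runs.

    # Illustration only: a least-squares model that sets an "expected" current
    # score from two prior years of scores, then measures growth as the gap
    # between actual and expected. This is NOT SAS's EVAAS model.
    import numpy as np

    # Made-up scores: columns are [score two years ago, score last year],
    # rows are students; y_now holds this year's scores.
    X_prior = np.array([[395., 402.], [410., 415.], [430., 441.],
                        [388., 391.], [420., 418.], [405., 412.]])
    y_now = np.array([405., 421., 447., 392., 423., 419.])

    # Fit y_now ~ b0 + b1*prior1 + b2*prior2 by ordinary least squares.
    X = np.column_stack([np.ones(len(X_prior)), X_prior])
    coef, *_ = np.linalg.lstsq(X, y_now, rcond=None)

    expected = X @ coef          # expected current-year score per student
    growth = y_now - expected    # positive = above expectation, negative = below
    print(np.round(growth, 1))

The point of the sketch is simply that prior test history, not a single year’s raw score, is what sets the expectation each student is measured against.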

The models essentially take all students’ scores and place them on a curve or, equivalently, rank them in percentiles. A student’s test scores the following year are ranked and placed on a curve the same way.

In the simplest terms, a student meets value-added targets by appearing at about the same place on the curve, or scoring at about the same percentile, from year to year. Students who land lower on the curve or at a lower percentile do not meet value-added; students who land higher score above it.
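To make that idea concrete, here is a minimal sketch in Python of the percentile-comparison logic described above. The functions, the `tolerance` cutoff, and the toy scores are all illustrative assumptions; none of this is the actual EVAAS calculation.

    # A minimal sketch of the percentile-comparison idea described above.
    # This is NOT SAS's EVAAS model; it only illustrates ranking students on a
    # curve each year and comparing where a student lands year over year.
    from bisect import bisect_right

    def percentile_rank(score, all_scores):
        """Percentage of scores at or below the given score (0-100)."""
        ordered = sorted(all_scores)
        return 100.0 * bisect_right(ordered, score) / len(ordered)

    def simple_growth(scores_y1, scores_y2, student_id, tolerance=2.0):
        """Compare one student's percentile rank across two years.

        Returns 'met', 'above', or 'below' relative to the prior-year rank.
        `tolerance` (in percentile points) is an illustrative cutoff, not an
        official threshold.
        """
        rank1 = percentile_rank(scores_y1[student_id], scores_y1.values())
        rank2 = percentile_rank(scores_y2[student_id], scores_y2.values())
        if abs(rank2 - rank1) <= tolerance:
            return "met"
        return "above" if rank2 > rank1 else "below"

    # Example: scores keyed by student ID for two consecutive years.
    year1 = {"s1": 410, "s2": 395, "s3": 440, "s4": 425}
    year2 = {"s1": 418, "s2": 380, "s3": 452, "s4": 431}
    print(simple_growth(year1, year2, "s2"))  # "met"

In this toy data, student s2’s raw score falls from 395 to 380, yet the student still “meets” the target because their position on the curve relative to classmates did not change, which is exactly the distinction between raw scores and curve-based growth described above.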

Those scores are combined to create value-added ratings for teachers and for schools.

Starting in 2010-11, Ohio began linking teachers with their students. Teachers are supposed to receive “credit” for their students’ value-added scores in proportion to how much time they spend teaching each student. For example, two teachers who co-taught a single class might each receive credit for half of each student’s value-added score.
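As a rough illustration of that proportional “credit,” here is a hedged sketch: it assumes a simple weighted average of student growth scores by each teacher’s share of instructional time, which only approximates the idea; it is not the attribution method EVAAS actually uses, and the function and data are made up.

    # Illustration only: a weighted-average attribution of student growth
    # scores to teachers by share of instructional time. This approximates
    # the idea of proportional "credit"; it is not EVAAS's actual method.

    def teacher_value_added(student_growth, teacher_shares, teacher):
        """Average each student's growth score, weighted by the fraction of
        instructional time the teacher is credited with for that student."""
        weighted_sum = 0.0
        total_weight = 0.0
        for student, growth in student_growth.items():
            share = teacher_shares.get((teacher, student), 0.0)
            weighted_sum += share * growth
            total_weight += share
        return weighted_sum / total_weight if total_weight else None

    # Two co-teachers each credited with half of every student in a shared class.
    growth = {"s1": 1.2, "s2": -0.4, "s3": 0.3}   # growth in curve units
    shares = {}
    for s in growth:
        shares[("A", s)] = 0.5
        shares[("B", s)] = 0.5
    print(teacher_value_added(growth, shares, "A"))   # about 0.37 here

In this sketch, the two co-teachers who split the class evenly end up with the same weighted-average growth for those students, matching the co-teaching example above.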

SAS has published technical overview papers describing the model used to calculate Ohio’s value-added scores, but declined to release the exact model and data-processing steps used to calculate the scores. The Ohio Department of Education doesn’t have a full copy of the model.

Concerns about value-added being a “black box” were also discussed in these Plain Dealer articles in 2011, along with other issues that critics raise about it. One showed that value-added was playing a growing role in evaluating Ohio schools and teachers, and the other that Ohio teachers would soon be graded on students’ academic growth.


With StateImpact Ohio reporter Molly Bloom

Comments

  • Laura H. Chapman

    The formula is proprietary. What variables are included to isolate all the other factors that might produce variations in test scores? The teacher is not the only or the most significant factor. NOT. Less than 18 percent, and as little as 3 percent, of the variation in scores can be attributed to a “teacher effect.” A sophisticated formula might combine 50 demographic factors, beginning with Ohio and Cleveland metro demographic information, along with a bunch of descriptors for the student and the school. All are numbers, weighted and estimated for their influence on the test scores.

    This information is important for teachers and the public to know, because it could offer some confidence that the variations in test scores are not random, are not measurement errors, are not due to the intersections of social class and parental income, are not due to the proportion of students who are still learning English, are not due to the proportion of students with disabilities (with at least yes/no categories for that), are not due to a high number of students who have been absent or tardy or who entered school late, and are not due to values imputed for missing data, and so on.

    I ask again: show me a corporate report where the CEO announces that past performance predicts future performance. If you want teachers to teach to the test, fine. If you want teachers to treat kids who fail as if they will keep on failing, all you need to do is keep up this silly mantra that VA scores are “objective” and the greatest thing since sliced bread. Every reputable statistician who is not trying to make a mint from selling this scam has said, and will tell you, that the scores should not be used to make any personnel decisions about individual teachers. They are unlikely to tell you all of the assumptions they make while processing the scores, because the assumptions and their implications are “too complicated” for most people. References supplied on request.

  • Dr. Donna Feldman

    Well said, Laurie!

    I’d like to add a couple of my pet peeves. EVAAS (and other student growth measures) assumes that the assessments are reliable and valid. Do we have evidence of this? From whom?

    EVAAS uses statistical principles but does not use statistical significance and can’t; the sample sizes used for calculating scores for elementary school teachers are too small (most classes have under 30 students, the traditional sample size needed for statistical significance).

    I am concerned about how much this so-called evaluation tool costs. PARCC, an assessment tool for teachers, is estimated at $186 million (http://www.fldoe.org/parcc/pdf/budgetsumm.pdf); NIET (who brought us the details of the Ohio Teacher Evaluation System) received a minimum of $72 million in 2012 (http://www.niet.org/niet-newsroom/niet-press-releases/new-15-2-million-federal-teacher-incentive-fund-grant-to-strengthen-educator-excellence-and-student-achievement-in-tennessee/). How much does EVAAS cost? If there were true concern for the performance of students in high-poverty schools, class sizes in high-poverty, low-performing schools would be reduced (numerous studies show the importance of individual or small-group instruction for struggling readers; Feldman, 2009).

    A huge flaw in the overall system, which can certainly impact value-added scores, is student attendance. The state allows students to be absent for 60 days, roughly a third of the school year, before excluding them from teachers’ student growth scores.

  • kimhil

    “…SAS has published technical overview papers describing the model used to calculate Ohio’s value-added scores, but declined to release the exact model and data-processing steps used to calculate the scores. The Ohio Department of Education doesn’t have a full copy of the model… Concerns about value-added being a “black box” were also discussed in these Plain Dealer articles in 2011, along with other issues that critics raise about it. One showed that value added was playing a growing role in evaluating Ohio schools and teachers and the other that Ohio teachers would soon be graded on students’ academic growth…” This doesn’t sound good. Searching for information about Utah and Ohioans Against Common Core may be useful; Agenda 21 goals also seem to fit comfortably within Common Core standards. On YouTube is democratsagainstagenda21. We are not experiencing politics as usual.
