Is your son’s math teacher a good one? How about your daughter’s reading teacher?
You used to have to depend on the parent grapevine to find out. Now there’s another source.
The state’s new value-added ratings offer a look at the performance of individual teachers, based on whether their students make the expected academic progress during the school year — as calculated solely by scores on the Ohio Achievement Assessments. The state gives math and reading teachers in fourth through eighth grades one of five value-added ratings, ranging from “Most Effective” to “Least Effective.”
This series about value-added, a new way Ohio measures whether teachers provide a year’s worth of learning to their students, is the result of a partnership between The Cleveland Plain Dealer and StateImpact Ohio. StateImpact reporters Molly Bloom and Ida Lieszkovszky worked with Plain Dealer reporter Patrick O’Donnell and Plain Dealer data analysis editor Rich Exner to produce these stories.
- Overview: Using Data To Evaluate Teachers
- Pay vs. Value-Added Performance
- Secrets Of Two “Most Effective” Teachers
- Value-Added’s Poverty Factor
- How is Value-Added Calculated?
- Audio: Measuring Performance Through Growth
- Audio: Push for Performance Pay
- Video: Guide to Ohio’s New Way of Evaluating Teachers
Not all teachers are listed. The state began calculating value-added for close to 6,300 teachers in the 2010-11 school year and expanded that to more than 16,000 for 2011-12. It will calculate value-added for all reading and math teachers in grades four through eight beginning after the upcoming school year.
Our database includes only the 4,200 teachers who have two years of scores, because single-year scores can vary from year to year. The ratings listed use a composite score reflecting each teacher’s performance over two years.
Public release of value-added ratings in other states has drawn sharp criticism. Teachers protested when the Los Angeles Times independently calculated and published value-added scores in 2010. The New York Times drew similar howls last year when it sued New York City for access to the district’s teacher value-added ratings and published them.
Critics called the release an invasion of teacher privacy and complained that unreliable results could place an unfair stigma on low-scoring teachers. They also argued that making scores public would make teachers defensive and damage schools’ ability to use the scores to improve teaching.
Teachers in Ohio are often anxious about their scores appearing in public.
“It’s scary to think that they’re going to start publishing these in the newspaper,” said Alesha Trudell, who teaches fourth-grade language arts and social studies at Hilltop Elementary in Beachwood. “You know they’re going to say your name and it’s going to put this stigma out to the community that ‘Oh look, she was Most Effective one year and now she’s not so effective.’”
Plain Dealer and StateImpact editors said they considered those concerns but decided it was more important to provide information — even if flawed — to help parents understand their children’s education and for the public to better understand a measure increasingly used by the state and school districts.
Plain Dealer Assistant Managing Editor Chris Quinn said there are several reasons to make the ratings available.
“One is that state lawmakers created the value-added system to come up with a better way to assess teachers, to give the residents of the state better accountability,” he said. “Another is that tax dollars are used to compile the ratings, meaning the people of Ohio have paid for this.”
David Molpus, executive editor of WCPN Ideastream, which manages StateImpact Ohio, said that listing the ratings along with the jointly produced three-day series of articles about the measure helps the public evaluate teachers — and the evaluation process.
“The series highlights trends and general conclusions but, ultimately, this does come down to individuals,” Molpus said.
“My caution to readers and listeners would be, as Ohio education officials have said, these scores are only a part of the criteria necessary for full and accurate evaluation of an individual teacher. There are a lot of questions still about the particular formula Ohio is using and which variables beyond a teacher’s control need to be considered in arriving at a fair and accurate formula.”
“We hope the series provides a context for the new state data and will launch many more conversations on where this all goes from here,” Molpus said.
When viewing the ratings online, readers should note:
- The company that calculates value-added for Ohio says scores are most reliable with three years of data. Ohio will use three-year rolling averages of value-added scores when it has them for teachers. But two years of scores from the pilot are available for some teachers now, and districts can use fewer years when three years of data are not available, such as for newer teachers.
- Ohio’s value-added ratings do not account for students’ socioeconomic backgrounds, as ratings in some other states do.
- Value-added scores measure students’ progress only by their performance on the Ohio Achievement Assessments.
- Value-added scores are not a teacher’s full rating. Ohio law requires value-added to count for 50 percent of a teacher’s total rating, when available, with classroom observations and measures of professionalism making up the other half.