
Eye on Education

Grading the Teachers: Measuring Teacher Performance Through Student Growth


Lynn Ischay / The Plain Dealer

Forest Park Middle School teacher Maria Plecnik helps 13 year-old Chandalay Coleman with a writing assignment. Plecnik received a low value-added rating from the state even though her principal and students give her high marks.

Schools get rated based on how well students perform on standardized state tests.

Not so for teachers. Their main evaluation comes from often brief classroom observations by a principal.

Practically no one fails.

The new value-added measurement Ohio is phasing in aims to gauge how much a student learns from one year to the next, and how much an individual teacher contributed to those results.

What Is Value-Added?


This series about value-added, a new way Ohio is measuring whether teachers provide a year’s worth of learning to their students, is the result of a partnership between The Cleveland Plain Dealer and StateImpact Ohio. StateImpact reporters Molly Bloom and Ida Lieszkovszky worked with Plain Dealer reporter Patrick O’Donnell and Plain Dealer data analysis editor Rich Exner to produce these stories.


Matt Cohen, the chief research officer at the Ohio Department of Education, says there should be a link between student performance and teacher evaluations.

“If we say that teachers are very important to the instruction, to the learning of kids, and if we believe that, and most people do believe that, then there should be a connection,” he says.

“That’s one of the great things about value-added,” says John White with the North Carolina statistical analysis firm that Ohio hired to calculate the new value-added numbers.

White explains value-added like this: “Teacher value added uses all available student testing history and links the individual students that are connected to teachers in specific subjects and grades to measure the amount of progress those students are making.”

The model predicts how much improvement students should make based on past results.

In a nutshell, if a student ends up performing better than predicted, the teacher gets the credit with a high “value-added” grade. But if the student scores lower than expected, the teacher gets the blame and a low grade.

Eventually, this grade will be a major component that determines a teacher’s pay and employment.

“We’re basically measuring whether or not they maintained their same relative position with respect to the statewide student achievement from one year to the next,” says White.
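The predict-then-compare logic White describes can be sketched in a few lines. This is a toy illustration only: the function names and the naive linear predictor below are invented for this sketch, and the firm’s actual statistical model is far more elaborate.

```python
# Toy sketch of the value-added idea: predict each student's current
# score from the prior year's score, then average the prediction
# errors (residuals) across one teacher's students.
from statistics import mean

def predict(prior, slope=1.0, intercept=0.0):
    """Naive prediction: expect a student to hold the same relative position."""
    return slope * prior + intercept

def value_added(students):
    """students: list of (prior_score, current_score) pairs for one class.
    Returns the mean residual: positive means students beat predictions."""
    residuals = [actual - predict(prior) for prior, actual in students]
    return mean(residuals)

class_a = [(70, 78), (85, 88), (60, 66)]   # students outperformed predictions
class_b = [(70, 65), (85, 80), (60, 58)]   # students fell short of predictions

print(value_added(class_a))  # positive -> high "value-added" grade
print(value_added(class_b))  # negative -> low "value-added" grade
```

A positive average residual is what earns a teacher a high rating in this scheme; a negative one earns the blame, exactly as the article describes.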

What the Data Shows


An analysis by StateImpact Ohio and the Cleveland Plain Dealer found that most teachers fall in the middle of the rankings.

Our findings also show that students in wealthy districts are three times more likely to have teachers with the highest value-added scores than their peers in high-poverty schools, who are more likely to encounter teachers rated “least effective.”

Or, to put it another way, teachers in poorer districts overall aren’t doing as well as their peers in richer districts at adding a year’s worth of knowledge.

“To say that a teacher’s very low on value-added doesn’t in and of itself tell you that that’s a bad teacher,” ODE’s Matt Cohen says. “We can’t say that, and we’re not trying to say that. We are trying to say that’s a piece of information that a teacher, and the school should make use of.”

Cohen says student test scores aren’t the only thing that matters; classroom observation and other tools will help determine a teacher’s final evaluation.

Teachers React

Some teachers who’ve been part of this experiment don’t take a lot of solace in that caveat.

Forest Park Middle School teacher Maria Plecnik says her bosses, colleagues and students all say she’s a highly effective teacher. Her latest evaluation backs that up.

“The only person who doesn’t find me effective is the state of Ohio who has never stepped foot in my classroom,” Plecnik says.


Last year, her value-added score was “least effective.” It was also her last year of teaching; she quit the profession.

Other teachers have taken the news more positively.

Emily Brown is a Toledo teacher who saw her ranking in 7th grade reading slip from “most effective” to “average” last year.

She says she’s not discouraged, and the results from value-added may be useful, “because how are you going to know if they gained anything?”

Comments

  • Rep Voter

    Only 505 days till this Republican in Cincinnati votes for a Democratic Governor!!!!

  • Raoul Duke

My parents were teachers…they came from a long line of educators…and I have very close friends who are also teachers. For my entire life, I’ve heard how unfairly teachers are treated from a professional standpoint. How, despite the fact that they are required to get advanced degrees, they are not respected in the same way as someone who earns a J.D. or an M.B.A. How they want to be treated with the same level of respect as someone with an advanced degree would be treated in “Corporate America.”

I work in that magical land called “Corporate America.” My peers think I’m pretty great. They seek my advice on a daily basis for solving this issue or that problem. My immediate manager writes glowing reviews of my performance. Their manager agrees, and adds her own glowing reviews of my performance. Strangely enough, most of it sounds exactly like the feedback Ms. Plecnik received in her manager review: fluffy, generic, interchangeable and ultimately meaningless. Feedback so generic, any of my coworkers’ names could be…and frequently are…substituted for mine in their ‘individual’ performance reviews.

However, when it comes time for doling out year-end review scores, or, even more fun, “Organizational Changes” (that’s business-speak for ‘layoffs’), the person that makes the decision on “who gets what score,” or “who stays and who goes,” sits in an office 600 miles away, completely outside of my organizational structure…who doesn’t know my name, but probably knows the names of the janitor’s children…ultimately does not care what my manager, their manager, or my peers think about my performance…and has certainly never set foot in my office, or reviewed one single page of my work. They say, “we can award X number of ‘superior’ scores, Y number of ‘good’ scores, and Z number of ‘low performing’ scores”…or “we need to reduce personnel by X number of people,” and they do it without a single thought as to what my manager, their manager, or my peers think of me.

From this article, it sounds like teachers, Ms. Plecnik included, got exactly what they wanted…to be treated like everyone else.

  • Random hobo

    Should Students “Grade Teachers”?
19 Oct 2012 by Craig Jerald. Categories: Accountability and Standards, ESEA/No Child Left Behind, K-12 Education
“I would love to have the students grade the teachers at the end of the year as opposed to just the other way around so that teachers get feedback,” Mitt Romney told an audience at the NBC News “Education Nation” Summit in New York a few weeks ago. To a lot of education policy insiders, that seemed to be a reference to the increasing use of student surveys as an additional measure for evaluating and providing feedback to teachers.

    Romney’s remarks came hot on the heels of a long article headlined “Why Kids Should Grade Teachers” by Amanda Ripley in the Atlantic. DC Public Schools had granted Ripley access to observe its four-month pilot implementation of the Tripod surveys in six schools earlier this year.

    The problem is that nobody anywhere is really asking students to “grade teachers,” and when journalists, pundits, and presidential candidates call it that, they risk undermining the very tool they seek to champion. As use of student surveys has spread, so too have serious misperceptions about how the surveys solicit students’ input. And that could translate into a very serious problem when it comes time to ask even more teachers to buy into the process.

    Consider a damning comment that teacher “beccaA” wrote in July on Education Week’s Teacher Beat blog. She said that after her school piloted a new evaluation system this year, a colleague asked some of her students about the survey questions. “‘So how did you guys rate me’ was asked and answered. It seems they rated her ‘bad’ in this area. ‘Why,’ she asked the group of ten. Their answer [was], ‘you make us listen, speak quietly, and do our work.’ And there you have it, without the explanation she would be judged improperly.”

That comment reveals a fundamental misunderstanding of what the surveys ask students. Students are never asked to provide some kind of summative rating or grade for the teacher or the classroom. Instead, they are asked a series of carefully worded questions about their classroom experiences that measure specific kinds of instructional practices and classroom conditions that are conducive to student learning. In fact, beccaA’s colleague would have performed relatively better, not worse, based on the responses her students seem to have given!

On Wednesday the Gates Foundation’s Measures of Effective Teaching (MET) Project quietly released a new trio of papers about student surveys, including a 24-page brief. That brief, Asking Students About Teaching, provides the clearest and most detailed explanation yet of the benefits and challenges of administering student surveys to evaluate and improve teaching.* If you supply your name and e-mail address, the website even allows you to download the actual surveys the MET Project has administered in thousands of classrooms.

    Instead of “grading teachers,” students answer questions about specific aspects of the classroom and their experiences in it. For example, to get at how well teachers clarify the content they are teaching, students respond on a five-point scale to statements such as this:

    “If you don’t understand something, my teacher explains it another way.”
    “My teacher has several good ways to explain each topic that we cover in class.”
    “My teacher knows when the class understands and when we do not.”
When the surveys are administered carefully, the level of agreement in students’ responses is surprisingly high. After all, students spend hundreds of hours “observing” classroom instruction every year.
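As a rough illustration of how such five-point items can roll up into a score for one instructional practice, consider the sketch below. The item keys and the `construct_score` helper are invented for this example; the actual Tripod/MET scoring is more involved.

```python
# Toy sketch: average each item's responses across students, then
# average the item means to get one construct score (e.g. how well
# the teacher clarifies content). Responses are on a 1-5 scale.
from statistics import mean

responses = {
    "explains_another_way":     [5, 4, 5, 4],
    "several_ways_to_explain":  [4, 4, 5, 5],
    "knows_when_we_understand": [4, 3, 4, 5],
}

def construct_score(item_responses):
    """Mean of the per-item means for one construct."""
    return mean(mean(scores) for scores in item_responses.values())

print(round(construct_score(responses), 2))  # 4.33
```

Note that no single “grade the teacher” question appears anywhere: the score emerges from many specific statements about classroom experience, which is exactly the distinction the post is drawing.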

    To be fair, nothing in the actual body of Amanda Ripley’s excellent Atlantic article creates misperceptions about such surveys. It’s an engaging and informative piece of journalism, one that offers a great complement to the valuable new papers the MET Project released Wednesday. Just skip the headline.

    * Disclosure: My friend Jeff Archer wrote the MET paper, and I am currently working with the Gates Foundation on an unrelated project.


    About the Author
    Craig D. Jerald, a member of the Education Sector K20 Task Force, is an education policy consultant and president of Break the Curve Consulting. Jerald focuses on issues of school and teacher evaluation, teaching effectiveness, and professional development.
Comments


    Tondja Woods October 20, 2012 at 7:19 pm
    I would encourage evaluation of teachers by BOTH parents and students. Parents are often less out of the loop. When feedback is proactively provided to administration, it is rarely welcomed or acted upon. I value and respect my son’s teachers. He attends a school where contracts are annually renewed. My peers are not as positive about the public options. Just a parent’s opinion.


Craig Jerald October 20, 2012 at 10:41 am
Thank you for your comment, Laura. I appreciate your skepticism. However, your criticisms seem to be based on some serious misunderstandings about how the surveys are administered and how the results are used. I’ll try to address those one by one.

First, in reference to the student background questions, you say that “Teachers are not responsible for and cannot change these status questions.” That’s true. But no one is holding teachers accountable for students’ ethnic and socioeconomic characteristics, and no one is expecting teachers to change them. The background questions are supplemental and for analytical purposes only. Responses to those questions allow teachers and administrators to examine patterns in students’ responses to the main questions about classroom practices and learning environment. I see from your National Arts Education Association biography that you’ve consulted for ETS, so here’s a useful parallel: The Advanced Placement exams ask students such supplemental background questions, too. But those answers are not factored into students’ AP scores. Instead, they are used to conduct research on equity of educational opportunity, such as the 2008 ETS report “Access to Success: Patterns of Advanced Placement Participation in U.S. High Schools.”

Second, you say, “A person not employed by the school, or accountable to it, asks elementary students to answer 67 questions about their teacher.” That is not true. School districts or individual schools choose to administer the surveys, and the smartest ones systematically engage teachers in making important decisions about how to do so. For example, as the MET paper describes, when teachers in Denver told school district leaders that the surveys took too long for students to complete, leaders invited teachers to help decide how to streamline the survey questions while still ensuring validity of results.

Third, you complain that the surveys ask questions about whether students are well-behaved, happy in class, and feel academically challenged, because that imposes some kind of “one size fits all” version of teaching. Yes, the best student surveys are indeed based on a coherent theory of instruction about how specific kinds of classroom practices support student engagement and learning. But I would argue that is an advantage rather than a drawback. The stunning lack of agreement about effective instructional practice has damaged the teaching profession and crippled efforts to help teachers learn from and with one another. In fact, it is only through some kind of agreement about a body of good practice that professionals can continuously develop their expertise and ensure a standard of quality. Would you consent to be operated on by a surgeon who rejects professional standards of surgical practice as “one size fits all”?

And that gets at your final criticism, that the surveys are somehow part of a plot to conduct “unparalleled surveillance” and punish teachers. While I suppose a school district could use surveys that way, that’s not what I’ve seen. For example, teachers in Memphis and Charlotte-Mecklenburg have signed up for a professional development initiative based on the survey responses that offers intensive one-on-one coaching or access to online resources and discussion forums. Like any measurement instruments, student surveys can be used for good or ill. I encourage you to remain vigilant about how the surveys are being used (I plan to), but I also encourage you to become more familiar with the basic facts before jumping to conclusions.

laura h chapman October 20, 2012 at 1:52 am
Let’s see: a person not employed by the school, or accountable to it, asks elementary students to answer 67 questions about their teacher. Secondary students answer 97 questions. But the questions ask: How many adults are in your home? How many books are in the room where you sleep? Do you have a computer at home? What is your race/ethnicity? How many children live in your home? Is English spoken at home? What level of education do the adults at home have? What adult was with you when you took this survey? Teachers are not responsible for and cannot change these status questions.

The survey is part of a system of unparalleled surveillance of students and teachers, proudly pushed and largely funded by Bill Gates, who thinks education is a management problem cured by “accurate” data-gathering and one-size-fits-all instructional strategies, perfectly aligned with rigorous standards, standardized tests, and the rest. The “vendors” of this test and related products in the teacher-student data link project are concerned about breaking laws bearing on privacy. No wonder. Someone should blow the whistle on this effort.

Teachers are “good” if students are compliant, well-behaved, feel happy in class, and so on. Both surveys are preoccupied with homework assignments and whether teachers are pushing kids hard enough. One size fits all, or else. Together we go racing toward the worst education system in the world, but with the largest data warehouse in the world to justify stupid decisions about educational policies.

These angry remarks are from a longtime worker in arts education, also trained in educational research, including ethical practice. This goes way over the line with a captive audience of minors, and not a sign of parental permission for the data-mongering.

About StateImpact

StateImpact seeks to inform and engage local communities with broadcast and online news focused on how state government decisions affect your lives.