Five Things to Learn From Ohio’s New Teacher Preparation Program Evaluations

Ohio colleges and universities produce about 6,000 teachers a year.

Now, for the first time, there’s a way to tell which colleges of education are doing a good job and which have, well, room for improvement.

The Ohio state agency that oversees higher education, the Board of Regents, has released a set of "performance reports" about the state's teacher preparation programs. These are the programs that teach the teachers.

[Click here for the reports.]

The reports make Ohio one of just eight states that connect the colleges of education to the performance of their graduates’ students.

In addition to information about how good recent graduates from each school are at actually teaching students reading and math, the reports include information about how many prospective teachers from each college pass licensure exams and graduates’ own ratings of their schools.

Just don’t call these things report cards.

The reports from the Ohio Board of Regents don’t actually grade schools.

“In higher ed we’re not going to grade our schools,” said outgoing Chancellor Jim Petro. “We’ve got some great schools, but we’re not going to start giving them letter grades.”

Still, one of the ideas behind the reports is to give taxpayers information about how well some of the biggest public-employee training programs operate, said the Board of Regents’ Rebecca Watts, who supervised the creation of the reports.

“Taxpayers make a big investment in their schools. They make a big investment in public universities,” she said. “This is a performance report on programs that are directly supported by tax dollars.”

Report Highlights

1. Some schools do a better job of preparing teachers than others.

Seven of Ohio’s 51 teacher preparation programs had more than 95 percent of their recent graduates meet or exceed a measure called value-added.

Value-added is a statistical measure that shows whether students made a year's worth of academic progress in a given year, no matter what level they started the year at.

Seven had 80 percent or less doing so. The rest were somewhere in between.

(The value-added measure only includes people who teach math and reading in grades 4-8, so for some colleges of education it includes a relatively small number of their graduates.)
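The basic idea behind value-added can be sketched as a simple gain-score calculation. This is an illustration only, not Ohio's actual model, which is a far more complex statistical system; the function name, the grade-level score units, and the "one grade level per year" expectation here are all assumptions made for the example:

```python
# Toy sketch of the value-added idea (NOT Ohio's actual model):
# compare each student's actual growth over the year with the
# growth expected of them, then average the differences.

def value_added(students, expected_gain=1.0):
    """Average difference between actual and expected growth.

    students: list of (start_score, end_score) pairs, in
        hypothetical grade-level units.
    expected_gain: growth expected in one year (here, one
        grade level) -- an assumption for this sketch.
    """
    gains = [(end - start) - expected_gain for start, end in students]
    return sum(gains) / len(gains)

# Students who started at different levels but each grew by roughly
# a year's worth score near zero, i.e. they "met expected growth".
classroom = [(3.0, 4.1), (4.5, 5.4), (2.8, 3.9)]
print(round(value_added(classroom), 2))  # prints 0.03
```

Because the calculation looks at growth rather than final scores, a teacher whose students started far behind can still show strong value-added.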

2. Teachers don’t understand value-added.

Value-added is becoming really important in Ohio schools. It’s starting to be tied to decisions about how much to pay teachers and which teachers to lay off and rehire.

But these new reports show that in general, teachers don’t think their colleges did a great job of explaining how value-added works. In response to this documented confusion, some schools are already changing how they teach prospective teachers about value-added. (Here’s one place to start: StateImpact Ohio’s explainer on value-added.)

3. Just about everyone who takes the teacher licensure test eventually passes it.

Only one school (Central State University) had a teacher licensure test passing rate below 88 percent. At nearly half of all Ohio teacher preparation programs, 99 percent or more of graduates passed the test.

4. Future teachers of Ohio say their colleges did a good job of preparing them to teach.

On average, recent teacher-prep program graduates agreed their programs prepared them to teach. But graduates of private colleges tended to give their institutions higher marks than graduates of public colleges. One notable exception is graduates of Ohio State University.

5. The bar to entry is about a C+.

About half of Ohio teacher preparation programs require students to have a minimum grade point average, or GPA, to enter the program. The average minimum GPA required was a C+. Only one school (Cedarville University) required a GPA of B or higher.

WKSU reporter Amanda Rabinowitz contributed to this report.


