Toledo Public Schools teachers face new evaluation

Measurement of instructor quality to begin this school year

Connie Solano, right, a representative of the system's developer, helps TPS’ Glenna Feller and John Krajeski.

Teacher measurement based in part on student performance will make a slow crawl into Toledo Public Schools this year.

Through a mix of state law and the federal Race to the Top grant program, the district will introduce a statewide teacher evaluation system that uses student test scores, in part, to rate teacher performance. A key measure that will be used is something called value-added. Value-added is, at its core, an attempt to quantify teacher quality. How it does so is complicated, and how well it does is a source of debate.

"It's relatively new," Jim Gault, TPS chief academic officer, said. "I'm a little uneasy as regards to the validity of it, but as of right now it's the best thing we have."

Kevin Dalton, Toledo Federation of Teachers president, said he doesn't oppose the use of value-added but is wary of its validity and of its potential use as a tool to demonize teachers. It needs to be put into context, he said.

"It's not that we are opposed to value-added," he said. "It's just that it can't be the single measure."

Educators have long argued that it would be unfair to judge teachers based on test scores, because students all start the year with varying levels of knowledge. A teacher whose class is full of students who are all two grade levels behind in math could be highly effective, yet all that teacher's students may still fail state standardized tests.

Value-added tries to rectify that bias by measuring student growth over the school year, not just final scores. Battelle for Kids, the company contracted by the Ohio Department of Education to develop the state's value-added system, has devised a formula that measures average growth for a group, say, fourth graders in Ohio. The contents of that formula aren't public.

The system then uses an individual student's past performance to estimate how well he or she should do, and compares that student's actual growth with the group's.

Because the system uses group projections and estimates, results are expressed in standard deviations, further complicating the process. The ranges show whether a student gained less than, as much as, or more than expected. The scores of students in a class are then combined into a composite, which is applied to the teacher.

Even value-added proponents say that accurate ratings need significant sample sizes, advocating that three years' worth of data be averaged to get a valid rating.
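The mechanics described above can be sketched in a few lines of code. This is a deliberately simplified illustration of the idea, not Battelle for Kids' actual model; the company's formula is not public, and the group averages and scores below are hypothetical.

```python
from statistics import mean

def class_composite(students, group_avg_gain, group_sd):
    """students: list of (prior_score, current_score) for one teacher's class.
    group_avg_gain / group_sd: expected gain and spread for the comparison
    group (say, all Ohio fourth graders) -- hypothetical numbers here."""
    gains = [current - prior for prior, current in students]
    # Each student's growth compared with the group's, in standard deviations.
    z_scores = [(g - group_avg_gain) / group_sd for g in gains]
    # Students' results are combined into a composite applied to the teacher.
    return mean(z_scores)

# A hypothetical class of three students, measured against a statewide
# average gain of 20 points with a standard deviation of 10.
composite = class_composite([(400, 430), (410, 425), (390, 410)], 20, 10)
print(round(composite, 2))  # 0.17 -- slightly better growth than expected
```

A composite near zero means the class grew about as much as the comparison group; positive means more, negative means less.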

For years, value-added has been used in scattered states and school districts as an internal tool to help identify exceptional, and struggling, teachers. But a wave of states is adopting it to varying degrees in new evaluations that could one day determine teacher pay, placement, and employment.

That's made many skeptical. Mr. Dalton said the union doesn't outright reject performance pay, as long as the system to determine performance is accurate and equitable. Right now, he said, the framework isn't fair and isn't accurate. "If and when a fair application of that framework is developed," he said, "TFT would support that."

John Korenowsky, McKinley Elementary School principal, grew increasingly perplexed at the computer as he manipulated columns meant to represent students and staff.

Surrounded by principals and teachers training in value-added reporting, he tried looking for a teacher to build a mock building profile. He couldn't find her.

"We haven't seen the lady since November," he mumbled.

The high stakes attached to value-added mean that accuracy is critically important, especially when it comes to determining students' previous achievement. Principals and select teachers attended training this spring on how to link teachers and students on computer systems in order to guarantee that future reports are accurate.

The linkage is an example of how complicated value-added will be, and of how much more data will need to be retained to rate teachers accurately.

In urban, high-poverty districts such as Toledo Public Schools, many students move often from school to school. A classroom roll call taken at the beginning of the year may be totally different from one taken at the end of the year, said instructional planner Kay Wait, a TFT member helping with value-added implementation.

It wouldn't be fair, or accurate, to link a student's score to a teacher he or she was with for only a month. So there are criteria for how much contact time a teacher must have with a student before the score counts on a teacher's composite. Principals are supposed to alter class rolls to put students in the right place, while teachers ensure accuracy afterward.
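The contact-time rule described above might be sketched like this. The 120-day cutoff is hypothetical; the article does not state the actual criteria TPS or the state will use.

```python
MIN_CONTACT_DAYS = 120  # hypothetical threshold -- the real criteria differ

def countable_scores(roster):
    """roster: list of (student_id, days_with_teacher, score) records.
    Returns only the scores that would count toward the teacher's
    value-added composite under the contact-time rule."""
    return [(student_id, score) for student_id, days, score in roster
            if days >= MIN_CONTACT_DAYS]

roster = [
    ("A", 170, 412),  # enrolled all year -- counts
    ("B", 30, 388),   # moved in after one month -- excluded
    ("C", 150, 401),  # counts
]
print(countable_scores(roster))  # [('A', 412), ('C', 401)]
```

Student B's score is dropped: a month with a teacher isn't enough contact for that score to fairly reflect the teacher's work.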

The process can be tedious and confusing. And it can lead to sometimes absurd scenarios that could create misleading results.

For instance, scores for special education students who take alternative assessments aren't linked to a teacher. Nearly all of a McKinley seventh-grade teacher's class were special education students; for only one did a test count toward that teacher's value-added composite, and he happened to test well.

"That means she's an excellent teacher, I guess," Mr. Korenowsky said sarcastically.

Under the state and Battelle's system, that teacher wouldn't get a report. But what happens if the situation applies to a teacher three years in a row? How would that affect the teacher's pay under a performance system?

The state's evaluation system in part recognizes those limitations.

While student-growth metrics will constitute 50 percent of a teacher's evaluation, only 20 percent must be value-added -- districts can come up with other growth metrics, and that's exactly what TPS plans to do.

A likely addition to the value-added scores will be formative assessments, the short-cycle checks teachers give throughout the year to gauge how well students are picking up concepts.

An example may be something such as growth on Dynamic Indicators of Basic Early Literacy Skills, a fluency assessment in which students read for one minute. Teachers can gauge growth by how many words a student reads correctly from a script in that minute.
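The arithmetic behind that kind of growth measure is straightforward; the fall and spring scores below are made up for illustration.

```python
def fluency_growth(fall_wpm, spring_wpm):
    """Growth = words read correctly per minute in spring minus fall."""
    return spring_wpm - fall_wpm

# Hypothetical fall/spring words-per-minute scores for three students.
readings = [(42, 71), (55, 80), (60, 78)]
gains = [fluency_growth(fall, spring) for fall, spring in readings]
print(gains)                     # [29, 25, 18]
print(sum(gains) / len(gains))   # average class growth: 24.0
```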

And 50 percent of evaluations will come from more traditional in-classroom observations, although the format has changed. Exactly how the whole process will work is not yet clear.
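Putting the pieces together, the weighting might be sketched as follows. Since the article notes the process isn't yet settled, the split between value-added and other growth measures, the common 0-100 scale, and the scores themselves are all assumptions for illustration.

```python
def evaluation_score(value_added, other_growth, observation):
    """All inputs assumed to be on a common 0-100 scale.
    50% student growth (here split 20% value-added / 30% other
    growth measures, a hypothetical split) + 50% observation."""
    return 0.20 * value_added + 0.30 * other_growth + 0.50 * observation

score = evaluation_score(70, 80, 90)
print(round(score, 2))  # 83.0
```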

The district will conduct a dry run of sorts this year with the new evaluation system.

Teachers in the Toledo Review and Alternative Compensation System -- highly rated teachers who receive extra pay for taking tough assignments -- will face the additional evaluations and the assessment scores, although the scores will be for informational purposes only.

District officials will determine which additional student growth metrics, such as formative assessments, will be used alongside value-added, and teachers will give feedback about the process, Mr. Dalton said.

And so while questions remain, student growth measures are on their way to TPS.

"At the end of the day, we are measured by student growth," Mr. Gault said, "and I think that's something we all have to embrace."

Contact Nolan Rosenkrans at: or 419-724-6086.