Teacher Ratings Released, But Even DOE Downplays Them

February 24, 2012 Brooklyn Eagle Staff

By Mary Frost
Brooklyn Daily Eagle

NEW YORK CITY — After a long legal battle, the New York City Department of Education (DOE) on Friday released ratings for 18,000 public school teachers.

Originally intended as a pilot study to help teachers increase their effectiveness, the ratings were never considered to be especially accurate, nor were they designed to be made public. The DOE asked news outlets not to publish individual scores, but several newspapers involved in a Freedom of Information Law (FOIL) filing, including The New York Times, said they would publish the results.

The evaluations, officially called Teacher Data Reports, are based on state test results in English and math for students in grades 4 through 8 over a three-year period — though the figures are incomplete in many cases. A number of teachers said their reports contained glaring inaccuracies, such as rating them for a year when they were actually out on leave, or crediting them with teaching 120 children when they actually taught 200.

Many educators fear that parents who are unaware of the “fuzziness” of the math will misjudge the effectiveness of their child’s teacher.

United Federation of Teachers President Michael Mulgrew said Thursday that the ratings carry margins of error of as much as 54 points on a 100-point scale. “This means that teachers identified as top scorers could in fact be below average, and teachers identified as low scorers could in fact be near the top.”

The reports also are based on test scores that were “deemed so flawed by the state that it mounted a complete overhaul of them in 2010,” Mulgrew said. “The reports don’t take into account any other measure of teaching or learning. Even the Department of Education admits that a teacher’s performance should never be judged primarily by these flawed ratings.”

The union had sued to stop the reports’ release, but lost its appeal.

Schools Chancellor Dennis Walcott said the ratings were useful, but warned teachers in a letter that their names and ratings might show up in print. “The reports gave teachers and principals one useful perspective on how well teachers were doing in their most important job: helping students learn,” he said. “However, these reports were never intended to be public or to be used in isolation.”

De Blasio Blames Bloomberg

Public Advocate Bill de Blasio said that Mayor Michael Bloomberg was behind the release of the evaluations.

“The Bloomberg administration is making a grave mistake by releasing personal teacher ratings. Independent experts — and even the evaluation system’s creator — have found significant flaws in these rankings. It would be irresponsible for any news outlet to print this data or represent it as an accurate portrayal of what really happens in the classroom.”

Sean P. Corcoran, assistant professor of educational economics at New York University and an expert in the “value-added” methodology used by the city, collaborated with the Annenberg Institute on a study of the complexities of the ratings.

In a published analysis, Corcoran said that questions remain as to whether value-added measures “are a valid and appropriate tool for identifying and enhancing teacher effectiveness.” He added that “common sense suggests this information alone can only be a crude indicator for differentiating teaching effectiveness.”

Under the state’s new teacher-rating system, test scores will count for 20 percent of a teacher’s evaluation, alongside classroom observations and other measures, Chancellor Walcott said.

