No gold stars for LA teachers

Los Angeles doesn’t reward, recognize or try to learn from its most effective teachers, reports the LA Times in a follow-up to its value-added analysis of third- through fifth-grade teachers’ effects on their students’ test scores.

The Times found that the 100 most effective teachers were scattered across the city, from Pacoima to Gardena, Woodland Hills to Bell. They varied widely in race, age, years of experience and education level. They taught students who were wealthy and poor, gifted and struggling.

In visits to several of their classrooms, reporters found that their teaching styles and personalities differed significantly. They were quiet and animated, smiling and stern. Some stuck to the basics, while others veered far from the district’s often-rigid curriculum. Those interviewed said repeatedly that being effective at raising students’ performance does not mean simply “teaching to the test,” as critics of value-added analysis fear.

On average, these teachers’ students improved by 12 percentile points on English tests, from the 58th percentile to the 70th, and by 17 percentile points in math, from the 58th to the 75th, in a year.

Thomas Kane, a Harvard education researcher, tested the reliability of the value-added approach in Los Angeles, the Times reports. Using their past value-added scores, Kane predicted the student gains for 156 teachers who volunteered for the experiment.

Value-added analysis was a strong predictor of how much a teacher would help students improve on standardized tests. The approach also controlled well for differences among students, the study found.
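
The Times doesn’t publish its statistical model, but the basic logic of value-added is easy to sketch: predict what each student would be expected to score given prior performance and background, then credit the teacher with how far his or her students land above or below that prediction. Below is a minimal, hypothetical sketch of that idea in Python (synthetic data, invented coefficients, and far fewer controls than any real model uses), not the Times’ or Kane’s actual method.

```python
# A toy illustration of the value-added idea, under deliberately simplified
# assumptions: predict each student's score from last year's score and one
# demographic flag, then credit each teacher with the average gap between
# actual and predicted results. All names and numbers are invented.
import numpy as np

rng = np.random.default_rng(0)
n_teachers, class_size = 20, 25

# Hidden "true" teacher effects the method should recover.
true_effect = rng.normal(0, 3, n_teachers)

teacher_id, prior, poor, current = [], [], [], []
for t in range(n_teachers):
    p = rng.normal(50, 10, class_size)        # last year's scores
    d = rng.integers(0, 2, class_size)        # crude stand-in for demographics
    c = 5 + 0.9 * p - 2 * d + true_effect[t] + rng.normal(0, 5, class_size)
    teacher_id += [t] * class_size
    prior += p.tolist()
    poor += d.tolist()
    current += c.tolist()

teacher_id = np.array(teacher_id)
prior, poor, current = map(np.array, (prior, poor, current))

# Step 1: ordinary least squares prediction of this year's score
# from prior score and the demographic flag.
X = np.column_stack([np.ones_like(prior), prior, poor])
coef, *_ = np.linalg.lstsq(X, current, rcond=None)
predicted = X @ coef

# Step 2: a teacher's "value added" is the mean residual of his or her students.
residual = current - predicted
value_added = np.array([residual[teacher_id == t].mean()
                        for t in range(n_teachers)])

# The estimates should track the hidden teacher effects closely.
print(round(np.corrcoef(value_added, true_effect)[0, 1], 2))
```

Real models layer on many more controls and statistical corrections; the point of the sketch is only the predict-then-compare structure.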

With $45 million from the Bill and Melinda Gates Foundation, Kane and other researchers are now following 3,000 teachers in six school districts to see if other types of evaluation — including sophisticated classroom observations, surveys of teachers and reviews of student work — are also good measures of teacher performance.

In the meantime, Kane said that, although it is not perfect, “there is currently not a better measure of teacher effectiveness than the value-added approach.”

Teachers on the Times’ most effective list said they’d never been recognized for excellence.  Aldo Pinto, a 32-year-old teacher at Gridley Street Elementary School in San Fernando, said, “The culture of the union is: Everyone is the same. You can’t single out anyone for doing badly. So as a result, we don’t point out the good either.”

Value-added is the worst form of teacher evaluation, except for all the others, writes Chad Aldeman on The Quick and the Ed.

Los Angeles Unified now plans to share value-added data with teachers privately and hopes to negotiate its use in teacher evaluations with the teachers’ union. Tennessee did just the opposite, Aldeman notes. “Every year since the mid-1990s every single eligible teacher has received a report on their (value-added) results.”

When these results were first introduced, teachers were explicitly told that their results would never be published in newspapers and that the data might be used in evaluations. In reality, the results were never really used in evaluations until the state passed a law last January requiring the data to make up 35 percent of a teacher’s evaluation. That bill, along with 100 percent teacher support for the state’s Race to the Top application that included it, was a key reason the state won a $500 million grant in the first round.

While LA teachers are angry and confused, Tennessee teachers have had time to understand how value-added analysis works and to prepare to accept it.
