Year after year, some Los Angeles elementary teachers improve their students’ performance. Some don’t. The Los Angeles Times analyzed seven years of the district’s data to gauge the effectiveness of 6,000 third- through fifth-grade teachers.

At Broadous Elementary School in a low-income Latino neighborhood, Miguel Aguilar and John Smith teach fifth graders.

On average, Aguilar’s students started the year in the 34th percentile in math compared with all other district fifth-graders. They finished in the 61st. Those gains, along with strong results in English, made him one of the most effective elementary school teachers in the district.

On this day, Aguilar had invited a student to the board to divide two fractions — a topic on the upcoming state exam. As his classmates compared notes in whispers, the boy wrote out his answer. Aguilar turned to the class.

“Do you agree?” he asked, without hinting at the correct response.

“Yes!” they called back in unison.

“Good,” he said softly, allowing a faint smile. “You know this.”

John Smith’s students in Room 25 were studying fractions too.

Speaking in a slow cadence, he led his class in reciting a problem aloud twice. He then called on a student slouched in the back. The boy got the answer wrong.

“Not so much,” Smith said dryly, moving on to another pupil without explanation.

. . . On average, Smith’s students slid under his instruction, losing 14 percentile points in math during the school year relative to their peers districtwide, The Times found. Overall, he ranked among the least effective of the district’s elementary school teachers.

Times reporters looked at students’ prior performance to analyze the “value” added by a particular teacher.

For example, if a third-grade student ranked in the 60th percentile among all district third-graders, he would be expected to rank similarly in fourth grade. If he fell to the 40th percentile, it would suggest that his teacher had not been very effective, at least for him. If he sprang into the 80th percentile, his teacher would appear to have been highly effective.

Any single student’s performance in a given year could be due to other factors — a child’s attention could suffer during a divorce, for example. But when the performance of dozens of a teacher’s students is averaged — often over several years — the value-added score becomes more reliable, statisticians say.
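The averaging idea can be sketched in a few lines. This is a minimal illustration of the principle, not the Times’ actual model; the function name and all the numbers are made up for the example.

```python
# Hypothetical sketch of the averaging behind a value-added score.
# Each record is (start_percentile, end_percentile) for one student-year.

def value_added_score(records):
    """Average percentile-point change across a teacher's students."""
    changes = [end - start for start, end in records]
    return sum(changes) / len(changes)

# A single noisy student-year tells us little on its own...
print(value_added_score([(60, 40)]))   # -20.0

# ...but averaging many student-years, often over several years of data,
# smooths out individual circumstances like a divorce at home.
many_years = [(34, 61), (40, 55), (50, 48), (30, 52), (45, 60)]
print(value_added_score(many_years))   # 15.4
```

The point of averaging over dozens of students is the same as in any noisy measurement: individual swings largely cancel out, leaving the teacher-level signal.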

Education, training and experience do not correlate with effectiveness, the LA Times’ analysis found. The most effective teachers have different personalities and styles, though all tend to “be strict, maintain high standards and encourage critical thinking.”

Karen Caruso, certified by the National Board for Professional Teaching Standards, teaches future teachers at UCLA and leads well-regarded Third Street Elementary’s teacher reading circle. The principal considers her one of the school’s best teachers, but the Times’ analysis puts the third-grade teacher in the bottom 10 percent in raising students’ scores.

On average, her students started the year at a high level — above the 80th percentile — but by the end had sunk 11 percentile points in math and 5 points in English.

Caruso said she was surprised and disappointed by her results, adding that her students did well on periodic assessments and that parents seemed well-satisfied.

“Caruso set clear expectations for her students but seemed reluctant to challenge them,” writes a Times reporter who sat in on her class.

Down the hall, fourth-grade teacher Nancy Polacheck, who’s been teaching for 38 years, ranks in the top 5 percent of elementary school teachers.

Polacheck said her colleagues at Third Street think her expectations are too high. She was reluctant to be singled out in any way, repeatedly asking a reporter why she was being interviewed.

“In the past, if I were recognized, I would become an outcast,” said Polacheck, who eats her lunch alone in her classroom. “They’d say, ‘She’s trying to show off.’”

The teachers named as ineffective — which had to be humiliating — said they’ll look for ways to improve. They complained that the district hadn’t shown them the data.

This is the first of a series, which will assess individual teachers for whom there are seven years of data.

Via the Hechinger Report.

Update: The LA Times series will prove to be the article of the year, writes Tom Vander Ark on National Journal.

United Teachers of Los Angeles, the union, is “really, REALLY peeved” by the “potentially explosive” story, writes Stephen Sawchuk at Teacher Beat. There’s talk the union will ask teachers to boycott the LA Times. But now that people know the data is available, they’re going to want to use it, he predicts.

Rick Hess thinks value-added analysis isn’t completely ready for prime time and says the Times should not have used teachers’ names. I have to say naming the teachers made me uncomfortable.

Dan Willingham also thinks value-added analysis is not good enough to use for evaluating individual teachers, though his example doesn’t make sense to me. He adds that a consensus has emerged that something has to be done about incompetent teachers. Most districts don’t have “a mechanism by which to ensure that incompetent teachers are not teaching.”

I have said before that if teachers didn’t take on the job of evaluating teachers themselves, someone else would do the job for them.

. . . This is the time for the teachers’ unions to make teacher evaluation their top priority. If they don’t, others will.

I don’t believe the teachers’ unions can take on this challenge, but they’d be wise to try.

“Not so much,” Smith said dryly, moving on to another pupil without explanation.

?!!?!!?!?!??!

What kind of a moron do you have to be not to explain the thing? Hell, the first thing they teach in military methods of instruction is, “Tell them what you’re going to tell them. Tell them. Tell them what you told them. Have them tell you what you told them.” Any corporal knows that.

This guy needs special instruction????? How about some basic, really basic, sort-of-remedial common sense?

I’m wondering what it says about the organization that employs these two teachers that the one gets no recognition or reward and the other is still employed?

So, a newspaper can come in, gather the data, analyze it and determine who is kicking butt and who is getting their butt kicked, but the principals and other school district administrators can’t?

I wonder just how much our recent economic problems have had to do with our population just getting too stupid to be employable. Clearly we’ve become too stupid to borrow money without bankrupting ourselves. Apparently, fewer than half of us can even calculate a tip:

http://redtape.msnbc.com/2009/12/when-i-published-gotcha-capitalism-two-years-ago-i-was-in-for-a-big-surprise-as-i-talked-about-systemic-hidden-fee-fraud-al.html

You know, this education thing, it’s important. It isn’t a game. Those kids that leave high school unable to compute a tip are crippled for life. They’ll never have another chance to get a public education. It’s a one-shot deal; screw it up and you have ruined that person’s life and cost the economy hundreds of thousands of dollars.

In short, education is far too important to let a union come anywhere near it.

I am hoping more newspapers do the same thing – the monopoly of government education needs to be busted…

What’s even worse than the school district continuing to employ some of these idiots is that some are getting recognized for ‘superior’ achievement through National Board Certification (NBC). Every school I’ve been at has had teachers seeking NBC because of the bump in pay (school districts in NY love it)… and many non-NBC teachers frankly state that the students whose teacher is seeking it suffer for it. One NBC teacher stated that he did the same activity five times with the same class to perfect the video he was shooting.

Hrm.

This study is, as best I can tell, comparative only. That is, it tells us only who the most effective teachers are, not which teachers are effective.

Here’s what I mean:

Assume all teachers are really, really good at their job.

Assume also some degree of movement in exam scores among students — that is, people do better in some years and worse in others.

Assume also that the performance is somehow linked to the teaching or — at the very least — to the teacher (otherwise the study is worthless).

Some students will get better, some will get worse. Everyone can’t get better because we’re basing this on a percentile system: some students will always perform below average no matter how impressive their performance. (I suppose there could be an all-way tie for 100% but let’s be realistic.)

So even if our very good teachers are doing a very good job with their students, our very good teachers are going to be outshone by the frickin’ magnificent teachers whose students improve more.

Now obviously this [b][i]probably[/i][/b] isn’t really the case in the LAUSD. But I’d be wary about extending such studies outside of the most blighted, categorically dysfunctional institutions/districts.

Bah. I’m having to switch daily between brackets and carets and I apparently am unequal to the task.

I totally agree that this data is another valuable tool in the toolbox of teacher assessment; all teachers should be held accountable. Now I am still waiting for the assessment tools for holding students and parents accountable, because learning is a slow and arduous process over time. Factors outside the classroom impact success and failure every bit as much as, or more than, instruction alone. Oops… I forgot… only teachers are currently being held accountable. This “value-added” assessment will join the pile of discarded edufads of recent history. Nothing will ever replace good teaching and a consistently strong teacher/student/family work ethic, morality and sense of civic responsibility. ’Nuff said.

1) Do their statistical techniques assume random assignment of students to teachers? Of course they do!

2) Does LA have real random assignment of students to teachers? Of course not!

The paper did a bad statistical analysis — one whose assumptions are violated — and published it itself. That makes it credible? I hope not, but know that people will believe it anyway.

****************

Does the paper acknowledge that there are content standards not included in these tests, or that there are elements of teaching not reflected in the content standards?

I couldn’t find that, in my cursory examination.

Value-added analysis doesn’t require random assignment or information about the students’ family backgrounds. Teacher A starts with students averaging 20th percentile performance and ends the year with students at the 30th percentile on average. Teacher B starts with students at the 80th percentile, on average, who finish at the 70th percentile. Value-added analysis says the first teacher is doing well and the second teacher is not. (The LA Times used seven years of data to avoid a random good or bad year.)

Value-added analysis does assume that test data measures student performance accurately.
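The Teacher A / Teacher B comparison above comes down to simple arithmetic on class-average percentiles. A quick illustrative sketch (the labels and helper are this sketch’s, not anything the Times published):

```python
# Value-added looks at the change in percentile, not the absolute level.
def percentile_change(start, end):
    return end - start

teacher_a = percentile_change(20, 30)   # started low, gained ground
teacher_b = percentile_change(80, 70)   # started high, lost ground

print(teacher_a)              # 10
print(teacher_b)              # -10

# Teacher B's students still outscore Teacher A's in absolute terms,
# but A added value while B's students fell relative to their peers.
print(teacher_a > teacher_b)  # True
```

This is why the method needs no information about family background: each student serves as his own baseline.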

We can argue about how much test score changes should count in teacher evaluation; it may (as the story indicated) be a factor that needs to be used very conservatively. But we can’t say that test scores never reveal anything about how well teachers are performing their jobs.

I wish Eduwonkette were here to evaluate this evaluation. Her blog is sorely missed in these times.

“John Smith” – the one who doesn’t bother to explain what the kid got wrong – at least drilled the kids in problems. He was working hard compared to my 8th grade English teacher. My parents checked with the neighbors; this teacher had stopped bothering even to try to teach at least 20 years earlier. For the whole 20 years, every 9th grade English teacher had to know it – each had several kids who seemed to be good students but were a full year behind, all with the same teacher the previous year. The school district was pretty good overall, so this really stood out. The one junior high in this town included grades 7 through 9, so all these teachers reported to the same principal. And yet this teacher who didn’t teach was allowed to foul up kids’ education until retirement.

I’ve got a pretty poor opinion of the principal in charge at the time, for more reasons than just one bad teacher, but the deadwood should have been fired long before that principal was first hired. Many principals and school boards had decided that a confrontation with the union was simply too much trouble.

And Ceolof is one of the idiots – or fiends – that helps principals and school boards keep making those decisions.

“Value-added analysis says the first teacher is doing well and the second teacher is not.”

I’m all for teacher testing, but there are many cases – particularly in high school – where value added measured against prior years, rather than against the starting point within that year, will be impossible to measure. A student who goes from far below basic to basic in pre-algebra had a good teacher. If he then goes from far below basic to below basic in algebra, he also had a good teacher. It’s just that algebra is harder to learn.

“I wish Eduwonkette were here to evaluate this evaluation. Her blog is sorely missed in these times.”

I totally agree… and likewise, Gerald Bracey.