Best ed schools make a difference

Students’ progress can be linked to where their teachers trained, concludes a study of Washington state education schools by Dan Goldhaber of the University of Washington’s Center for Education Data & Research.

“Improving teacher training has the potential to greatly enhance the productivity of the teacher workforce,” Goldhaber wrote in the report.

Overall, only a small percentage of the differences in teacher effectiveness were linked to education schools, but the best programs were much better than the worst. The effect outweighed that of smaller class sizes or teacher experience. “Hiring a teacher from the best training program could be equivalent to shrinking a class by five to 10 students,” AP reports.

The National Council on Teacher Quality (NCTQ) is working with U.S. News & World Report to evaluate and rank all 1,400 education schools in the country. NCTQ’s Transparency Central lists all the letters from teacher preparation programs objecting to the review.

The American Association of Colleges for Teacher Education has urged its members not to participate in the “fundamentally flawed” project, reports Teacher Beat.

The AACTE letter also calls the review an “outrage” and a “cause for alarm,” and NCTQ’s tactics “unprofessional.”

If education schools refuse to cooperate, NCTQ will file Freedom of Information Act requests to see course syllabi and hire students to collect and submit documents.

  1. Deirdre Mundy says:

    Is it that the training programs themselves are better, or that the “best” schools attract smarter, better prepared teachers than the “worst” schools?

    • I can’t imagine how incoming student quality would not be part of the equation. I would also be interested in the ES-MS-HS breakdown, because I have the impression that the ES teachers are the weakest link, and the schools don’t seem to remediate their weaknesses.

  2. Hi Deirdre,

    If you read the study, the authors try to address your question.

  3. It’d be nice to see if the graduates of the best teacher schools are more employable than average.

  4. Michael E. Lopez says:


    I’m not going to actually read the entire study right now… but the article about the study does seem to suggest that the authors don’t address the question at all, which doesn’t quite jibe with what you just said to DM.


    But he is quick to point out this research does not reveal why one teacher training program is superior to another. Part of the difference could be attributed to how selective a program is in choosing its students.

    “I think both are important,” Goldhaber said. It doesn’t matter if the program is selecting better candidates or just turning everyone into really good teachers, he added.

    That sounds an awful lot like “I have no clue whatsoever on this issue.”

    • Deirdre Mundy says:

      Also, the whole “It doesn’t matter if a program is just selecting good candidates” leaves open the question of what extra value the “good” programs supply.

      If the difference is just “our people are smarter,” maybe we should scrap ed school altogether and just go with subject-area knowledge plus apprenticeship as a means of teacher training.

      • Then you’d have all manner of angry ed-school profs at the OWS rallies, carrying signs like “PhD, once tenured, now unemployed”.  Think of their inner children!

  5. Christina Lordeman says:

    I agree that selection probably accounts for these findings rather significantly. Students entering the better ed schools are probably better students. In order to determine if certain ed schools actually produce better results than others, we’d have to compare students who were at comparable performance levels prior to entering ed school.

    And since the article brought it up, I also think it’s time someone pointed out that “classroom experience” in teacher training programs is not all it’s cracked up to be. My MA program at NYU Steinhardt required A LOT of classroom time – I was required to observe and assist in a classroom at least 10 hours a week for the entire first year, and to student-teach at least 20 hours a week for the entire second year – but I found it minimally helpful. I spent an awful lot of time sitting in classrooms watching teachers fail to teach their students, and even though the student-teaching requirement was heftier than what a lot of programs require, I was never required to actually teach a full day’s schedule on my own.

    Classroom experience as an element of ed school is only valuable to the extent that the teachers in those classrooms are top-notch educators themselves. (Ironically, the teacher I learned the most from during my many classroom hours in ed school was a TFA alum.) And if the coursework doesn’t cover essential aspects of teaching – like curriculum development and advanced coursework in the subject area, which my program completely left out – ed school becomes effectively useless.

  6. Roger Sweeny says:

    At the beginning of the year, most high school science classes begin with the idea of controlled experiments: you only allow one thing to vary so you can determine what (if any) effect it has. If you can’t actually run an experiment, you have to use rich data and do the best you can to “statistically control” for any difference between the groups. If you don’t control for all important variables, your research isn’t worth much.

    Unfortunately, the article doesn’t link to the research. But the Goldhaber quote seems to indicate that they haven’t controlled for differences in the people who enter the programs. If that is the case, the research is bad science. If it is science at all: it may have the same validity as running a lot of numbers and finding that members of high school basketball teams are taller than average, but then concluding that joining the basketball team makes students taller.

    Unfortunately, failure to control for group differences is fairly common in ed research. It is one reason for the poor reputation ed research has.
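[Editor’s note: The selection-effect concern raised in the comments can be made concrete with a toy simulation. This is a hypothetical sketch, not an analysis of the actual study: one “program” admits only above-average applicants but adds zero training value, yet its graduates still look better if you fail to control for incoming ability.]

```python
# Toy simulation of selection bias: a selective program that adds NO
# training value still shows a large "program effect" in a naive
# comparison of graduate outcomes -- the basketball-and-height point.
import random

random.seed(0)

# Latent ability of 10,000 applicants (standard normal).
applicants = [random.gauss(0, 1) for _ in range(10000)]

# Program A admits only above-average applicants; Program B takes the rest.
program_a = [a for a in applicants if a > 0]
program_b = [a for a in applicants if a <= 0]

# Neither program improves anyone: outcome = ability + measurement noise.
outcome_a = [a + random.gauss(0, 0.5) for a in program_a]
outcome_b = [a + random.gauss(0, 0.5) for a in program_b]

mean_a = sum(outcome_a) / len(outcome_a)
mean_b = sum(outcome_b) / len(outcome_b)

# Naive comparison: Program A looks far "better" purely via selection.
print(round(mean_a - mean_b, 2))
```

The gap comes entirely from who walks in the door, which is why comparing graduates without controlling for incoming ability cannot distinguish selection from training.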