Study: Evaluation works in DC

The District of Columbia’s teacher evaluation system — with rewards for the best and firing for the worst — is working, according to a new study. “Teachers on the cusp of dismissal under D.C.’s IMPACT evaluation system improved their performance by statistically significant margins, as did those on the cusp of winning a large financial bonus,” reports Ed Week.

D.C.’s IMPACT evaluation system relies on a complex mix of factors to score each teacher, including both multiple observations and measures of student achievement. Teachers deemed ineffective under the system can be dismissed, while those scoring at the “minimally effective” level, the second lowest, get one year to improve. Teachers who earn the “highly effective” rating are eligible for bonuses of up to $25,000. Earning successive “highly effective” ratings also permits teachers to skip ahead several steps on the salary scale.

Since its rollout, IMPACT has led to the dismissal of several hundred teachers.

The much-reviled Michelle Rhee started IMPACT when she was chancellor, jump-starting the evaluation program with foundation grants.

Are D.C. students learning more? The study didn’t look at student achievement.

Comments

  1. It sounds less like evaluation works than that significant rewards or punishments work to get teachers to improve against criteria that have not been demonstrated to be valid.

    • Oh, let’s just assume the criteria are valid, since the alternative is no criteria, no rewards or punishments, and no effort to distinguish good teachers from bad.

  2. Roger Sweeny says:

    “Are D.C. students learning more? The study didn’t look at student achievement.”

    That is hysterically funny and unbearably sad.

  3. Roger Sweeny says:

    Hysterically funny because we are expected to judge the effectiveness of a teacher evaluation system without any consideration of student achievement.

    Unbearably sad because we are expected to judge the effectiveness of a teacher evaluation system without any consideration of student achievement.

  4. The study did look at the effect of the use of a metric (IMPACT) which included measures of student achievement. The study outlines on page 11 the various parts of IMPACT, which include a teacher’s contribution to student growth in performance on tests, taking student traits and background into consideration.

    It’s an included factor for reading and math teachers (“individual value added”). For teachers in other subjects, there’s a “school estimated value added” measure. (Pages 11 & 12 in the study)

  5. Mike in Texas says:

    Ah, another “working paper,” published without any kind of peer review. To make the results fit what they wanted, they invented a new term, “school estimated value added.”

    • cranberry says:

      The researchers didn’t invent the term. It’s an element of the IMPACT system; from a brief reading of the study, only reading and math teachers’ value-added scores can be imputed from test scores. There are multiple other factors in the system, including principals’ opinions of teachers’ collegiality and professionalism.

      • Mike in Texas says:

        Then the creators of IMPACT invented it. It still means nothing.

        • Roger Sweeny says:

          Well, of course, it means something. I suppose the important questions are, “does it mean something useful?” and “are there better measures?”

          The same questions could be asked about GDP, the unemployment rate, and your end-of-year grade in English.

    • As a connoisseur of fraudulent scholarship, I’m sure you’ve got a union web site that firmly rebuts the conclusions of this study by invoking the names of the Koch brothers and other demons.

      By the way, Arne Duncan’s still the SecEd and he’s still running around talking charters and the whole “reformy” agenda. What’s up with that, hey??