Jerry Brown: Data is useless

School performance data is a “siren song for school reform” (pdf), wrote California Gov. Jerry Brown in vetoing a bill to add “multiple indicators,” such as graduation rates, to the state’s Academic Performance Index.

This bill requires a new collection of indices called the “Education Quality Index” (EQI), consisting of “multiple indicators,” many of which are ill-defined and some impossible to design. These “multiple indicators” are to change over time, causing measurement instability and muddling the picture of how schools perform.

SB547 would also add significant costs and confusion to the implementation of the newly-adopted Common Core standards which must be in place by 2014. This bill would require us to introduce a whole new system of accountability at the same time we are required to carry out extensive revisions to school curriculum, teaching materials and tests. That doesn’t make sense.

Finally, while SB547 attempts to improve the API, it relies on the same quantitative and standardized paradigm at the heart of the current system. The criticism of the API is that it has led schools to focus too narrowly on tested subjects and ignore other subjects and matters that are vital to a well-rounded education. SB547 certainly would add more things to measure, but it is doubtful that it would actually improve our schools. Adding more speedometers to a broken car won’t turn it into a high-performance machine.

Over the last 50 years, academic “experts” have subjected California to unceasing pedagogical change and experimentation. The current fashion is to collect endless quantitative data to populate ever-changing indicators of performance to distinguish the educational “good” from the educational “bad.”

. . . SB547 nowhere mentions good character or love of learning. It does allude to student excitement and creativity, but does not take these qualities seriously because they can’t be placed in a data stream. Lost in the bill’s turgid mandates is any recognition that quality is fundamentally different from quantity.

There are other ways to improve our schools — to indeed focus on quality. What about a system that relies on locally convened panels to visit schools, observe teachers, interview students, and examine student work? Such a system wouldn’t produce an API number, but it could improve the quality of our schools.

Actually, I doubt it.  Maybe a state school inspector could evaluate school quality without student performance data by looking for signs of good character and love of learning.  Maybe not. A local committee would be easy to snow.

The vetoed bill, SB 547, had broad support, notes John Fensterwald of Educated Guess. The proposed Education Quality Index could have included “dropout rates, the need for remediation in college, success with career technical education programs, and graduation rates.” Standardized test scores would have counted for no more than 40 percent of the score in high school. While critics “questioned whether the EQI would be too squishy,” Brown complained “it would have demanded more of the same, hard data.”


  1. What no one is mentioning here is the time taken out of the school year to assess the new standards and curriculum. In my district, one of the largest five in California, our department now has 18 (!) days total devoted to a combination of CST testing, district assessment, and department-wide testing.

    That’s almost a full month of instructional time lost to testing. The school year has not increased, and we stand to lose up to five more days due to furloughs.

    I don’t know whether to laugh or cry. We’re heaped with tests that we don’t have time to prepare students for, and then we’ll be chastised for lower test scores.

    Gotta go. Joseph Heller’s on the line.

  2. Agreed. I have yet to see a single test or score report in my state (NY) that takes into account factors beyond a school’s control. I’ve worked in five districts over seven years, from an inner city school plagued by violence, to a depressed blue collar town, to a high income university city, and even the stereotypical middle class suburb. The demographics of the community hold the most sway over each cohort’s performance and improvement.

    Add to that confusing and ever-changing assessments that force many schools to game the system like a bunch of Vegas oddsmakers, plus interference by outside parties (state ed and unions) in hiring, firing, and curriculum, and in the end very little authority is left within the actual district.

    So, while data is definitely useful within the school to guide lessons, applying it in a general sense to compare anything beyond individual students is folly and will do more harm than good.

  3. Brown makes a good point, that much of the data being relied upon by school reformers is actually pseudo-data. As you say, it’s easy to snow a local committee. Pseudo-data makes it child’s play.