Apples to applesauce?
Charter schools serving a general student population — not those that target disadvantaged or disabled students — show more improvement in test scores than nearby public schools serving similar students. So says a Manhattan Institute study called Apples to Apples. The difference is small but statistically significant.
Untargeted charter schools made math test score improvements that were 0.08 standard deviations greater than those of neighboring public schools during a one-year period. For a student starting at the 50th percentile, this would amount to a gain of 3 percentile points, to the 53rd percentile. Reading scores showed 0.04 standard deviations greater improvement in untargeted charter schools than in their closest regular public schools over the same year, a benefit that would raise a 50th-percentile student 2 percentile points, to the 52nd percentile.
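The percentile figures follow from a standard rough conversion: assuming test scores are normally distributed, a gain measured in standard deviations maps to a new percentile via the normal curve. A minimal sketch (the function name is mine, not the study's):

```python
import math

def percentile_after_gain(sd_gain):
    """New percentile for a student starting at the 50th percentile
    (z = 0) after gaining `sd_gain` standard deviations, assuming
    normally distributed scores."""
    # Standard normal CDF, computed from the error function
    return 100 * 0.5 * (1 + math.erf(sd_gain / math.sqrt(2)))

print(round(percentile_after_gain(0.08)))  # math gain: 53rd percentile
print(round(percentile_after_gain(0.04)))  # reading gain: 52nd percentile
```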
On Scientifically Correct, Bas Braams says the study is fatally flawed because it measures schoolwide improvement, rather than looking at how students did from one grade to the next.
It seems to me the schoolwide measurement would be an average of the grade-by-grade improvement. But, to paraphrase Inspector Oxford in Frenzy, statistical analysis is not the strong suit of the English major. (“Discretion is not traditionally the strong suit of the psychopath.”)
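For what it's worth, the two measures can come apart. A schoolwide average compares this year's mix of students to last year's mix, so it isn't simply an average of grade-by-grade gains: if a weaker cohort enrolls, the schoolwide number can fall even while every tracked student improves. A toy illustration with made-up numbers:

```python
# Hypothetical averages, a minimal sketch of Braams's objection:
# schoolwide change mixes different cohorts across the two years.

year1 = {"grade3": 40, "grade4": 60}          # average scores, year 1
# Year 2: last year's 3rd graders are now 4th graders, each up 10 points,
# but the new incoming 3rd-grade cohort is weaker.
year2 = {"grade3": 30,
         "grade4": year1["grade3"] + 10}

schoolwide_change = sum(year2.values()) / 2 - sum(year1.values()) / 2
cohort_gain = year2["grade4"] - year1["grade3"]

print(schoolwide_change)  # -10.0: the school *looks* worse
print(cohort_gain)        # 10: every tracked student actually improved
```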
Update: Kimberly Swygert is also dubious about the study’s conclusions. The only part I understand is that charter students may differ from superficially similar students at nearby public schools because of the selection effect: Charter students may have more educationally savvy parents, or they may be kids who’ve done poorly in school, whose parents are seeking an alternative. The stats stuff still baffles me.