Charters score well in NY, California

Most New York City charter schools are outperforming schools in their district.

When compared with the overall scores for the school districts in which they are located, some charter schools, such as Bronx Preparatory in the South Bronx and the KIPP Infinity school in Harlem, had as much as double the proportion of students scoring proficient in math and reading.

Also outperforming district schools are two charter schools opened by the city teachers union.

Overall, test scores rose so much in city schools that some suspect the test isn’t valid.

Excluding virtual schools, California charter schools are doing well, reports EdSource. Classroom-based elementary charters do about as well as district-run schools when results are adjusted for demographics. Classroom-based charter middle and high schools, which serve more disadvantaged students, have higher Academic Performance Index scores than noncharters.

Virtual charters have lower math scores. Schools run by Charter Management Organizations tend to be high performers, and these CMO-run schools make up an increasing portion of the total.

  1. Cardinal Fang says:

    The New York test results for all schools, including charters, are bogus. The state test shows enormous progress everywhere. Meanwhile, the national test, the NAEP, shows no progress. To me, that’s a clear indication that the state test is dumbed down so that the schools will look good.

    Also, “Most New York City charter schools are outperforming schools in their district” doesn’t tell us a lot. We want to know the mean score for all charter schools and all non-charter schools.

    Anyone can cherry-pick data to make their side look good. It’s not enough to hear that most charter schools are outperforming the mean non-charter school. Maybe most non-charter schools are also outperforming the mean, and in both cases some terrible schools are dragging the average down (a toy example at the end of this comment shows how).

    I support charter schools, but I also support honest reporting of results.
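
    That last point is easy to demonstrate with made-up numbers: if a few very low-scoring schools drag the district mean down, “most schools beat the district mean” can be true of charters and non-charters at once. A minimal sketch (all scores invented):

        # Invented scores; a few very low outliers pull the district mean down.
        charters = [70, 72, 74, 76, 78, 20]
        district = [68, 70, 72, 74, 76, 15, 18]

        district_mean = sum(district) / len(district)  # 393 / 7, about 56.1
        charters_above = sum(s > district_mean for s in charters)
        district_above = sum(s > district_mean for s in district)

        print(f"district mean: {district_mean:.1f}")
        print(f"{charters_above}/{len(charters)} charters beat the district mean")
        print(f"{district_above}/{len(district)} district schools beat their own mean")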

  2. Margo/Mom says:


    I have been reading about NY and the scores, and lots and lots of allegations. My thinking is that the least likely scenario is that someone at the state level was able to pull off a presto-chango on the test material, resulting in an uptick. Other explanations might have to do with where the cut scores lie and where efforts were focused. In terms of proficiency, we are really looking at a Y/N kind of rating: based on a cut score, students are either Y (proficient) or N (not proficient). Depending on the spread of scores, there could be a significant change in the number of students in the Y category without a significant change in scores overall (for instance, if lots of kids who had been scoring barely below proficient instead scored barely proficient; the first sketch at the end of this comment runs the arithmetic). It gets a bit more fine-grained when you look at levels, since there are 4 instead of 2. I understand that the percentage at the top level declined, which would argue against a dumbed-down test.

    I don’t know about NY, but some states try to guard against an outsized focus on the “bubble kids” (the ones just below the line) by using some kind of proficiency index, which weights scores by level. This helps incentivize attention to improving all students, and it also minimizes the number of students not tested, since a minimal score carries more weight than a non-tested student (the second sketch at the end of this comment illustrates one such index).

    I have also seen charges fly in the opposite direction. Our local superintendent announced publicly that a downturn in scores in one area was due to a “more difficult” version of the test being used in that year. Unless I am sadly mistaken, New York has been in the testing business for a very long time and knows about psychometricians and other techno-folks who are in the business of equating tests from year to year. I know that this is true in my state.
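
    The cut-score point is easy to check with toy numbers: when many students sit just under the cut, a small across-the-board gain can swing the proficiency rate sharply while the average barely moves. A minimal sketch, with an invented cut of 65 and invented scores:

        cut = 65  # invented proficiency cut score
        before = [60, 61, 62, 63, 64, 64, 70, 75, 80, 90]
        after = [s + 2 for s in before]  # a small uniform gain

        def summarize(scores):
            mean = sum(scores) / len(scores)
            pct = 100 * sum(s >= cut for s in scores) / len(scores)
            return mean, pct

        for name, scores in (("before", before), ("after", after)):
            mean, pct = summarize(scores)
            print(f"{name}: mean {mean:.1f}, {pct:.0f}% proficient")
        # before: mean 68.9, 40% proficient
        # after:  mean 70.9, 70% proficient (a 2-point gain, a 30-point jump)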
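    And the proficiency-index idea, sketched with hypothetical levels and weights (not any particular state’s formula):

        # Hypothetical weights: partial credit by performance level; a tested
        # low score still counts for more than a non-tested student (weight 0).
        WEIGHTS = {"advanced": 1.25, "proficient": 1.0, "basic": 0.5,
                   "below_basic": 0.25, "not_tested": 0.0}

        def proficiency_index(levels):
            # Average weight across all enrolled students, tested or not.
            return sum(WEIGHTS[lvl] for lvl in levels) / len(levels)

        school = ["advanced", "proficient", "proficient", "basic", "basic",
                  "below_basic", "not_tested"]
        print(f"index: {proficiency_index(school):.3f}")
        # Moving any student up one level raises the index, so the incentive
        # is not confined to the "bubble kids" just under the proficient cut.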

  3. Margo/Mom–

    Have you been reading Deborah Meier and Diane Ravitch in their “Bridging Differences” blog? If so, you’d be less accepting of the New York test results. I’d certainly be skeptical, and even more so after four years of teaching in my state (which does have a reasonably rigorous statewide assessment).

    You also need to look at the actual percentage of correct answers that results in a student being marked “proficient.” This year I had a parent who was skeptical that their child had a reading problem, because state tests in other states had consistently shown the child as proficient. Well, when you looked at the actual percentage of correct answers, the child was consistently getting only 35-45% right. But because the state labeled that “proficient,” the parent wasn’t worried (the sketch at the end of this comment shows how the same raw score can earn different labels).

    So the child comes to our state and doesn’t do well. It took a DRA reading assessment performed by another professional to convince the parent, and it matched our own diagnostic testing: the student could decode from now until tomorrow but didn’t have the faintest clue what any of it meant. The student had learned to read, but didn’t know how to read to learn.
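
    The raw-score point in a sketch: the same raw performance can earn different labels depending on where a state sets its cut. The cut values here are invented, not any real state’s.

        def label(pct_correct, cut):
            # Invented mapping: "proficient" is whatever clears the state's cut.
            return "proficient" if pct_correct >= cut else "not proficient"

        raw = 40  # the same 40% of answers correct...
        print(label(raw, cut=35))  # ...reads "proficient" where the cut is low
        print(label(raw, cut=60))  # ...and "not proficient" where it is higher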