Catching up to the world? Not so much

Despite gains in reading, math and science, U.S. students remain in the middle of the international pack, concludes a Harvard analysis in Education Next.

From 1995 to 2009, the U.S. ranked 25th out of 49 nations in fourth- and eighth-grade test score gains in math, reading, and science. In the fastest-improving countries (Latvia, Chile, and Brazil), students are improving at nearly three times the U.S. rate. Portugal, Hong Kong, Germany, Poland, Liechtenstein, Slovenia, Colombia, and Lithuania are improving at twice the rate of U.S. students.

Over the 14-year period, U.S. fourth- and eighth-graders raised their NAEP scores by nearly “the equivalent of one additional year’s worth of learning.”

Yet when compared to gains made by students in other countries, progress within the United States is middling, not stellar. While 24 countries trail the U.S. rate of improvement, another 24 countries appear to be improving at a faster rate. Nor is U.S. progress sufficiently rapid to allow it to catch up with the leaders of the industrialized world.

Within the U.S., Maryland, Florida, Delaware, and Massachusetts improved at two to three times the rate of Iowa, Maine, Oklahoma, and Wisconsin.

A fraction of the U.S. gains can be attributed to the “catch-up” effect: low achievers have more room for improvement. Spending more on education had little effect:

. . . on average, an additional $1,000 in per-pupil spending is associated with an annual gain in achievement of one-tenth of 1 percent of a standard deviation. But that trivial amount is of no statistical or substantive significance. Overall, the 0.12 correlation between new expenditure and test-score gain is just barely positive.

The gains in elementary and middle school fade by high school, the authors write. U.S. 17-year-olds have shown “only minimal gains” over the past two decades.

Comments

  1. U.S. test scores have always sucked compared to other developed nations’, and it’s a fatal flaw in any report about the subject to omit that crucial point. Reformy propaganda generally falsely claims (or sneakily implies) otherwise.

    Also, by the way, pre-emptively: It’s invalid, always and at all times, to claim that the U.S. has lower high school grad rates than other nations. That’s because in other developed nations, students on non-college secondary school tracks legitimately graduate from school at or around age 16, after the equivalent of our 10th grade. This is the case in some, many, most or all developed nations (as an unpaid volunteer, I don’t have the wherewithal to research that), but if it’s the case in ANY of them, it’s not valid to compare U.S. grad rates to those of other developed nations.

    In other words, a student who leaves high school at age 16 in the Netherlands, Switzerland and other nations may be a graduate, while a student who leaves high school at age 16 in the U.S. is a dropout.

    • Oh grow up. You’re not in charge, you don’t decide what’s valid or invalid and, oh by the way, the U.S. has lousy test scores compared to nations that spend less, in some cases much less, than the U.S. does. Among the OECD countries, only Switzerland spends more, but then they’re also getting more for their money.

      Hey, maybe that’s why so many people in the U.S. think the public education system sucks!

      Even without expertise or oodles of studies people understand they’re being screwed. That sort of thing does tend to cause a hardening of attitudes towards failure no matter how artful the excuses.

  2. The report doesn’t claim U.S. scores are worse than in the past compared to other countries, nor does it compare U.S. high school graduation rates with rates in other countries. In fact, it explicitly excludes 17-year-olds because the data isn’t comparable.

  3. Joanne, I think the point that CarolineSF was trying to make is that both you and the authors left out the ‘SO WHAT?’ issue. Omitting the fact that the recently reported trends are similar to those in most international reports over the past two decades ignores that no one seems to be able to prove rich connections between national/state test score averages and either economic success or societal well-being. So why should we care how our test scores rank compared to other nations? That’s the issue that was omitted by the Harvard researchers and yourself. Want to take a stab at that one (with references to peer-reviewed, not just ideological, research)? I know many of us would love it if you would…

  4. I’m in the “so what?” category. In researching statistics, one has to do an apples-to-apples comparison.

    In mortality statistics, for example, some countries do not count infants for up to 30 days after birth. However, the US counts them from birth. Who would you expect to rank better? Certainly not the US.

    These types of rankings never give enough information to make that apples-to-apples (or what we would call a valid) comparison.

    • A side issue here, but in response to your comment: some countries (including some European ones) do not count infants under a certain weight or gestational age as live births. Those infants also do not receive the intensive care that such infants receive here, despite the wonders of universal, national health care. Also, some countries do not count births until the child reaches 1 year. While managing my son’s travel soccer team, I encountered an African team where the birthdate registered on the international FIFA registration card was January 1 for every player. I was informed that if a child survived his first year, he was then registered as a live birth on the following Jan 1. Looking at the team, it was obvious (from their more-advanced physical maturity) that they were significantly older than American teams of the “same” birth/FIFA year. (FIFA uses Aug 1, not Jan 1, to define year groups.)

  5. Richard Aubrey says:

    momoffour.
    You know it, and I know it, but the fact remains that the difference is frequently used to “prove” the US sucks at infant health.