Math Expressions, Saxon Math show results

First graders taught with Math Expressions or Saxon Math learned significantly more than students taught with Investigations in Number, Data, and Space (Investigations) and Scott Foresman-Addison Wesley Mathematics (SFAW) in a Mathematica-SRI study for the U.S. Education Department.

An average-performing student’s percentile rank would improve by 9 to 12 points if the school used Math Expressions or Saxon instead of Investigations or SFAW, the randomized study found. The higher-performing programs improved math results in high-poverty schools and in schools with low math scores.

  1. Using this as a justification for Saxon Math would be similar to…

    “People fed a diet consisting of pork rinds appear to be less in danger of starvation than people fed a diet of pine straw. Researchers found that the high-fat, low-nutrient pork rinds appear to have calories that provide the body with energy. Individuals eating a diet of pine straw have found it difficult to absorb anything meaningful from their daily allotment of half a bushel.”

  2. I am unsure why Tom says what he does — I can’t see any parallel.

    The four programs are very different. SFAW is a middle-of-the-road program of the type that has been used around the country for a decade or more. It has real content, but the presentation is choppy and disjointed, with little cohesion and continuity from day to day and lots of visual clutter on the page. Saxon is a very cohesive and focused traditional program, with superbly(!) done practice sections that are conducive to long-term retention. Investigations is a major reform text from the late 1990s, light on practice and content but heavy on long, broad, and deep project activities. I know the least about Math Expressions, but it appears to be a second-generation reform text with more content than Investigations, and it seems to reject spiraling in favor of mastery and fluency.

    The evaluation seems to be methodologically excellent, and the findings show statistically significant, large effect sizes. Both are rare (or better — nonexistent) in curriculum studies. We should wait for more reporting and higher-grade results, but this is finally a study one doesn’t blush at while reading.

  3. This study is good news (assuming it’s not flawed in some way; I haven’t dug around in it to check).

  4. Brandyjane says:

    We’ve used Saxon in the first through fifth grades at our school for several years and have seen very positive results. Unfortunately, Saxon was bought out by one of the big textbook makers (my brain is too tired to remember which one right now), and they’re changing it up a bit to make it more like everything else that’s out there. Now we’re experimenting with Classical Math, which is basically Saxon the way it used to be. So far the parents and teachers are pleased. (I teach junior high, and I don’t teach math, so this is basically the extent of my knowledge about Saxon.)

  5. They mentioned that the teachers in each category were given training by representatives of the publisher. I wonder if the differences between texts could be due to the differing quality of this training, rather than or in addition to differences in the quality of the texts themselves.

  6. Saxon seems to be one of those “love it or hate it” things. Some kids really do well with all the repetition it includes, but others can’t stand it for that exact reason.

    I wish the researchers had included Singapore Math in the study.

  7. No study is perfect, but this one comes as close as one can realistically imagine.

    Regarding training by the rep: As far as I know, the participating schools were given training by different trainers, or as the report says (top of p. 28): “These trainings typically were spread across many representatives from each publisher.” To the extent that training quality is an attribute of the program itself, it deserves to be associated with that program. Training quality attributable to individual trainers would be mostly factored out by the variability among multiple trainers. One of the good things about the study design was that each school piloted all 4 programs, so all the program-unspecific support (e.g., math coaches) was equally available to all teachers if they wanted to use it.

    We may hear more on this in time, as the implementation and classroom observation report is forthcoming.