Study: Few gains in ‘gifted’ classes

Gifted and Talented magnet programs didn’t improve achievement in reading, math or social studies in a University of Houston study reported in Education Next. Students in gifted programs did learn more science compared to similar students who just missed the eligibility cut-off or those who qualified but lost a lottery.

Researchers aren’t sure why students didn’t accelerate their already high achievement when placed with high-achieving classmates in a more challenging program. One theory is that students who just made the cut-off were discouraged by going from the top of their old class to near the bottom of their new classes. Another is that high achievers who missed entry into the gifted magnets were able to find high-quality alternatives. The science gains may be a result of better teachers and well-equipped labs.

 


Comments

  1. Another possibility is that many gifted curricula add breadth instead of depth. I learned to read basic hieroglyphics in 6th grade G&T ancient history. Somehow I doubt that showed up in any of my standardized testing.

  2. Catherine says:

    I’m not surprised to see this. My daughter goes part-time to a charter school that tries to cater to “advanced” students, but the curricula they use include constructivist math (Everyday Math) and language arts methods that encourage things such as invented spelling. For the “bright” but not prodigiously gifted child, these methods are counterproductive. Bright children are not gifted enough to construct math on their own, and they need drilling in math facts, but they are certainly intelligent enough to be taught to spell and write properly. The school, while blessed with very well-intentioned staff and teachers, seems to fall short in teaching academic basics while still taking in many, many students who need them. (They can’t limit enrollment based on intelligence tests.) In my experience, gifted and talented programs seem to be for any bright child, not just the clearly gifted, even though the educational needs of these two groups of children are often quite different. My daughter falls somewhere on the border between these two groups, and I have to teach her math and language arts at home to make sure she actually learns them. Science “investigations” at school are a big hit with us both, though.

  3. Stacy in NJ says:

    Most G&T programs are crap.

    • Peace Corps says:

      The one at my son’s school is crap. He is a fourth grader. I don’t know the criteria used for selection, but about 35% of his class are in G&T. I thought that state law set the percent at no more than 10%. I guess I thought wrong.

      • Arbitrary percentage requirements across districts or states are also “crap,” just like the arbitrary percentages used in TX college admissions, and for the same reason: student ability/performance varies widely across schools. Some schools may have few, if any, students who are able to meet hard qualifications (ITBS, SSAT, SAT, etc.), and some schools may have most of their students at that level. Certainly the latter was true at the ES my older kids attended. With that many high-performing kids, the curriculum should have been adjusted even more than it was (several levels of math and LA at each grade). Of course, GT programs may still be weak (we didn’t have our kids apply, for that reason), substituting artsy “enrichment” for more and deeper material at a faster pace, while using the same weak curriculum.

  4. Christina Lordeman says:

    Let’s be careful not to lump “gifted and talented” and “magnet” into one category. There are many programs for gifted students within public schools (often offered as enrichment or occasional pull-out classes) which are highly beneficial to those students. Generally speaking, the value of a gifted program is a function of the aptitude of the teacher.

    We also need to take this data with a grain of salt. Is the goal of magnet schooling to improve students’ standardized test scores? Perhaps students in magnet schools perform no differently on standardized tests than they would if they attended regular public schools, but they have valuable educational opportunities that would otherwise not be afforded them. Just because a magnet school doesn’t improve test scores doesn’t mean it isn’t valuable.

  5. I’ve always wondered where Joanne gets her right-wing anti-public school talking points. I think I have my answer now.

    I wonder if she noticed the article did not appear to be peer-reviewed and there was no detailed analysis available, just a fluff piece about their results (try clicking on the “No PDF” link).

    • Michael E. Lopez says:

      You can’t be bothered to do SEVENTEEN SECONDS of research.

      Now I fully admit that the article is a fluff piece about their results. That’s why it’s in EducationNext — which is precisely the sort of resource that edu-bloggers like Joanne use for their informal, non-academic work. But you might think about using Google before you start ripping into people.

      The study in question hasn’t been peer-reviewed formally yet because it’s still preliminary research. That’s why it’s a “working paper” and why it’s only been posted to NBER and SSRN.

      http://www.nber.org/papers/w17089.pdf
      http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1854191

      In other words, this is a serious study by three economists that is in the midst of getting the sort of peer review that you might expect. This is how the academy works in the digital age with these cool “internet” things. Pieces in non-academic sources like EducationNext are a way to bring these ideas to a wider audience.

      That Joanne links to the informal piece is merely a function of her wearing her blogger hat rather than her academic hat. (If she has one… I honestly don’t know.)

      And finally, the reason the link says “No PDF” is because there’s no bloody PDF. What do you expect it to link to? The author of the article (not the study… just the article) probably saw the abstract on SSRN and assumed that, like many other papers posted there, the entire article wasn’t available. Or they were just being lazy.

      Like someone else I can think of.

      • I don’t have an academic hat.

      • Michael E. Lopez, as far as I can tell, the authors of the article, Sa Bui, Scott Imberman and Steven Craig, are the authors of the study.

        Mike in Texas, if you are interested in courteous debate with your hostess, rather than pointless insults, you could track down the papers online yourself. It’s really not hard. Most researchers provide links to publications and works-in-progress on their university web pages.

        • Michael E. Lopez says:

          I’m quite aware of that. Sorry if I wasn’t clear that when I said that fluff articles are a way to bring something to a wider audience, I meant a way for those authors to bring their stuff to a wider audience.

  6. Deirdre Mundy says:

    I went to a math/science/CS/engineering magnet, and I can safely say that my scores on state tests did NOT improve as a result. Why? Because I was already in the 99th percentile, as were all my classmates. When you’re learning advanced math and science, you need advanced tests to measure the gains. On the other hand, I’m confident that without the magnet I wouldn’t have gotten as many 5s on APs as I did. And when I got to college, the magnet experience meant that my first quarter at the University of Chicago was not any harder than high school, even though I’d been ‘average’ in my high school. Meanwhile, the kids who’d gone to normal schools and who were at the top of their class got walloped…

    • Good point. There’s a good reason why some HS magnet programs use SATs as part of their admission process for potential freshmen; tests at lower levels simply do not discriminate well enough at the top. Even the SAT has that problem. At the end of a top HS magnet, how does one really measure learning in a kid who scored in the mid-700s in both math and verbal in the fall of his 8th-grade year?

  7. I don’t know why these studies are being pushed. I don’t know why this one merits publication in Education Next.

    The lottery study compared G&T students who won admission to over-subscribed G&T programs to students who lost that lottery, and thus remained in their “neighborhood G&T programs.” Some won admission to other magnet schools, and some attended charter schools.

    At any rate, it’s not a comparison between G&T programs and average school programs. It’s a comparison between two over-subscribed G&T programs and other G&T programs or other magnet/charter options. It’s like saying, well, these students were selected to study with the Bolshoi Ballet, while those had to remain in their local, high-pressure ballet school. After two years, the groups were equal in performing jumping jacks. Therefore, ballet training makes no difference.

    The paper seems to take it on faith that the teachers are better in the over-subscribed G&T programs. Well, how do they know that?

    The criteria for admission into the gifted program are…mixed. Figure 1 in the paper reproduces a score sheet for G&T qualification. These criteria produce a heterogeneous (not homogeneous) group. There are “obstacle points,” which are not related to academic achievement. Teacher recommendations play a part. From the paper, “For example, students who score precisely at the eligibility threshold range from 45 to 97 national percentile rankings in reading and between 55 and 98 percentiles in math on the 5th grade exams.”

    And, 18.8% of the lottery losers left the public system. If we assume they’d be more likely to be affluent enough to move or pay private tuition, their leaving the system could itself have an effect.

    • Michael E. Lopez says:

      I just wanna point out, especially in light of Cranberry’s cogent criticism, that I never said the study was any good… merely that it was serious academic work.

      :)

  8. Cranberry says:

    I think part of my problem with the study stems from the authors’ decision not to identify the school district. (That condition may have been set by the district.) The first question that arises is whether the school system’s gifted and talented program is effective. Is it enrichment, or acceleration? The extra points awarded to certain students probably stem from a desire to create a heterogeneous group once ethnicity, SES, and language skills are taken into account. On the other hand, I have a problem with creating a program within a school system that sets apart the kids who rack up high scores on standardized tests, along with the students who behave well in class. Who’s left outside this golden group? Even if there are pragmatic reasons to create such divisions, wouldn’t the teachers with more seniority be likely to gravitate toward teaching the students who have been dubbed the easiest to educate?

    That some parents opt out of the program makes me wonder. I am not concerned about the decrease in grades for students at the edge of acceptance. That’s a well-known effect: moving to a more competitive group will lower one’s grades. It happens in prep schools all the time. The grades have to be lower, because there has to be some way to acknowledge the achievement of the kids at the top of the curve. (I’m not a big self-esteem fan.)

    The kids at the margin did take more advanced courses later in their school careers, which probably reflects the district’s gatekeeping practices. If you come from the G&T program, that’s likely a plus when applying.

    The study authors didn’t choose the Education Next article title. In glass-half-full fashion, one could have pointed out that, given parental choice, students who didn’t get into the district’s premier gifted programs did just as well in other district programs. I don’t think any sensible parent applies to only one school when there’s a chance her child won’t be accepted, so it’s not necessarily a given that all the “lottery losers” regarded themselves as losers.

    For a study of the effect of different high schools on performance, I prefer NBER Working Paper No. 17264, “The Elite Illusion.” I’m reading it now. That study found similar results, in that exam school attendance did not markedly increase achievement.

    • Deirdre Mundy says:

      Cranberry, Megan McArdle highlighted that paper a few weeks back. One huge problem in the methodology was that they used SAT scores to measure “achievement,” while most exam schools take kids who are already near the top of the pack, and really, the difference between an 800 math and a 740 math is just “who was having a better morning on test day.” AP scores are a slightly better measure, but the material covered still tops out BELOW what most exam schools cover in junior and senior year. (For example, we were encouraged to take the AP CS exam in 9th or 10th grade, because those were the last years that classes were at AP exam level.)

      I think a more interesting (but more difficult) study would be to look at how kids from exam schools did at highly selective colleges compared to their similarly IQed peers who didn’t go to an exam school. (And you’d have to control for opportunities, too. A wealthy child who doesn’t go to an exam school may still get the same opportunities elsewhere, while a middle- or working-class kid may not have the same chances, since their parents can’t pay for classes above what the typical HS offers.)

      I know that, personally, being the “average” kid at an exam school put me in a better place for college than my friends who were the smartest kids at a normal school… not because I was brighter, but because after 8 classes a day of college-level work plus the attendant homework, a college schedule seemed almost relaxing! Meanwhile, the kids who were used to slacking and busywork got hit with a ton of bricks. And even GOOD teachers in a normal school can’t realistically provide the level of challenge exam-school-type kids need while serving the needs of other students. (The program I was in used mostly test scores at the time; since then, politicians have monkeyed with it to make it more ‘fair,’ which means there’s more emphasis on teacher recs and extracurriculars, a mistake IMO. It used to take the top 100 test scores in the district, which did lead to a very intense but interesting environment.)

  9. Cranberry says:

    Deirdre Mundy, I think you have an inflated view of the performance of exam school students, probably influenced by your personal experience of an exceptional program. If you look up the Boston Latin School’s test performance on the Massachusetts Department of Education’s website, neither the average SAT scores nor the AP test scores are stellar. Mid-to-low 600 SAT scores are on a level with a good suburban high school. The AP exam results are patchy, not the sort of overall picture one would expect from a school that taught more advanced stuff in junior and senior year.

  10. Deirdre Mundy says:

    That could be… I went to a place where “four” was considered “failed” on the APs, and one that compressed 4 years of science into 2 years to allow students to take higher-level classes. And 100 kids out of a densely populated suburban county is really a tiny, tiny percentage. How many kids does Boston Latin admit? And are they a “straight exam” school, or do they do the whole “holistic” thing? I think “holistic” definitely ends up watering down programs…

  11. The schools used the Stanford Achievement Test, not the SAT.

    • Cranberry says:

      The schools in the original study in the Southwest reported in Education Next used the Stanford Achievement Test. The schools studied in the “Elite Illusion” included Boston Exam schools. SAT (formerly known as the Scholastic Aptitude Test) data, AP test results, and demographic data are available for all Massachusetts schools through the Massachusetts Department of Education’s website.

    • Michael E. Lopez says:

      I know what you mean, Joanne… but you have to admit that what you wrote looks funny.

      “It’s not the SAT! It’s the Sasquetchewan Academic Test!”

      “It’s not the FBI! It’s the Farmer’s Biannual Investigation!”