Common tests lose support

Forty-five states and the District of Columbia are moving forward on Common Core Standards, but support for common testing is eroding, reports StateImpact.

Georgia will use its own exam, instead of the costlier test developed by the Partnership for Assessment of Readiness for College and Careers (PARCC).

Two of Florida’s top elected leaders want Florida to leave PARCC, even though Florida is the fiscal agent for the testing consortium.

Already Alabama, North Dakota and Pennsylvania have left the consortium. Oklahoma plans to design its own test, and Indiana isn’t participating in PARCC governing board meetings right now. State education officials say they’re waiting until after a mandatory legislative review of the Common Core academic standards.

That brings the number of states participating in PARCC down to 18 plus the District of Columbia.

Pennsylvania, Utah and Alabama quit the other testing group, Smarter Balanced Assessment Consortium, which now has 24 members. (Some states had joined both groups.)

The crumbling of the testing consortia is a “disaster,” writes Andy Smarick on Flypaper.

At this point, I won’t be surprised if we end up with 20 or more different testing systems in 2014–15. So much for commonness, so much for comparability. Rigor and alignment with tough standards are likely the next to fall.

Blinded by “technocratic hubris,” common assessment advocates “underestimated how difficult it would be to undo decades of state policy and practice on tests,” writes Smarick. Governors and state chiefs will be reluctant to spend lots of money for a testing system that will make their schools and teachers look bad, he predicted six months ago.

The Common Core sky isn’t falling, responds Checker Finn, also a Fordhamite. This is “right sizing.”

The forty-five-state thing was always artificial, induced by Race to the Top greed and perhaps a crowd mentality. Never in a million years were we going to see forty-five states truly embrace these rigorous academic expectations for their students, teachers, and schools, meet all the implementation challenges (curriculum, textbooks, technology, teacher prep, etc.), deploy new assessments, install the results of those assessments in their accountability systems, and live with the consequences of zillions of kids who, at least in the near term, fail to clear the higher bar.

It’s “better for states to drop out in advance than to fake it, pretending to use the Common Core standards but never really implementing them,” Finn writes. “That’s long-standing California-style behavior (fine standards, wretched implementation), in contrast with Massachusetts-style behavior (exemplary standards and serious implementation—and results to show for it).”

Most of the drop-out states will keep the standards, but write their own tests or sign up with ACT. They’ll give up comparability, “one of the major benefits of commonality,” Finn writes. Some may change their minds later “or face up to the fact that (like Texas and Virginia) they don’t really want to use the Common Core at all.”

New ‘core’ test will be (a bit) shorter, simpler

One group developing tests aligned to new standards will make exams shorter and simpler — but less capable of providing detailed feedback on students’ performance, reports Ed Week.

The Smarter Balanced Assessment Consortium, one of two groups crafting tests for the Common Core State Standards, will include only one lengthy “performance task” in each subject—mathematics and English/language arts. The test will include multiple-choice, short-response and “technology-enhanced” questions. But it won’t be all that quick: SBAC estimates seven hours of testing in grades 3-5, 7½ hours in grades 6-8, and 8½ hours in grade 11.

A performance item might ask students to “tackle longer, more complex math problems and write essays based on reading multiple texts,” reports Ed Week.

In this version, students will be evaluated on math concepts and procedures, communicating reasoning, and problem-solving/modeling/data analysis, and on reading, writing, listening, and research.

“Is it about getting data for instruction? Or is it about measuring the results of instruction? In a nutshell, that’s what this is all about,” said Douglas J. McRae, a retired test designer who helped shape California’s assessment system. “You cannot adequately serve both purposes with one test.”

That’s because the more-complex, nuanced items and tasks that make assessment a more valuable educational experience for students, and yield information detailed and meaningful enough to help educators adjust instruction to students’ needs, also make tests longer and more expensive, Mr. McRae and other experts said.

Separate formative and interim tests can help teachers figure out what students need to learn, while the end-of-the-year test is used for accountability, say SBAC designers. Teachers will be able to use an online bank of test questions and tasks and a bank of “formative” tools to judge students’ learning.