Will new tests live up to the hype?

Muslim Alkurdi, 18, of Albuquerque High School, joins hundreds of classmates in Albuquerque, N.M., Monday, March 2, 2015, as students staged a walkout to protest a new standardized test they say isn't an accurate measurement of their education. Students frustrated over the new exam walked out of schools across the state Monday as the test was being given. The backlash came as millions of U.S. students start taking more rigorous exams aligned with Common Core standards.

In 2010, U.S. Secretary of Education Arne Duncan promised teachers that Common Core-aligned Assessments 2.0 would be the tests they had “longed for.”

Millions of students are taking those new tests this spring, writes Emmanuel Felton of The Hechinger Report, but enthusiasm for the new tests has waned.

The federal government put $360 million into the Partnership for Assessment of Readiness for College and Careers (PARCC) and the Smarter Balanced Assessment Consortium, which developed Core-aligned tests.

This spring, of the original 26 states that signed up for PARCC, just 11, plus Washington, D.C., are giving the test. Of the original 31 signed up for Smarter Balanced, only 18 are still on board. (In the early years, some states were members of both coalitions.) Several states will give the PARCC or Smarter Balanced test for one year only, before switching to their own state-based exams next year. Another Common Core exam, ACT's Aspire, has lured some states away from the federally sponsored groups; this spring, students in South Carolina and Alabama will take that test.

On the old state tests, only 2 percent of math questions and 21 percent of English questions assessed “higher-order skills,” such as abstract thinking and the ability to draw inferences, concluded a 2012 RAND study of 17 state tests.

Two-thirds of PARCC and SBAC questions call for higher-order skills, according to a 2013 analysis by the National Center for Research on Evaluation, Standards, and Student Testing.

“In the old tests a student would just get a vocabulary word by itself and would be asked to find a synonym,” said Andrew Latham, director of Assessment & Standards Development Services at WestEd, a nonprofit that worked with Smarter Balanced and PARCC on the new tests. “Now you will get that word in a sentence. Students will have to read the sentence and be able to find the right answers through context clues.”

The new tests require students to answer open-ended questions, which takes more time.  Smarter Balanced will take eight and a half hours, while some PARCC tests will take over ten hours.

Duncan had promised teachers would get quick feedback from the new tests, but it takes time to grade students’ writing. The only way to get fast feedback is to use robo-graders instead of humans.

NJ eyes automated test-grading

New Jersey is considering using robo-graders to evaluate essays on Common Core-aligned tests, reports NJ Advance Media.

Students will type short essays on the computerized Partnership for Assessment of Readiness for College and Careers (PARCC) exams. This year, they’ll be graded by humans — “but 10 percent of online essays will get a ‘second read’ by a computer to test the viability of automated scoring in the future.”

Computer grading is cheaper and returns scores quickly to students and their schools.
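
How might such a pilot be judged? A minimal sketch, assuming the analysis simply compares the machine’s scores with the human readers’ scores on the same essays (the article doesn’t describe PARCC’s actual method, and all the scores below are invented):

```python
# Illustrative only: made-up rubric scores for eight essays, each scored by
# a human reader and by an automated engine. A real viability study would
# use more sophisticated agreement statistics; exact agreement is the
# simplest possible check.
human_scores   = [3, 4, 2, 4, 1, 3, 2, 4]   # scores assigned by people
machine_scores = [3, 3, 2, 4, 2, 3, 2, 4]   # scores from the automated engine

matches = sum(h == m for h, m in zip(human_scores, machine_scores))
print(f"exact agreement: {matches}/{len(human_scores)} essays "
      f"({100 * matches / len(human_scores):.0f}%)")
```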

Testing fail

Steve Rasmussen, an education consultant, has written a devastating critique of the Smarter Balanced Assessment Consortium (SBAC) math tests that will be administered to more than 10 million students in 17 states.

Citing test items, he concludes that many violate the standards they’re supposed to assess, can’t be answered with the technology provided, use confusing and hard-to-use interfaces and will be graded “in such a way that incorrect answers are identified as correct and correct answers as incorrect.”

Parents are right to boycott the SBAC test, Rasmussen writes.

As you’ll see as you look at these test items with me, a quagmire of poor technological design, poor interaction design, and poor mathematical design hopelessly clouds the insights the tests might give us into students’ understanding of mathematics. If the technology-enhanced items on the Smarter Balanced practice and training tests are indicative of the quality of the actual tests coming this year — and Smarter Balanced tells us they are — the shoddy craft of the tests will directly and significantly contribute to students’ poor scores.

Teachers will need to prep students on how to use the confusing tools, he adds.

Elizabeth Willoughby, a fifth-grade teacher in Michigan, has posted a video of her tech-savvy students struggling to figure out how to enter numbers on a practice test.

PARCC, the other federally funded testing consortium, also has produced a confusing, poorly designed exam, according to Save Our Schools NJ. “In the early grades, the tests end up being as much a test of keyboarding skills” as of English or math competence, the group argues.

As a farmer, Colorado State Sen. Jerry Sonnenberg uses math to analyze “cost, production and profit, and quite often, loss,” he wrote. He got the right answers on the PARCC practice math test, but failed because he didn’t “show my work” in the approved way, he complains. Sonnenberg also struggled with the software.

Florida dumped PARCC and scrambled to create its own exam. The rollout of the computerized test created a “catastrophic meltdown,” Miami-Dade Superintendent Alberto Carvalho told the Miami Herald.

Teachers give low grade to PARCC exam

PARCC — the biggest Common Core testing consortium — has put sample test questions online.

Teacher Peter Greene, who blogs at Curmudgucation, found lots of problems with the practice test for high school English.

To start with, PARCC must be taken on a computer. It’s “a massive pain in the patoot,” writes Greene.

 The reading selection is in its own little window and I have to scroll the reading within that window. The two questions run further down the page, so when I’m looking at the second question, the window with the selection in it is halfway off the screen, so to look back to the reading I have to scroll up in the main window and then scroll up and down in the selection window and then take a minute to punch myself in the brain in frustration.

Teachers will have to prep students to handle the format.

Questions focus very heavily on finding things in the text that support answers. The first question asks which three out of seven terms in the text on DNA testing in agriculture “help clarify” the meaning of  “DNA fingerprint.”

If I already understand the term, none of them help (what helped you learn how to write your name today?), and if I don’t understand the term, apparently there is only one path to understanding. If I decide that I have to factor in the context in which the phrase is used, I’m back to scrolling in the little window . . . I count at least four possible answers here, but only three are allowed. Three of them are the only answers to use “genetics” in the answer.

I tried the practice reading test for grades 3-5. I picked the meaning of “master” with no trouble. Which sentence — out of four choices — helped me do so? None of them.

When the high school test moves on to literature, it insists that a poem can have only one meaning, complains Greene.

Reading the text closely is a waste of time, he writes. He can do better by reading the questions and answers closely, then using the text “as a set of clues about which answer to pick.” 

Another section features Abigail Adams’ letter to John Adams calling for women’s rights. Questions focus on “her use of ‘tyrant’ based entirely on context,” Greene writes. “Because no conversation between Abigail and John Adams mentioning tyranny in 1776 could possibly be informed by any historical or personal context.”

In short, he concludes PARCC is “unnecessarily complicated, heavily favoring students who have prior background knowledge, and absolutely demanding that test prep be done with students.”

PARCC won’t produce reliable results, writes Michael Mazenko, a Colorado teacher. He tried the seventh-grade reading test, which contains passages from The Count of Monte Cristo.  That’s too hard for seventh graders, Mazenko writes.

And, like Greene, he thinks the computerized format strongly favors the most computer-savvy students.

A dad opts in to Core testing

Greg Harris, an education writer and parent, is opting in to Common Core testing, he writes on Education Post.

Core teaching will “promote the 21st century skills needed to navigate and thrive in a complicated world,” he believes.

In addition to practicing addition and subtraction, his first grader created his own word problems and math exercises, writes Harris. He drew his “problem-solving process with crayons.”

His older son’s homework, which is aligned with Ohio’s Core reading standards, includes:

Quote accurately from a text when explaining what the text says and when drawing inferences from the text.
Determine a theme of a story, drama, or poem from details in the text, including how characters in a story or drama respond to challenges or how the speaker in a poem reflects upon a topic; summarize the text.
Compare and contrast two or more characters, settings, or events in a story or drama, drawing on specific details in the text (e.g., how characters interact).

When his son read Caddie Woodlawn, “he wasn’t asked to memorize passages, respond to fill-in-the-blank questions, or answer true or false questions,” writes Harris. Instead, he analyzed what he read and wrote responses to questions. He was expected to “break down chapters by their main themes and cite supporting evidence from the text to back up his main ideas.”

The PARCC tests my kids will take this year will determine their absorption of this way of learning. Teachers teach to the tests far less, but rather impart skills that will help their pupils learn to write and write well, conduct analysis and solve problems.

Students will struggle at first, but they’ll “rise to the challenge,” Harris writes.

Why suburban moms fear the Core

Hysteria about Common Core teaching and testing has gripped suburban moms, writes Laura McKenna in The Atlantic. She likens it to anti-vaccination fears.

Millions of children will take new Core-aligned tests this spring. “Conspiracy theories . . . have grown out of parents’ natural instinct to protect their children from bureaucracies and self-styled experts,” writes McKenna, who’s a suburban mom herself.

White, middle-class parents, often very involved in their kids’ education, “worry that they won’t be able to help kids with homework, because the new learning materials rely on teaching methods foreign to them,” she writes. They feel powerless to stop the juggernaut.

Social media fans the fears.

There are those Facebook posts promoting articles with click-bait titles like “Parents Opting Kids Out of Common Core Face Threats From Schools,” or “Common Core Test Fail Kids In New York Again. Here’s How,” or “5 Reasons the Common Core Is Ruining Childhood.”

 I can picture it in my head: articles with stock photos of children sitting miserably at a desk or ominous images of broken pencils.

Teachers across the country, including those in her suburban New Jersey district, are turning against the Core, especially if scores are tied to teacher evaluations, writes McKenna.  That’s influenced parents.

Some states have pulled out of the Common Core.  “More than half of the 26 states that initially signed onto the PARCC exam in 2010 have dropped out,” notes McKenna. A dozen states will use the test this spring, while students in 17 states will take the rival SBAC exam. The rest will use their own tests.

How hard are Core math problems?

Math teachers in Maryland analyzed a Core-aligned fourth-grade math performance task from PARCC, reports Liana Heitin on Ed Week. Several were surprised at how much it required.

(Image: a PARCC fourth-grade math performance task asking students to calculate the number of deer living in a park.)

Teachers listed what students need to know and be able to do to solve the problem:

The definitions of perimeter and area
How to find perimeter and area
The definition of a square mile
The properties of a rectangle
How to solve for an unknown in a perimeter
Multiplication (up to multi-digit)
Addition and subtraction (up to multi-digit)

Some might need division, depending on how they approached the problem.

And everyone will need reading and writing skills.

Students earn credit for finding the missing side length, for finding the area of the park, and for calculating the final number of deer. They also can get partial credit for each piece if they make minor calculation errors. That means the problem must be scored by a person, not a machine.
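
The real item isn’t reproduced here, but a hypothetical problem of the same shape shows why the list of required skills is so long. Every number below is invented for illustration:

```python
# Hypothetical stand-in for the PARCC deer item (all numbers invented):
# a rectangular park has a perimeter of 24 miles and one side of 8 miles,
# and there are 9 deer per square mile. How many deer live in the park?
perimeter = 24           # miles
known_side = 8           # miles
deer_per_sq_mile = 9     # population density

missing_side = (perimeter - 2 * known_side) / 2   # solve for the unknown side
area = known_side * missing_side                  # area of the rectangle, in square miles
deer_in_park = deer_per_sq_mile * area            # final count

print(missing_side, area, deer_in_park)           # 4.0 32.0 288.0
```

A student has to recall the perimeter and area formulas, work backward from the perimeter to the missing side, and then multiply through, which is why the teachers’ list runs from the properties of rectangles to multi-digit multiplication.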

Here are some fifth-grade math questions released by New York. (Here are third- through eighth-grade questions for English and math.)

One of the released questions involves the (gasp!) metric system.

Could I have solved these in fifth grade? I think so.

Core tests spark revolt

A revolt against Common Core testing is spreading across the nation, reports Politico.

The Obama administration put more than $370 million in federal funds into the PARCC and Smarter Balanced testing consortia. Forty states signed on — but at least 17 have backed out, including New York, Florida, Michigan and Pennsylvania. Louisiana, Missouri and New Jersey may go too.

Opposition is coming from all directions. Even Common Core supporters aren’t happy about the tests.

PARCC estimates its exams will take eight hours for an average third-grader and nearly 10 hours for high school students — not counting optional midyear assessments to make sure students and teachers are on track.

PARCC also plans to develop tests for kindergartners, first- and second-graders, instead of starting with third grade as is typical now. And it aims to test older students in 9th, 10th and 11th grades instead of just once during high school.

The new tests will cost more and the online exams will require states to “spend heavily on computers and broadband,” notes Politico.

Meanwhile, teachers in many states don’t know what sort of test their students will face.

In Michigan, second-grade teacher Julie Brill says she and her colleagues are expected to spend the coming year teaching Common Core standards — while preparing kids for a non-Common Core test that measures different skills entirely. “It’s just so crazy,” she said.

And in Florida, which broke with PARCC last year, third-grade teacher Mindy Grimes-Festge says she’s glad to be out of a Common Core test she believed was designed to make children fail — but she has only the most minimal information about the replacement exams.

“We’re going in blind,” Grimes-Festge said. “It’s like jumping from one frying pan to another. Just different cooks.”

Only 42 percent of students are slated to take PARCC or Smarter Balanced tests — and that’s certain to drop as more states go their own way.

Good riddance to Common Core testing, writes Diane Ravitch.

All accountability testing is at risk, writes Jay Greene. “The Unions are using Common Core not only to block new tests, but to eliminate high stakes testing altogether.”

6th graders seek pay for field-testing exams

After spending 5 1/2 hours field-testing new Common Core exams, Massachusetts sixth graders want to be paid for their time, reports the Ipswich Chronicle.

Ipswich Middle School teacher Alan Laroche’s A and B period math classes tested the Partnership for Assessment of Readiness for College and Careers exam. Students told Laroche that “PARCC is going to be making money from the test, so they should get paid as guinea pigs for helping them out in creating this test.”

Student Brett Beaulieu wrote a letter requesting $1,628. He calculated that’s minimum-wage compensation for 37 students for 330 minutes of work.
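
The arithmetic checks out if you assume an $8-an-hour rate, the Massachusetts minimum wage in effect during the field test; the article itself doesn’t state the rate Beaulieu used:

```python
# Checking Beaulieu's figure. The $8.00 hourly wage is an assumption (the
# Massachusetts minimum wage at the time of the 2014 field test); it is not
# stated in the article.
students = 37
minutes_of_testing = 330
hourly_wage = 8.00

total = students * (minutes_of_testing / 60) * hourly_wage
print(f"${total:,.0f}")   # $1,628
```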

He then went on to figure out how many school supplies that amount could buy: 22 new Big Ideas MATH Common Core Student Edition Green textbooks or 8,689 Dixon Ticonderoga #2 pencils.

“Even better, this could buy our school 175,000 sheets of 8½" by 11" paper, and 270 TI-108 calculators,” Beaulieu wrote.

He gathered over 50 signatures from students, as well as from assistant principal Kathy McMahon, principal David Fabrizio and Laroche.

Beaulieu and Laroche sent the letter to PARCC and to U.S. Education Secretary Arne Duncan, reports Reason’s Hit & Run. “Regardless of ideology, it seems like nobody—not teachers, not parents, not local officials, and certainly not sixth graders—likes being a guinea pig in an expensive national education experiment.”

Confused by Core tests

Kids have been field-testing new Common Core exams — and parents have been trying practice tests posted online. The verdict: The new tests are much harder — partly because of poorly worded questions.

Carol Lloyd, executive editor at GreatSchools, is a fan of the new standards, but worried about the test. She went online to try practice questions for both major common-core assessment consortia—Smarter Balanced and PARCC (the Partnership for Assessment of Readiness for College and Careers)—for her daughter’s grade.

Many of the questions were difficult but wonderful. Others were in need of a good editor.

A few, however, were flat-out wrong. One Smarter Balanced question asked students to finish an essay that began with a boy waking up and going down the hall to talk to his mother. Then, in the next paragraph, he’s suddenly jumping out of bed.

A PARCC reading-comprehension question asked students to pick a synonym for “constantly” out of five possible sentence options. I reread the sentences 10 times before I realized that no words or phrases in those sentences really meant “constantly,” but that the test-writer had confused “constantly” with “repeatedly.” Any student who really understood the language would be as confused as I was.

If these are the test questions they’re sharing with the public, “what are they doing in the privacy of my daughter’s test?” asks Lloyd.

Natalie Wexler, a writing tutor at a high-poverty D.C. high school, took the PARCC English Language Arts practice test for 10th-graders. “A number of questions were confusing, unrealistically difficult, or just plain wrong,” she writes.

Question 1 starts with a brief passage:

I was going to tell you that I thought I heard some cranes early this morning, before the sun came up. I tried to find them, but I wasn’t sure where their calls were coming from. They’re so loud and resonant, so it’s sometimes hard to tell.

Part A asked for the meaning of “resonant” as used in this passage:

A. intense B. distant C. familiar D. annoying

Looking at the context — it was hard to tell where the calls were coming from — Wexler chose “distant.”  The official correct answer was “intense.” Which is not what “resonant” means. 

Another passage described fireflies as “sketching their uncertain lines of light down close to the surface of the water.” What was implied by the phrase “uncertain lines of light”?

She chose: “The lines made by the fireflies are difficult to trace.” The correct answer? “The lines made by the fireflies are a trick played upon the eye.”

Wexler did better on a section where all the questions were based on excerpts from a majority and a dissenting opinion in a Supreme Court case about the First Amendment. “But then again, I have a law degree, and, having spent a year as a law clerk to a Supreme Court Justice, I have a lot of experience interpreting Supreme Court opinions,” she writes.

The average D.C. 10th grader won’t be able to demonstrate critical thinking skills, Wexler fears.

. . .  if a test-taker confronts a lot of unfamiliar concepts and vocabulary words, she’s unlikely to understand the text well enough to make any inferences. In just the first few paragraphs of the majority opinion, she’ll confront the words “nascent,” “undifferentiated,” and “apprehension.”

Most D.C. students “will either guess at the answers or just give up,” Wexler predicts.