NYC: Cheating or sympathy?

“Scores on English Regents exams for high schoolers plummeted” when New York City barred teachers from grading tests given at their own school, reports the New York Daily News. Passing rates dropped at 373 out of 490 schools and the failure rate on English exams rose from 27 percent to 35 percent. That change was “not reflected in the other nine Regents subjects.”

At Harlem Renaissance High School, 69% of students passed English in 2012. In 2013, only 37% passed. “Teachers helped us out a little bit. They gave us credit for trying,” said senior Morrell Christian, 19, recalling the good old days. “If you needed extra points they gave them to you. That changed when they couldn’t mark their own tests.”

Evaluating essays is subjective, teachers told the Daily News. While “grade inflation was rampant,” it wasn’t cheating, they said.

“Teachers know their students. Sometimes a bad grade means the student giving you hell again next year, or him not getting a scholarship,” said one teacher at a Brooklyn school. “There’s a form of empathy coming out. Like, ‘Oh my God, there has to be another point in there! Let’s find it.’”

Many said teachers were “encouraged to grade the exams generously so more students would graduate.” That helped students, but raising graduation rates also could keep a school from closing and earn the principal a “fat bonus.”

Don’t blame measurements for cheating, writes Matt Yglesias on Vox, responding to tweets by Chris Hayes, who “offers a take on the VA scandal that’s calculated to warm the hearts of America’s teachers unions.” Hayes writes:

Current VA story is a classic example how metrics ordered from above often just lead to books being cooked rather than better performance . . . See juking crime stats, Atlanta standardized test cheating scandal, etc…

Yglesias wonders whether “a person who cheats in response to an incentive program” is “the kind of person who’s going to do amazing work in the absence of an incentive program. . . . If a data-based framework is imperfect, is going to a data-free one any better?”

Babel vs. essay-grading bots

These days, more tests ask students to write short essays, not just answer multiple-choice questions. But it’s slow and expensive to hire humans to do the grading. Essay-grading bots are cheap and fast, but are they any good?

It’s easy to fool a robot grader, Les Perelman, a former writing director at MIT, tells the Chronicle of Higher Education.

His Basic Automatic B.S. Essay Language Generator, or Babel, can crank out robot-fooling essays using one to three keywords. Each sentence is grammatically correct, structurally sound and meaningless. Robots can’t tell the difference, says Perelman.

He fed in “privacy.” Babel wrote:

“Privateness has not been and undoubtedly never will be lauded, precarious, and decent. Humankind will always subjugate privateness.”

MY Access!, an online writing-instruction product, graded the essay immediately: 5.4 points out of 6, with “advanced” ratings for “focus and meaning” and “language use and style.”
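Perelman’s generator works by slotting lofty vocabulary into grammatical templates. Here’s a toy sketch of that idea in Python, my own hypothetical illustration rather than Perelman’s actual code:

```python
import random

# Hypothetical sketch of a Babel-style generator -- not Perelman's code.
# The trick: fill grammatical templates with lofty vocabulary, so every
# sentence is well-formed, structurally sound, and meaningless.

SUBJECTS = ["humankind", "the quandary of {kw}", "society's adulation of {kw}"]
VERBS = ["will always subjugate", "has never countenanced", "invariably augments"]
OBJECTS = ["{kw}", "the paradigm of {kw}", "a specious but copious {kw}"]

def sentence(keyword):
    """Build one grammatical, content-free sentence around the keyword."""
    subj = random.choice(SUBJECTS).format(kw=keyword)
    verb = random.choice(VERBS)
    obj = random.choice(OBJECTS).format(kw=keyword)
    return f"{subj[0].upper()}{subj[1:]} {verb} {obj}."

def essay(keyword, n=5):
    """String together enough sentences to look like an essay."""
    return " ".join(sentence(keyword) for _ in range(n))

print(essay("privacy"))
```

A scorer that rewards vocabulary and sentence structure has no way to notice that nothing is being said.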

Robots and human graders awarded similar scores to 22,000 essays by high school and middle school students, concluded a 2012 study by Mark D. Shermis, a former dean of the College of Education at the University of Akron.

Perelman accused Shermis of bad data analysis. The Akron professor stood by his study and published a follow-up paper this year.

Computer scientists at edX, the nonprofit online-course provider co-founded by MIT, are working on the Enhanced AI Scoring Engine, or EASE. The software “can learn and imitate the grading styles of particular professors,” reports the Chronicle.

Some of edX’s university partners have used EASE to provide feedback to students in massive open online courses (MOOCs).
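In broad strokes, “learning a grading style” means fitting a model to essays a particular grader has already scored, then predicting that grader’s score on new ones. Here’s a minimal sketch of that supervised-learning approach, using scikit-learn; it illustrates the general idea, not edX’s actual pipeline, and the training data is made up:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import Ridge
from sklearn.pipeline import make_pipeline

# Hypothetical training data: essays plus one professor's 0-6 scores.
essays = [
    "Privacy is a contested value in modern society, because ...",
    "The author argues persuasively, citing three studies that ...",
    "I think privacy is good and people should have it.",
]
professor_scores = [4.5, 5.5, 2.0]

model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),  # surface word/phrase features
    Ridge(alpha=1.0),                     # linear fit to this grader's scores
)
model.fit(essays, professor_scores)

# The model reproduces whatever surface patterns correlated with the scores,
# whether or not those patterns reflect meaning.
print(model.predict(["A brand-new essay about privacy to be scored."]))
```

That design is exactly why Babel-style nonsense can slip through: the features are proxies for quality, not quality itself.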

Core in the classroom: Write and cite

Common Core has students writing and citing “textual evidence,” reports Sarah Carr for the Hechinger Report.

BELLE CHASSE, La. — In the early elementary school grades, Zachary Davis and his classmates at Belle Chasse Primary School in suburban New Orleans wrote almost entirely from personal experience: describing their ideal vacation, trying to convince readers that a longer school year would be a good (or bad) idea, penning a letter about their adventures during summer break.

This year, as a fourth-grader, Zachary writes persuasive essays using “evidence” from nonfiction reading. For example, students “read a description of Louisiana’s Avery Island followed by one of a bayou swamp tour, and then wrote about which destination they would prefer to visit based on examples in the passages.” 

Proponents of the change say an increased emphasis on analytical, evidence-based assignments will better prepare students for the kind of writing they will face in college and the workforce, where few will be asked to describe family vacations or write poems, but they could very well be asked to summarize a research paper or defend a project proposal. Others worry that if schools veer too far in the direction of analytical writing at too young an age, they risk stifling children’s creativity and discouraging students who aren’t strong readers.

The “intense focus on text-based analysis is new,” said Shelley Ritz, principal of Belle Chasse Primary.

The school still teaches creative and narrative writing, but teachers expect new core-aligned tests will require students to write essays based on multiple reading passages. (The state’s transitional exam did just that.)

In keeping with the new standards, Belle Chasse teachers have gone to a 50-50 split between fiction and nonfiction readings. “Kindergarteners might read a non-fiction book about the life cycle of butterflies and moths paired with a fictional one featuring those insects as characters,” writes Carr.

In Zachary’s class, students practiced writing essays for the state exam, but protested when they learned they’d be doing more writing in social studies and science. 

The class had just finished a citizenship unit where they learned how citizens of all ages can contribute ideas to improve their communities. So the students said they wanted to write a letter to Gov. Bobby Jindal protesting all the writing required in Louisiana’s public schools these days.

Teacher Mary Beth Newchurch agreed. After all, it was another chance to practice writing.

Prof: Don’t require college essays

Stop requiring college students to write essays, argues an adjunct who’s sick of grading poorly written and plagiarized papers.

This is the “Anyway Argument,” also used to justify dropping college algebra requirements, writes a community college dean. Most students won’t use it anyway, so why bother?

Are core tests written for robo-readers?

Sacramento teacher Alice Mercer questions the Common Core tests her students will be taking. A sample essay prompt by the Smarter Balanced Assessment Consortium gives students a list of arguments to use.
[Image: Smarter Balanced sample essay prompt listing arguments for students to use]

The new standards are supposed to promote higher-order thinking. So why not let students think up their own arguments? Evaluating students’ writing is time-consuming and expensive — unless it can be automated, Mercer points out. Listing the arguments seems designed for the convenience of a robo-reader.

Robo-readers have limitations, she observes. It’s hard for computers to score open-ended questions.

Basically, the programs can judge grammar and usage errors (although I suspect it will lead to a very stilted form of writing that only a computer could love), but it’s not in the position to judge the facts and assertions, or content in an essay.  The only way to do that is to limit students to what “facts” they are using by giving them a list.

Computer grading could explain Common Core’s hostility to background knowledge, Mercer adds.
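Here’s a toy illustration of Mercer’s point (my own sketch, not any vendor’s scoring code): a scorer built on surface features gives identical marks to a true claim and a false one, because nothing it measures touches the facts.

```python
import statistics

# Toy surface-feature "scorer": it measures word length and sentence variety,
# proxies a computer can judge. It cannot tell that one claim below is false.
def surface_score(text):
    sentences = [s for s in text.split(".") if s.strip()]
    words = text.split()
    avg_word_len = sum(len(w) for w in words) / len(words)
    lengths = [len(s.split()) for s in sentences]
    variety = statistics.pstdev(lengths) if len(lengths) > 1 else 0.0
    return round(avg_word_len + variety, 2)

true_claim = "The Louisiana Purchase occurred in 1803. It doubled the nation's size."
false_claim = "The Louisiana Purchase occurred in 1903. It doubled the nation's size."
print(surface_score(true_claim), surface_score(false_claim))  # identical scores
```

Handing students the list of arguments sidesteps the problem by making factual judgment unnecessary.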

Computer-scored tests train children to think like the computer, writes Anthony Cody in Ed Week. “If we are sacrificing intelligence, creativity and critical thinking for the sake of the efficiency and standardization provided by a computer, this seems a very poor trade.”

To get into college, fake it

Applying to College Shouldn’t Require Answering Life’s Great Questions, writes Julia Ryan in The Atlantic. Elite colleges’ admissions essay prompts pretty much demand that students “pretend to be something you are not,” she charges.

Brown University is asking applicants for the Class of 2017: French novelist Anatole France wrote: “An education isn’t how much you have committed to memory, or even how much you know. It’s being able to differentiate between what you do know and what you don’t.” What don’t you know?

The University of Chicago would like high-school seniors to tell them: How are apples and oranges supposed to be compared? Possible answers involve, but are not limited to, statistics, chemistry, physics, linguistics, and philosophy.

Tufts would simply like to know: What makes you happy?

“Applying to college shouldn’t be the intellectual equivalent of dressing up in your mother’s clothes,” writes Ryan.

Many of her commenters liked the prompts. (They made me very glad that all this is behind me.)

Universities have automated admissions, writes a commenter who designs admissions software. An outside service will use “advanced OCR and ICR recognition software plus semantic analysis” to turn the transcript and extracurriculars into a single number. Essays are run through plagiarism software. “If a university is particularly prestigious they *might* read the essay, but the counselor is reading about 15 to 20 an hour.” The essay reader is probably an untrained graduate student or unemployed graduate making $11 to $13 an hour, he writes.

Hacking the Common App has good advice on writing admissions essays. Here’s part 1 and part 2.

Bard’s new admissions option — submit four research papers instead of grades and scores — is begging to be gamed by the wealthy, writes Jordan Weissmann.

Rather than submit a full battery of grades, teacher recs, SAT scores, and personal essays, Bard applicants will be able to choose to hand in four 2,500 word research papers, which will be graded by faculty. Applicants who earn a B+ or better on their writing will be accepted . . .

“It’s kind of declaring war on the whole rigmarole of college admissions and the failure to foreground the curriculum and learning,” Leon Botstein, Bard’s president of 38 years, said in an interview.

Who’d choose this option? Someone who’s gone to a very good college-prep high school and learned to write a college-quality research paper, but hasn’t earned Bard-worthy grades or test scores. That’s a small group. Or, as Weissmann suggests, someone who can afford to pay a “college consultant” to write the papers.

Shakespeare or Stein?

Instead of reading Shakespeare, students of the future will analyze the writing of Joel Stein, writes Joel Stein in Time. It makes him nervous. Common Core State Standards will shift reading lists to non-fiction, Stein writes. By reading analytical essays, they’ll learn to write analytical essays — instead of journal entries about their feelings.

Stein reads Faulkner or Joyce to improve his writing. CCSS urges students to dip into “FedViews” by the Federal Reserve Bank of San Francisco. Which is not quite the same.

Fiction also teaches you how to tell a story, which is how we express and remember nearly everything. If you can’t tell a story, you will never, ever get people to wire you the funds you need to pay the fees to get your Nigerian inheritance out of the bank.

Education isn’t just training for work, Stein writes. “It’s training to communicate throughout our lives.”

If we didn’t all experience Hamlet’s soliloquy, we’d have to explain soul-tortured indecisiveness by saying things like “Dude, you are like Ben Bernanke in early 2012 weighing inflation vs. growth in Quantitative Easing 3.”

Teaching language through nonfiction is like teaching history by playing Billy Joel’s “We Didn’t Start the Fire” or teaching science by giving someone an unmarked test tube full of sludge and having him figure out if the white powder he distilled is salt or sugar by making Steven Baumgarten taste it, which is how I learned science and how Steven Baumgarten learned to be more careful about picking people to work with.

That’s “something he could have learned by reading Othello,” Stein concludes.

Tutors or cheaters?

Wealthy parents are hiring “tutors” to do their children’s work through private school — and sometimes college, reports the New York Post. Eager to get their kids into elite colleges by any means necessary, parents go online to find “legit and not-so-legit tutors, homework helpers and ghostwriters.”

“Charles” put himself through medical school and made a down payment on an apartment with the $150,000 he earned over six years of ghostwriting for a single student.

The mother — a college professor — demanded Charles “tutor” her 15-year-old sophomore son by completing every homework assignment and writing every paper and college essay. . . .

Once the boy was off to his out-of-state private university, he flunked out after less than one year without the coddling of a tutor.

. . . And when the student was enrolled at a less-competitive school back in New York, Charles was pulled back in at the mother’s urging: “I was back in the picture in the same way as before: coming over five or six days a week. They paid for my apartment,” he says.

Teachers notice when mediocre students turn in “grad-school-like” papers, a private school teacher tells the Post.

“We would have staff meetings to discuss tutors: How do we grade this essay, knowing a tutor is crafting it? It puts teachers in an awkward position, because you don’t want to accuse the kid. Teachers can’t keep up with all the ways kids are cheating these days.”

It sounds as though private schools don’t want to confront parents who are paying the tuition bill as well as the ghost-writer’s bill.

College admissions officers also see a lot of ghost-written or mom-written essays. I wonder if there’s any point in requiring an essay.

NAEP: 27% of students write proficiently

Students in eighth and 12th grade write just as poorly on laptops as they do with paper and pencil, concludes the new National Assessment of Educational Progress writing exam. In both grades, 27 percent of students were rated proficient or better.

Students were given “two 30-minute writing prompts that asked them to persuade, explain, or convey experiences,” reports Education Week.

At the 8th grade level, for example, one exercise called “Lost Island” asked students to imagine they had arrived on a remote island and listen to an audio file that included nature sounds and lines of a journal read aloud. Students then were required to write personal stories that chronicled an experience they would have had on the island, had they been there.

To reach “advanced” on the exam, students told well-organized stories with strong details, precise word choices, and varied sentences, according to the NAEP report. Students at the “basic” level would use some detail in their stories, but organization was “loose,” sentence structure unvaried, and word choice limited.

Students who were required by teachers to use computers more often to write and edit assignments performed better on the test, NAEP reported. Most students used spell check, but only 20 percent used the cut and paste functions on the laptops.

Girls did much better than boys. The racial breakdown was . . . The usual. I’ll just note that Asian-American students, many of whom speak English as a second language, outscored whites.

Teachers can learn from tests

Once a foe of standardized testing, Ama Nyamekye improved her teaching by analyzing her students’ scores on New York’s Regents exam, she writes in Ed Week. When she asked her sophomores to take the English Regents exam a year early, she discovered “holes in my curriculum.”

I once dismissed standardized testing for its narrow focus on a discrete set of skills, but I learned that my self-made assignments were more problematic. It turned out they were skewed in my favor. I was better at teaching literary analysis than grammar and punctuation. When I started giving ongoing standardized assessments, I noticed that my students showed steady growth in literary analysis, but less growth in grammar and punctuation. I was teaching to my strengths instead of strengthening my weaknesses.

Grading is subjective, she writes. Emotionally invested in her students’ success — and implicitly judging her own effectiveness — she was quick to see signs of achievement.

By contrast, her students’ Regents essays were graded by English teachers who didn’t know them and who used detailed rubrics.

When I “depoliticized” the test, I found a useful and flawed ally. The exam excelled where I struggled, offering comprehensive and standards-based assessments. I thrived where the test fell short, designing creative, performance-based projects. Together, we were strategic partners. I designed and graded innovative projects—my students participated in court trials for Shakespearean characters—and the test provided a rubric that guided my evaluation of student learning.

All her students who took the exam passed it. Most earned high scores.