Grammar fail: College kids can’t write

Novelist Michael Laser was hired to teach university freshmen to write essays. But he discovered his students can’t write clear, grammatically correct sentences.


Teaching writing to first-year students “has become an academic specialty with its own dominant philosophy,” Laser writes. Teaching “critical thinking” is in. Teaching grammar is out.

He’s supposed to stress “developing an arguable thesis, presenting strong supporting arguments, using quotations as evidence.”

But his students write so badly it hurts.

  • Neglecting to recognize the horrors those people endure allow people to go to war more easily.
  • The money in the household shared between Nora and Torvald contrast the idea of a happy marriage.
  • The similarities among the speakers and their author are illustrated differently through their speaker’s separate tones.

While teaching essay writing, Laser “added lessons on revising awkward phrases and replacing fuzzy abstractions with more concrete specifics.” Students could recognize the difference between bad and better writing, but struggled to revise “terrible sentences.”  He saw little improvement in their writing.

In the future, he plans to use John Maguire’s manual, which “stresses concrete nouns, active verbs, and conciseness.” But he’s looking for more advice on how to help his students.

I used to advise students to read their work aloud. Does it sound clunky? It is. If you were explaining this idea to a friend, what would you say?

Who grades Core essays? Not all are teachers


Essay graders are trained to be consistent like McDonald’s workers.

New Core-aligned tests rely on fewer multiple-choice questions and more writing, notes the New York Times. For example, elementary students might be asked to “read a passage from a novel written in the first person, and a poem written in the third person, and describe how the poem might change if it were written in the first person.”

Who’s grading essays on Common Core tests? Temps willing to work for $12 to $14 per hour. A college degree is required, but teaching experience is optional.

On Friday, in an unobtrusive office park northeast of downtown (San Antonio), about 100 temporary employees of the testing giant Pearson worked in diligent silence scoring thousands of short essays written by third- and fifth-grade students from across the country. There was a onetime wedding planner, a retired medical technologist and a former Pearson saleswoman with a master’s degree in marital counseling.

More than three-quarters of scorers have at least one year of teaching experience, according to PARCC, which developed one set of Core tests.

They’re trained to produce consistent scores — just like workers at Starbucks or McDonald’s, said Bob Sanders of Pearson. “McDonald’s has a process in place to make sure they put two patties on that Big Mac,” he continued. “We do that exact same thing.”

“Losers who can’t find real jobs” are grading tests, writes Eric Owens, Daily Caller’s education editor.

NJ eyes automated test-grading

New Jersey is considering using robo-graders to evaluate essays on Common Core-aligned tests, reports NJ Advance Media.

Students will type short essays on the computerized Partnership for Assessment of Readiness for College and Careers (PARCC) exams. This year, they’ll be graded by humans — “but 10 percent of online essays will get a ‘second read’ by a computer to test the viability of automated scoring in the future.”

Computer grading is cheaper and returns scores quickly to students and their schools.

NYC: Cheating or sympathy?

“Scores on English Regents exams for high schoolers plummeted” when New York City barred teachers from grading tests given at their own school, reports the New York Daily News. Passing rates dropped at 373 out of 490 schools and the failure rate on English exams rose from 27 percent to 35 percent. That change was “not reflected in the other nine Regents subjects.”

At Harlem Renaissance High School, 69% of students passed English in 2012. In 2013, only 37% passed. “Teachers helped us out a little bit. They gave us credit for trying,” said senior Morrell Christian, 19, recalling the good old days. “If you needed extra points they gave them to you. That changed when they couldn’t mark their own tests.”

Evaluating essays is subjective, teachers told the Daily News. While “grade inflation was rampant,” it wasn’t cheating, they said.

“Teachers know their students. Sometimes a bad grade means the student giving you hell again next year, or him not getting a scholarship,” said one teacher at a Brooklyn school. “There’s a form of empathy coming out. Like, ‘Oh my God, there has to be another point in there! Let’s find it.’”

Many said teachers were “encouraged to grade the exams generously so more students would graduate.” That helped students, but raising graduation rates also could keep a school from closing and earn the principal a “fat bonus.”

Don’t blame measurements for cheating, writes Matt Yglesias on Vox. He’s responding to tweets by Chris Hayes, who “offers a take on the VA scandal that’s calculated to warm the hearts of America’s teachers unions.” Hayes writes:

Current VA story is a classic example how metrics ordered from above often just lead to books being cooked rather than better performance . . . See juking crime stats, Atlanta standardized test cheating scandal, etc…

Yglesias wonders whether “a person who cheats in response to an incentive program” is “the kind of person who’s going to do amazing work in the absence of an incentive program . . . If a data-based framework is imperfect, is going to a data-free one any better?”

Babel vs. essay-grading bots

These days, more tests ask students to write short essays, not just answer multiple-choice questions. But it’s slow and expensive to have humans do the grading. Essay-grading ’bots are cheap and fast, but are they any good?

It’s easy to fool a robot grader, Les Perelman, a former writing director at MIT, tells the Chronicle of Higher Education.

His Basic Automatic B.S. Essay Language Generator, or Babel, can crank out robot-fooling essays using one to three keywords. Each sentence is grammatically correct, structurally sound and meaningless. Robots can’t tell the difference, says Perelman.

He fed in “privacy.” Babel wrote:

“Privateness has not been and undoubtedly never will be lauded, precarious, and decent. Humankind will always subjugate privateness.”

MY Access!, an online writing-instruction product, graded the essay immediately: 5.4 points out of 6, with “advanced” ratings for “focus and meaning” and “language use and style.”
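To see why grammatical-but-meaningless prose is easy to mass-produce, here is a toy sketch of a template-based generator in the spirit of Babel. It is not Perelman’s actual tool; the word lists, templates and function names are invented for illustration.

```python
import random

# Toy illustration (not Perelman's Babel): assemble sentences that are
# grammatically correct and structurally sound but say nothing, built
# around a single keyword. All word lists here are made up.
SUBJECTS = ["Humankind", "Society", "The assimilation of {kw}", "Our understanding of {kw}"]
VERBS = ["will always subjugate", "has never repudiated", "cannot eschew"]
OBJECTS = ["{kw}", "the essence of {kw}", "every affirmation of {kw}"]


def pompous_sentence(keyword: str) -> str:
    """Return one grammatical, meaningless sentence about the keyword."""
    subject = random.choice(SUBJECTS).format(kw=keyword)
    verb = random.choice(VERBS)
    obj = random.choice(OBJECTS).format(kw=keyword)
    return f"{subject} {verb} {obj}."


def toy_essay(keyword: str, sentences: int = 5) -> str:
    """String several such sentences together into an 'essay'."""
    return " ".join(pompous_sentence(keyword) for _ in range(sentences))


if __name__ == "__main__":
    print(toy_essay("privateness"))
```

A scorer that rewards vocabulary, sentence length and syntactic variety has no way to notice that output like this asserts nothing, which is Perelman’s point.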

Robots and human graders awarded similar scores to 22,000 essays by high school and middle school students, concluded a 2012 study by Mark D. Shermis, a former dean of the College of Education at the University of Akron.

Perelman accused Shermis of bad data analysis. The Akron professor stood by his study and published a follow-up paper this year.

Computer scientists at edX, the nonprofit online-course provider co-founded by MIT, are working on the Enhanced AI Scoring Engine, or EASE. The software “can learn and imitate the grading styles of particular professors,” reports the Chronicle.

Some of edX’s university partners have used EASE to provide feedback to students in massive open online courses (MOOCs).

Core in the classroom: Write and cite


Common Core has students writing and citing “textual evidence,” reports Sarah Carr for the Hechinger Report.

BELLE CHASSE, La. — In the early elementary school grades, Zachary Davis and his classmates at Belle Chasse Primary School in  suburban New Orleans wrote almost entirely from personal experience: describing their ideal vacation, trying to convince readers that a longer school year would be a good (or bad) idea, penning a letter about their adventures during summer break.

This year, as a fourth-grader, Zachary writes persuasive essays using “evidence” from nonfiction reading. For example, students “read a description of Louisiana’s Avery Island followed by one of a bayou swamp tour, and then wrote about which destination they would prefer to visit based on examples in the passages.” 

Proponents of the change say an increased emphasis on analytical, evidence-based assignments will better prepare students for the kind of writing they will face in college and the workforce, where few will be asked to describe family vacations or write poems, but they could very well be asked to summarize a research paper or defend a project proposal. Others worry that if schools veer too far in the direction of analytical writing at too young an age, they risk stifling children’s creativity and discouraging students who aren’t strong readers.

The “intense focus on text-based analysis is new,” said Shelley Ritz, principal of Belle Chasse Primary.

The school still teaches creative and narrative writing, but teachers expect new Core-aligned tests will require students to write essays based on multiple reading passages. (The state’s transitional exam did just that.)

In keeping with the new standards, Belle Chasse teachers have gone to a 50-50 split between fiction and nonfiction readings. “Kindergarteners might read a non-fiction book about the life cycle of butterflies and moths paired with a fictional one featuring those insects as characters,” writes Carr.

In Zachary’s class, students practiced writing essays for the state exam, but protested when they learned they’d be doing more writing in social studies and science. 

The class had just finished a citizenship unit where they learned how citizens of all ages can contribute ideas to improve their communities. So the students said they wanted to write a letter to Gov. Bobby Jindal protesting all the writing required in Louisiana’s public schools these days.

Teacher Mary Beth Newchurch agreed. After all, it was another chance to practice writing.

Prof: Don’t require college essays

Stop requiring college students to write essays, argues an adjunct who’s sick of grading poorly written and plagiarized papers.

This is the “Anyway Argument,” also used to justify dropping college algebra requirements, writes a community college dean. Most students won’t use it anyway, so why bother?

Are Core tests written for robo-readers?

Sacramento teacher Alice Mercer questions the Common Core tests her students will be taking.  A sample essay prompt by the Smarter Balanced Assessment Consortium gives students a list of arguments to use.

The new standards are supposed to promote higher-order thinking. So why not let students think up their own arguments? Evaluating students’ writing is time-consuming and expensive — unless it can be automated, Mercer points out. Listing the arguments seems designed for the convenience of a robo-reader.

Robo-readers have limitations, she observes. It’s hard for computers to score open-ended questions.

Basically, the programs can judge grammar and usage errors (although I suspect it will lead to a very stilted form of writing that only a computer could love), but it’s not in the position to judge the facts and assertions, or content in an essay.  The only way to do that is to limit students to what “facts” they are using by giving them a list.

Computer grading could explain Common Core’s hostility to background knowledge, Mercer adds.

Computer-scored tests train children to think like the computer, writes Anthony Cody in Ed Week. “If we are sacrificing intelligence, creativity and critical thinking for the sake of the efficiency and standardization provided by a computer, this seems a very poor trade.”

To get into college, fake it

Applying to College Shouldn’t Require Answering Life’s Great Questions, writes Julia Ryan in The Atlantic. Elite colleges’ admissions essay prompts pretty much demand that students “pretend to be something you are not,” she charges.

Brown University is asking applicants for the Class of 2017: French novelist Anatole France wrote: “An education isn’t how much you have committed to memory, or even how much you know. It’s being able to differentiate between what you do know and what you don’t.” What don’t you know?

The University of Chicago would like high-school seniors to tell them: How are apples and oranges supposed to be compared? Possible answers involve, but are not limited to, statistics, chemistry, physics, linguistics, and philosophy.

Tufts would simply like to know: What makes you happy?

“Applying to college shouldn’t be the intellectual equivalent of dressing up in your mother’s clothes,” writes Ryan.

Many of her commenters liked the prompts. (They made me very glad that all this is behind me.)

Universities have automated admissions, writes a commenter who designs admissions software. An outside service will use “advanced OCR and ICR recognition software plus semantic analysis” to turn the transcript and extracurriculars into a single number. Essays are run through plagiarism software. “If a university is particularly prestigious they *might* read the essay, but the counselor is reading about 15 to 20 an hour.” The essay reader is probably an untrained graduate student or unemployed graduate making $11 to $13 an hour, he writes.

Hacking the Common App has good advice on writing admissions essays. Here’s part 1 and part 2.

Bard’s new admissions option — submit four research papers instead of grades and scores — is begging to be gamed by the wealthy, writes Jordan Weissmann.

Rather than submit a full battery of grades, teacher recs, SAT scores, and personal essays, Bard applicants will be able to choose to hand in four 2,500 word research papers, which will be graded by faculty. Applicants who earn a B+ or better on their writing will be accepted . . .

“It’s kind of declaring war on the whole rigmarole of college admissions and the failure to foreground the curriculum and learning,” Leon Botstein, Bard’s president of 38 years, said in an interview.

Who’d choose this option? Someone who’s gone to a very good college-prep high school and learned to write a college-quality research paper, but hasn’t earned Bard-worthy grades or test scores. That’s a small group. Or, as Weissmann suggests, someone who can afford to pay a “college consultant” to write the papers.

Shakespeare or Stein?

Instead of reading Shakespeare, students of the future will analyze the writing of Joel Stein, writes Joel Stein in Time. It makes him nervous. Common Core State Standards will shift reading lists toward non-fiction, Stein writes. By reading analytical essays, students will learn to write analytical essays instead of journal entries about their feelings.

Stein reads Faulkner or Joyce to improve his writing. CCSS urges students to dip into “FedViews” by the Federal Reserve Bank of San Francisco. Which is not quite the same.

Fiction also teaches you how to tell a story, which is how we express and remember nearly everything. If you can’t tell a story, you will never, ever get people to wire you the funds you need to pay the fees to get your Nigerian inheritance out of the bank.

Education isn’t just training for work, Stein writes. “It’s training to communicate throughout our lives.”

If we didn’t all experience Hamlet’s soliloquy, we’d have to explain soul-tortured indecisiveness by saying things like “Dude, you are like Ben Bernanke in early 2012 weighing inflation vs. growth in Quantitative Easing 3.”

Teaching language through nonfiction is like teaching history by playing Billy Joel’s “We Didn’t Start the Fire” or teaching science by giving someone an unmarked test tube full of sludge and having him figure out if the white powder he distilled is salt or sugar by making Steven Baumgarten taste it, which is how I learned science and how Steven Baumgarten learned to be more careful about picking people to work with.

That’s “something he could have learned by reading Othello,” Stein concludes.