Writing machine

Essay-scoring software helps students learn basic writing skills, says the Santa Cruz Sentinel. In addition to feedback on spelling and grammar errors, the program points out “passive verbs, sentence fragments and overly repetitive words.” By comparing essays with human-scored essays on the same topic, the software can generate a grade and an analysis of writing errors in less than a minute.
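To make the "comparing essays with human-scored essays" idea concrete, here is a deliberately naive sketch of how similarity-based scoring against teacher-graded reference essays could work. The actual products in the article are far more sophisticated, and every name, function, and data value below is made up purely for illustration.

```python
# Illustrative sketch only: a toy nearest-neighbour essay scorer.
# Real essay-grading software uses much richer features; everything here is hypothetical.
from collections import Counter
from math import sqrt

def word_counts(text: str) -> Counter:
    """Lowercase bag-of-words representation of an essay."""
    return Counter(text.lower().split())

def cosine_similarity(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[w] * b[w] for w in a)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def score_essay(essay: str, graded_references: list[tuple[str, int]]) -> int:
    """Assign the grade of the most similar human-scored reference essay."""
    best_grade, best_sim = 0, -1.0
    essay_vec = word_counts(essay)
    for reference_text, grade in graded_references:
        sim = cosine_similarity(essay_vec, word_counts(reference_text))
        if sim > best_sim:
            best_sim, best_grade = sim, grade
    return best_grade

# Hypothetical usage: references scored by teachers on an 8-point scale.
references = [
    ("The author develops a clear, well-supported argument about school funding.", 7),
    ("This essay repeats the same idea again and again with many spelling errors.", 3),
]
print(score_essay("The writer builds a clear argument supported by evidence.", references))
```

Even this toy version returns a score in well under a second, which hints at why real systems can give feedback in under a minute; the hard part is making that score meaningful.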

In one high school, 79 percent of sophomores who used the program passed the English portion of the graduation exam, compared with 51 percent of their classmates. Students with limited English fluency who used the software to practice writing at home posted an 82 percent pass rate.

“A dome of silence descends on the classroom,” (teacher C.J.) Foss said. “All you hear is the clicking of the keys. They’re playing the game of writing. They click ‘submit,’ they get their score, and they play again. They’re in a zone.”

Pat Thornton, 56, who teaches at Lakeside Middle School in Irving, appreciates being able to give her 182 students more feedback.

She assigns an essay a week for practice. During the course of a year, her students produce 10 in-depth essays for her to grade.

“The more students write, the more feedback they get, the more they improve,” she said.

On the state writing test, where 8 is the top score, 85 percent of the seventh-graders scored 5 or higher.

The key seems to be that students do a lot more writing when the teacher doesn’t have to take time to grade every essay.

One essay-grading program costs $6.50 per student per year. Another, with more features, costs $36 per student per year.

Via Education News.

  1. “being able to give her 182 students more feedback”

    Why is it so hard to see the forest for the trees?

  2. Bluemount says:

    Using software as a substitute for professionals is not an effective use of it. Bilingual students, especially, should not rely on software. It is unlikely they will learn to expand their vocabulary and speak correctly if the essays are not part of an interactive classroom experience guided by a professional.

    The points I do like about the software are that the feedback is immediate, it is non-critical, and the student is motivated to scrutinize their own work. I used Corel's grammar checker when my son was in elementary school and thought it had some good features. For example, it was very good at catching sentences that had too many 'ands' and 'buts'. On the other hand, it tended to force him to write short sentences, and the description of some errors was too difficult for him to understand. If the software were used as part of a program that was already effective, it could have some benefit. I especially disliked one version that would not allow him to use the word 'girl' (it forced a substitution of 'young lady'), but did allow the word 'boy'.

  3. This software might be good for immediate feedback on grammar, assuming it’s much, much better than the grammar checker in Microsoft Word, and that it doesn’t let political correctness override good writing. However, I certainly hope a human is reading and grading the final essays. It’s quite possible to write something that is grammatically correct but meaningless or self-contradictory. AI is a very long way from understanding meaning at all, and far too much bafflegab gets written already without giving people a computer program that might award it a perfect score.

    I’m also dubious about the part that compares the student essays to a database of essays on the same topic. It sounds like true originality is not allowed.

  4. My initial reaction to this was very negative: as markm says, AI is a *very long way* from understanding meaning…and I don’t think you can really evaluate writing without understanding meaning. However, I can see how the “game” aspect would really capture the attention of students. Perhaps this is something that would be useful for practice…but shouldn’t be used for assignment of final grades.

    An analogy: the early generations of aircraft simulators (the Link trainer, etc.) were useful for gaining experience in instrument flight, but there was no way you were going to get an instrument rating based purely on Link-trainer experience.

    And I bet that the Link trainer came much closer to simulating real aircraft dynamics than any present-day software does to simulating an intelligent reader.

  5. Walter E. Wallis says:

    Sounds like a step in the right direction. Ideally every student would write essays in flowery Spencerian script, but sometimes you take what you can get.

  6. I’m pretty sure the software is used to give students lots of practice with writing, not to produce a final grade. I don’t think anyone would argue a program is as good as a teacher’s evaluation.

  7. With regard to originality, I think there has been too much of it too early for a lot of young writers. Asking young writers to be original before they have a firm grounding in mechanics is not a good thing. Once the young writer can write in a mechanically correct manner, he can start to find his own voice.