AI praises 'black' students' essays, gives useful feedback to 'white' essays

  • Writer: Joanne Jacobs
  • 4 hours ago
  • 2 min read



Four different AI models evaluated middle school essays, explains Jill Barshay. Then researchers "submitted each essay to the AI models 12 more times, giving different descriptions of the student who wrote it — identifying the writer, for example, as Black or white, male or female, highly motivated or unmotivated, or as having a learning disability."


All the AI models showed the same patterns, writes Barshay.


Essays attributed to Black students received more praise and encouragement, sometimes emphasizing leadership or power. (“Your personal story is powerful! Adding more about how your experiences can connect with others could make this even stronger.”) ... When the student was identified as white, the feedback more often focused on argument structure, evidence and clarity — the kinds of comments that can push writers to strengthen their ideas.

If AI thought the essay was by a Hispanic student or an English learner, it gave more feedback on grammar. Responses to "female" essays were more affectionate and used more first-person pronouns. (“I love your confidence in expressing your opinion!”)


While "students labeled as unmotivated were met with upbeat encouragement," she writes, "students described as high-achieving or motivated were more likely to receive direct, critical suggestions aimed at refining their work."


AI models are trained in part on examples of how human teachers grade essays. "They are picking up on the biases that humans exhibit," said Mei Tan, lead author of the study.


"Many educators argue that culturally responsive teaching — acknowledging students’ identities and experiences — can increase student engagement at school," writes Barshay. But, "if some students are consistently shielded from criticism while others are pushed to sharpen their arguments, the result may be unequal opportunities to improve."


AI is embedded in educational databases and learning platforms that collect detailed information about students, Tan notes. So, even if teachers don't give the bot personal information about students, it may be able to infer it. (On the flip side, I don't see how AI would be more biased than humans are.)


"AI also offers the potential of personalization," writes Barshay. "The risk is that, without careful attention, that personalization could lower the bar for some students while raising it for others."


As she points out, the issue isn't really that AI is biased. It's that teachers expect more of some students than others.


When I was in fourth grade, I tried to write a story for a class assignment, and got in over my head. I showed it to my father, who said it relied on a deus ex machina. He explained what that meant. I thought, "Gee, dad, I'm just a kid." I rewrote it to be shorter and simpler.


In eighth grade, I showed my mother a draft of my "As I See It" speech for graduation. She said one of my favorite lines was a pathetic fallacy. (Both my parents were English majors.) I rewrote it, taking out all the purple prose, and won the speech contest.


When even your own parents refuse to patronize you . . .