Clear writing requires clear thinking. If bots do the writing, will students still do the thinking?
The new AI writing software is strikingly sophisticated, writes John Symons, who teaches writing-intensive philosophy courses, on Return. He worries that artificial intelligence will "make us less intelligent."
"Learning to write has helped generations of students to improve their ability to think," he writes. Grades in humanities classes depend heavily on the quality of written work.
With AI writing bots such as Jasper AI already on the market, Symons expects professors to see bot-written essays this fall.
I tried giving some of these systems standard topics that one might assign in an introductory ethics course and the results were similar to the kind of work that I would expect from first-year college students. . . . What it said about Kant’s categorical imperative or Mill’s utilitarianism, for example, was accurate. And a discussion of weaknesses in Rawlsian liberalism generated by GPT-3 was stunningly good.
"After a little editing, GPT-3 produced a copy that would receive at least a B+ in one of our large introductory ethics lecture courses," Symons writes.
He tried "quirky, idiosyncratic assignments." The system produced work that might get a C+ or B-.
A plagiarism detector showed no issues.
In short, writes Symons, professors will not be able to center humanities courses on research and writing.
Should we concentrate on handwritten in-class assignments? Should we design more sophisticated writing projects? Multiple drafts?
He is not optimistic.
I wonder what percentage of college students are there to learn, not just to qualify for a better job?
This is just plagiarism: the AI 'writing' algorithms he's so impressed by are simply recomposing texts found online. That's why they resemble the work of first-year students so closely: both are cribbing from the same materials.
What has actually improved is the structural and semantic analysis of texts. Texts can now be parsed down to the level of individual sentences, then described, categorized and organized on that basis. That, in turn, makes it possible to assemble 'original' essays on a given subject by drawing on multiple corresponding examples.
Frankly the results aren't that impressive. I have the sense that people writing on this subject, who intend to warn us of the ramifications, are exaggerating their response to…
Here's a shorter (1:43) video on the relationship of writing to learning to think. Sadly, most writing assigned in school is organized to impede thinking.
https://youtu.be/EXYzO95XU20
It seems to me that philosophy classes should be more about the reading (and class discussion) than anything else. Writing is an important skill, but if an AI can do it, I don't know what we can do about it. When electric lights took over, I'm sure there were people complaining that electric light was too harsh and lacked the charm of warm candlelight or lamplight. They got over it or died out.
Maybe we've reached peak human...
This is backwards. The only useful skills the humanities/liberal arts ever gave students were the ability to write and speak well. The content, which professors loved, was just the conduit. To learn to think well, you need deep, difficult ideas, and those came from the Western canon. That content had the added benefit of a long, well-vetted record of what other great thinkers made of it.
Writing teaches disciplined thinking, but before you can think, you need free and open discussion. As long as a school imposes speech restrictions, it cannot produce people who think well or become educated, except those who learn elsewhere.
Jordan Peterson lays out how writing trains thinking. And thinking comes from talking t…
How will leaders and managers learn to communicate their intent clearly to subordinates, orally or in writing? How will our leaders write clear laws and effectively debate and analyze issues? AI will not help.