Confused? Your computer can sense it

Computers can monitor students’ facial expressions and evaluate their engagement or frustration, according to North Carolina State researchers. That could help teachers track students’ understanding in real time, notes MIT Technology Review.

Perhaps it could even help massive open online courses (MOOCs), which can involve many thousands of students working remotely, become more attuned to students’ needs.

It also hints at what could prove to be a broader revolution in the application of emotion-sensing technology. Computers and other devices that identify and respond to emotion—a field of research known as “affective computing”—are starting to emerge from academia. They sense emotion in various ways; some measure skin conductance, while others assess voice tone or facial expressions.

The NC State experiment involved college students using JavaTutor software to learn to write code. The monitoring software’s conclusions about students’ states of mind closely matched their self-reports.

“Udacity and Coursera have on the order of a million students, and I imagine some fraction of them could be persuaded to turn their webcams on,” says Jacob Whitehill, who works at Emotient, a startup exploring commercial uses of affective computing. “I think you would learn a lot about what parts of a lecture are working and what parts are not, and where students are getting confused.”

  1. Michael E. Lopez says:

    I’m going to go ahead and say it — this is a silly idea. A stupid idea. A waste of time and energy as far as online courses and general educational technology are concerned. It’s also a little creepy.
    …it has non-trivial applications way down the line that are likely to fundamentally change the roles of machines in society.

    The hardest part is that machines are only ever going to be able to read emotions “on the average,” whereas the subjective experience of average biological states can vary tremendously from person to person.

    There will have to be a great deal of individual calibration that goes on — machines will have to “learn” how to read specific people.

    • Roger Sweeny says:

      At least as silly as the idea that a computer can hear what you are saying and write it down without you even touching the keyboard.