Computers are able to read a person’s body language to tell whether they are bored or interested in what they see on the screen, according to a new study that could lead to empathetic robots and reactive online learning programmes.
Body-language expert Dr Harry Witchel of the University of Sussex found that by measuring a person’s movements as they use a computer, it is possible to judge their level of interest, for example by monitoring whether they display the tiny movements that people usually exhibit constantly, known as non-instrumental movements.
The study showed that when someone is highly engaged in what they are doing, they suppress these tiny involuntary movements, in the same way that a small child, normally constantly on the go, stares open-mouthed at cartoons on the television without moving a muscle.
The finding could have a significant impact on the development of artificial intelligence (AI).
“Being able to ‘read’ a person’s interest in a computer program could bring real benefits to future digital learning, making it a much more two-way process,” Dr Witchel said.
Future applications could include the creation of online tutoring programmes that adapt to a person’s level of interest, in order to re-engage them if they are showing signs of boredom.
It could even help in the development of companion robots, which would be better able to estimate a person’s state of mind. For experience designers such as film directors or game makers, the technology could also provide a complementary moment-by-moment reading of whether the events on the screen are holding the audience’s interest.