Humans communicate with machines every day, and work is constantly being done to make interfaces more intuitive, but almost entirely with regard to their mechanical aspects.

There is no question that the machines are in control: you communicate in a way they understand or you are stuck. Moore's Law has kept processors and raw horsepower racing ahead, but interfaces, and machines' understanding of the people using them, are still trapped in the 1960s.

The Humaine project wants to change that by bringing together specialists and scholars from very different disciplines to create the building blocks and tools needed to give machines so-called ‘soft’ skills, like understanding emotions.

Professor Roddy Cowie, coordinator of the EU-funded project, says earlier attempts to capture emotion missed the mark. “When they developed databases, the recordings were nothing like the way emotion appears in everyday action and interaction, and the codes they used to describe the recordings would not fit the things that happen in everyday life,” he explains.

Humaine took a different tack: instead of leaving programmers and engineers to tackle the job in solitude, it set up teams from disciplines as different as philosophy, psychology and computer animation.

The psychologists studied and interpreted the signals people give out, signifying different emotional states from boredom through to rage. Part of this is simply what is being said, but there is also the tone in which it is being said, the expression on the face, and smaller signals like eye gaze, hand gestures and posture.

Put all of these together and the psychologists and IT professionals can collaborate on a database that allows emotion to be interpreted and responded to.
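To make the idea concrete, here is a minimal sketch of how such multimodal observations might be combined in software. The emotion labels, modality names and the simple averaging rule are illustrative assumptions, not Humaine's actual coding scheme or fusion method.

```python
from dataclasses import dataclass
from typing import Dict

# Hypothetical emotion labels; Humaine's own coding scheme is far richer than this.
EMOTIONS = ["boredom", "interest", "frustration", "rage"]


@dataclass
class ModalityScores:
    """Per-modality estimates of how strongly each emotion is expressed (0..1)."""
    speech_content: Dict[str, float]       # what is being said
    speech_tone: Dict[str, float]          # how it is being said
    facial_expression: Dict[str, float]    # expression on the face
    gaze_gesture_posture: Dict[str, float] # smaller signals: gaze, gestures, posture


def fuse_emotion(obs: ModalityScores) -> str:
    """Naive late fusion: average the per-modality scores and pick the strongest emotion."""
    modalities = [
        obs.speech_content,
        obs.speech_tone,
        obs.facial_expression,
        obs.gaze_gesture_posture,
    ]
    averaged = {
        emotion: sum(m.get(emotion, 0.0) for m in modalities) / len(modalities)
        for emotion in EMOTIONS
    }
    return max(averaged, key=averaged.get)


if __name__ == "__main__":
    # A flat voice and wandering gaze outweigh moderately engaged words.
    observation = ModalityScores(
        speech_content={"boredom": 0.2, "interest": 0.3, "frustration": 0.1, "rage": 0.0},
        speech_tone={"boredom": 0.7, "interest": 0.1, "frustration": 0.1, "rage": 0.0},
        facial_expression={"boredom": 0.6, "interest": 0.2, "frustration": 0.1, "rage": 0.0},
        gaze_gesture_posture={"boredom": 0.8, "interest": 0.1, "frustration": 0.0, "rage": 0.0},
    )
    print(fuse_emotion(observation))  # -> "boredom"
```

The point of the sketch is simply that no single channel is decisive: the judgement emerges from putting the signals side by side, which is why the psychologists' descriptions and the engineers' data structures have to be designed together.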

“Then the people who know about communications feed information to people whose job it is to get computers to generate sophisticated images,” says Cowie.

The End Of The Nice, But Dumb, Avatar

In trials in Scotland and Israel, museum guides, in the form of handheld PDAs with earpieces and microphones, monitor visitors’ levels of interest in different types of display and react accordingly. “While this is still at a basic level, it is a big step up from a simple recorded message,” Cowie points out.
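A toy sketch of the "react accordingly" step might look like the following. The thresholds and canned responses are invented for illustration; they are not the logic used in the museum trials.

```python
def guide_response(interest: float) -> str:
    """Toy policy: choose what the handheld guide does next from an estimated interest level (0..1)."""
    if interest > 0.7:
        return "Offer extra detail and point to related displays on the same theme."
    if interest > 0.3:
        return "Finish the standard description, then gauge interest again."
    return "Wrap up quickly and suggest a different type of display."


for level in (0.9, 0.5, 0.1):
    print(f"interest={level:.1f}: {guide_response(level)}")
```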

At another museum in Germany, a large avatar called Max spices up the presentation by interacting with children. “Max is not very deep, but he is very entertaining, and he engages the kids,” according to Cowie.

Designers have also used the techniques to monitor the emotions of people playing video games and improve the design accordingly. Possible applications include learner-centred teaching, where students’ interest levels can be monitored and responded to, and more user-friendly manuals for, say, installing computer software.

“People automatically assume the work is aimed towards full interaction between humans and machines, rather like HAL from 2001: A Space Odyssey,” says Cowie. “That may never happen. Humaine’s philosophers have thought through carefully whether we should allow it to,” he adds. Even if it does go that way, it is certainly not any time soon, he notes.

But the path to emotional machines is being paved today. Cowie and his colleagues have already set up a new project to tie the threads together and come up with an agent which can truly interact using voice. Here, new advances in speech recognition technology from other projects will be necessary for full interaction.

This is a simplified account of a highly complex project that might not come to full fruition for another 20 or 30 years, although some of the technological threads it has produced are already yielding concrete results and applications.

“We’ve developed systems for recognising emotion using multiple modalities and this puts us very much at the leading edge of recognition technology,” says Cowie. “And we’ve identified the different types of signal which need to be given by an agent – normally a screen representation of a person – if it is going to react in an emotionally convincing way.”
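The second half of that quote, generating convincing signals rather than just recognising them, can be pictured as a mapping from the user's detected state to a coordinated bundle of output channels. The table below is a hypothetical illustration; the inventory of signals Humaine identified is far more nuanced.

```python
from typing import NamedTuple


class AgentSignals(NamedTuple):
    """Output channels an on-screen agent must coordinate to seem emotionally convincing."""
    facial_expression: str
    voice_tone: str
    gesture: str


# Illustrative mapping only, not Humaine's specification.
RESPONSE_TABLE = {
    "boredom": AgentSignals("raised eyebrows", "livelier pace", "open, inviting hand gesture"),
    "interest": AgentSignals("smile", "warm, steady tone", "encouraging nod"),
    "frustration": AgentSignals("concerned look", "calm, slower tone", "placating palms-down gesture"),
    "rage": AgentSignals("neutral, attentive face", "soft, apologetic tone", "still posture"),
}


def plan_agent_response(user_emotion: str) -> AgentSignals:
    """Pick a coherent set of signals; mismatched channels are what makes an avatar feel 'dumb'."""
    return RESPONSE_TABLE.get(user_emotion, AgentSignals("neutral", "neutral", "none"))


print(plan_agent_response("frustration"))
```

The design point is that the channels have to agree with one another; a smiling face delivered in a flat voice undoes the effect.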

In the meantime, plenty of other applications will present themselves. “As our interactions with machines get more and more pervasive, it becomes harder and harder to ignore the emotional element. Taking it into account will become a routine part of computer science courses and computer development,” Cowie concludes.

Further reading: Emotional Machines, Cordis