Teaching Machines to Feel: Part 1

By Jing-Han Ong

You - “Why you angry at me?”
Them - “No I’m not. I’m not angry at all.”

I’m sure that is a scene familiar to many of us. Maybe it’s with a family member, or a partner. How did you know they were upset in this situation? They said they weren't...

Well, you read the emotion. You realised that the ‘rational’ interpretation of their words needed to be flipped once you took into account the emotional cues they were giving off.

In a way, all decision making is coloured by emotion. Much academic research has shown that many ‘decisions’ are in fact post-rationalisations of subconscious reactions. Think about your favourite movie. Your general takeaway is "it is fantastic". Only when probed do you try to make sense of your feelings. Why did you like it?

This is one reason why movie and book recommendations really struggle. For one, how much you like a show really depends on your mood that day. Some days you want Inception; on others, you want Legally Blonde. And the same movie with the same plot can take you on completely different emotional journeys depending on editorial style. The Usual Suspects works so well because the editing holds back just enough that the twist at the end catches everyone by surprise.

Which is why it surprises me that the current discourse on artificial intelligence has barely touched on emotional empathy.

‘Artificial Intelligence’ is a fairly ambiguous term, but we can generally understand it as the development of computer systems able to perform tasks traditionally associated with human capabilities, such as problem solving, visual perception or speech recognition. In fact, most of these technologies are built on models of human cognition, such as neural networks. Essentially, we are trying to give machines the ability to think. But the logical next question, whether machines can ‘feel’, hasn’t really been discussed.

Computer decision models struggle because humans are ultimately still governed by emotional impulses. If a computer doesn’t account for emotions, it will never understand why a species capable of accessing all the information in the world at its fingertips only asks for things like cat videos.

The best butlers know what to give you before you even know you want it. If we want to advance the intuitiveness of machine intelligence, we need to make machines capable of emotional empathy: to understand why humans are so terrible at logic, and why our delights are so primitive.

Let’s take a step back now, and look at the two aspects of empathy development in children.

  1.  Learning how to identify what other people are feeling. In other words, the ability to detect displayed emotional cues.

  2.  The ability to act appropriately in a social situation according to other people’s feelings.

empathybabies.jpg

This can probably be summarised as emotional detection and reaction.

Children have to learn this. Some are better at it, some take a little while longer. I was always in that second group.

I remember when I was 6, another little girl was crying. I was really confused about why she was making so much noise. It was only when I saw all the other children rushing towards her that I realised, “Ah, that’s a sign of distress”. And also that you are meant to be comforting in this sort of scenario.

It’s funny how some people just know what to do or say in the right situations. I have to take a pause and mimic what the adults do. It’s how I learnt to cry at funerals.

As adults, these two processes usually happen simultaneously and so quickly that you probably never think of them as distinct steps. In fact, every day you probably detect and react to hundreds of emotional behaviours without even realising. Reading body language is integral to how we navigate social situations.

On a very rudimentary level, CrowdEmotion is mimicking that process. We are teaching machines empathy by giving them the capability to detect and react to emotions.

Thanks to thousands of volunteers who sent us videos of their faces, we have built a system that has learnt how to track facial muscle movement. Through a webcam, our software learns the edges that make up a human face. That means the computer first has to recognise what is a face (that is what the green dots below mark out) and what is just the surrounding background.

facedot.png
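To give a flavour of that first recognition step, here is a minimal sketch, not our production pipeline, using OpenCV's off-the-shelf Haar cascade to separate faces from their surroundings in a live webcam feed. A facial-landmark model would then refine each detected region into the individual dots shown above.

```python
import cv2

# Load the Haar cascade that ships with OpenCV for frontal-face detection.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

cap = cv2.VideoCapture(0)  # 0 = the default webcam

while True:
    ok, frame = cap.read()
    if not ok:
        break

    # The detector works on grayscale images.
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

    # Draw a box around everything the cascade thinks is a face; a landmark
    # model would then refine each box into the dots over eyes, brows and mouth.
    for (x, y, w, h) in faces:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)

    cv2.imshow("faces", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press 'q' to quit
        break

cap.release()
cv2.destroyAllWindows()
```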

Simultaneously, advances in computer vision mean we are now able to map eye-pupil movement to an x-y coordinate on the screen.

eyetrack.png

This means that the computer is able to track gaze attention: it knows exactly what you are looking at and how long you spend looking at it. As more data is fed into the system, the AI’s biometric identification capabilities will improve over time.
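As a toy illustration of what "how long you spend looking at something" can mean in practice, the sketch below takes a stream of gaze coordinates (one per frame, assuming a 30 fps webcam) and totals the time the gaze spends inside a few screen regions. The region names, sizes and sample points are invented for the example; they are not taken from our system.

```python
from dataclasses import dataclass


@dataclass
class Region:
    """A rectangular area of the screen, in pixels."""
    name: str
    x: int
    y: int
    w: int
    h: int

    def contains(self, gx: float, gy: float) -> bool:
        return self.x <= gx < self.x + self.w and self.y <= gy < self.y + self.h


def dwell_times(gaze_points, regions, seconds_per_sample=1 / 30):
    """Total up how long the gaze rested inside each region.

    gaze_points: iterable of (x, y) screen coordinates, one per video frame.
    seconds_per_sample: duration each sample represents (30 fps assumed here).
    """
    totals = {region.name: 0.0 for region in regions}
    for gx, gy in gaze_points:
        for region in regions:
            if region.contains(gx, gy):
                totals[region.name] += seconds_per_sample
                break
    return totals


# Hypothetical layout: a headline across the top and an advert on the right.
regions = [Region("headline", 0, 0, 1280, 200), Region("advert", 900, 200, 380, 520)]
samples = [(640, 100), (650, 110), (1000, 400), (1010, 420), (1005, 415)]
print(dwell_times(samples, regions))  # roughly {'headline': 0.07, 'advert': 0.10} seconds
```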

So from the above, we have a machine that has attained ‘Stage 1’: detecting how people feel. We are currently working on ‘Stage 2’: having the machine react to that information.

Maybe one day it will be able to give us movie recommendations based on emotion!


Interested in learning more?