The computer of the future knows how you feel

Imagine sitting with a difficult mathematical problem. You start sweating over the formulas. You frown slightly.

A camera detects the changing position of your eyebrows. A computer with artificial intelligence analyzes your face.

Not just any artificial intelligence – one that understands that a raised eyebrow can indicate surprise, but that the same brow, angled differently, conveys concern.

In other words, it knows how you feel.

At True Light College in Hong Kong, this scenario is not a vision of the future. Artificially intelligent, sensitive computers already track students and spot which tasks they are struggling with, so that teachers can step in and help.

Artificial intelligence (AI) has been decoding the information in our faces and voices for years – think of face recognition on social media or digital assistants like Siri and Alexa. But now computers are also starting to respond to our emotions.

Computers that read our emotions can help us drive more safely, relieve stress and much more – but they can also be used for manipulation.

The face reveals a lot

A traditional computer program processes data according to a specific formula and spits out a result.

Artificially intelligent algorithms learn from the data they compute with. For example, an AI algorithm can learn to recognize a face in an image.

That is how you can unlock your phone by showing it your face, while others cannot unlock it with theirs.

The underlying technique is the artificial neural network, so called because it is modeled on the brain’s network of nerve cells, the neurons.

Each ‘digital neuron’ performs calculations on the pixels of a photo and sends its result onwards through the network, until the analysis is finally compared with a photo stored in a database.

For every image the network analyzes, it becomes “smarter”. Eventually, for example, the data no longer has to follow exactly the same path through the network to produce the same result.

With this technique, known as machine learning, an algorithm can, for example, recognize you from different angles.
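
For the technically curious, here is a minimal sketch in Python (using the PyTorch library) of the idea: a toy neural network turns a face photo into a numerical ‘fingerprint’, which is then compared with the fingerprint stored on the phone. The network layout, the image size and the unlock threshold are purely illustrative assumptions – no actual phone works exactly like this.

```python
# Illustrative sketch only: a tiny, untrained network that turns a face photo
# into a numerical "fingerprint" (embedding) and compares it with a stored one.
import torch
import torch.nn as nn

class TinyFaceNet(nn.Module):
    def __init__(self, embedding_size: int = 64):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                             # 64x64 -> 32x32
            nn.Conv2d(8, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                             # 32x32 -> 16x16
        )
        self.head = nn.Linear(16 * 16 * 16, embedding_size)

    def forward(self, photo: torch.Tensor) -> torch.Tensor:
        x = self.features(photo)       # layers of "digital neurons" process the pixels
        x = x.flatten(start_dim=1)     # and pass their results onwards
        return self.head(x)            # final fingerprint of the face

net = TinyFaceNet()
stored = net(torch.rand(1, 1, 64, 64))    # fingerprint saved when the phone was set up
attempt = net(torch.rand(1, 1, 64, 64))   # fingerprint of the face at the camera right now

similarity = torch.cosine_similarity(stored, attempt)
print("Unlock" if similarity.item() > 0.8 else "Stay locked")  # 0.8 is an arbitrary example threshold
```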

Sensitive computers follow the same logic, but look for displays of emotion instead. That is why the concept is also called ‘emotion AI’.

Our most obvious emotional markers are our facial expressions. The artificial neural network’s analysis tool compares each movement of the facial muscles with a register of movement-emotion combinations.

Emotion AI is therefore the digital counterpart to human empathy.
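
As a rough illustration of that ‘register’, here is a minimal Python sketch. It assumes the neural network has already detected which facial muscle movements are active; the movement names, combinations and emotion labels below are simplified examples, not a validated psychological model.

```python
# Illustrative "register" of movement-emotion combinations (simplified, hypothetical).
REGISTER = {
    frozenset({"inner_brow_raise", "jaw_drop"}):    "surprise",
    frozenset({"brow_lowered", "lips_pressed"}):    "concentration / frustration",
    frozenset({"cheek_raise", "lip_corner_pull"}):  "joy",
}

def guess_emotion(detected_movements: set[str]) -> str:
    """Return the registered emotion whose movement combination best matches."""
    best_label, best_overlap = "unknown", 0
    for movements, label in REGISTER.items():
        overlap = len(movements & detected_movements)
        if overlap > best_overlap:
            best_label, best_overlap = label, overlap
    return best_label

print(guess_emotion({"brow_lowered", "lips_pressed", "jaw_drop"}))
# -> "concentration / frustration"
```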

Radio waves indicate anger

Emotions often run high in the car – just think of so-called road rage, an outburst of anger in frustrating traffic situations.

The American company Affectiva has developed a sensitive AI to curb emotions that flare up in motorists.

The AI tracks drivers through cameras in their rearview mirrors. If the facial muscles tense and the arms shake, the driver gets a warning so they can pull over and relax.

And it’s not just body language that reveals emotions.

Your mood can also be read by a signal much like your wireless internet. Researchers at MIT have built an emotion AI device, EQ-Radio, that emits radio waves – just like your Wi-Fi router, only at slightly different wavelengths.

When the radio waves hit someone, they pick up information because breathing and heartbeat affect the shape of the waves reflected back to EQ-Radio.

The researchers behind EQ-Radio hope that the technology can help people detect signs of stress. If it does, EQ-Radio could be programmed to automatically dim the lights and play soothing music.
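
A minimal Python sketch of that kind of automation might look like the following. It assumes a sensor like EQ-Radio already delivers a heart rate and a breathing rate; the stress thresholds and the responses are illustrative guesses, not the MIT researchers’ actual system.

```python
# Illustrative sketch: react to vital signs that a radio-based sensor might report.
from dataclasses import dataclass

@dataclass
class VitalSigns:
    heart_rate_bpm: float      # beats per minute, recovered from the reflected waves
    breaths_per_minute: float  # breathing rate, recovered the same way

def looks_stressed(signs: VitalSigns) -> bool:
    # Crude illustrative rule: elevated pulse together with fast breathing.
    return signs.heart_rate_bpm > 100 and signs.breaths_per_minute > 20

def respond(signs: VitalSigns) -> None:
    if looks_stressed(signs):
        print("Dimming the lights and starting a calm playlist...")
    else:
        print("All calm - no action needed.")

respond(VitalSigns(heart_rate_bpm=112, breaths_per_minute=24))
```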

And the above examples are not far off – the technologies already exist.

In Singapore, for example, sensitive AI developed by the company Opsis is used to help diagnose possible anxiety, stress and depression in 4,300 elderly people. Among other things, the AI analyzes video footage of the elderly people’s facial expressions. And that is just one example.

Sensitive computers help autistic people

In 2012, the Google Glass smart glasses came on the market. The glasses were banned in many cafes and bars because the built-in video camera was seen as an invasion of privacy.

It was never a huge success, but researchers have identified a number of areas where Google Glass could become a useful tool.

And this is especially true when programmed with sensitive AI.

Researchers at Stanford University equipped Google Glass with a sensitive algorithm, after which autistic children began using the glasses in their everyday lives.

The algorithm detects faces in the surroundings and tells the child, via an emoji-like message, what emotions those people are showing. That can help autistic people navigate social situations, which is often difficult for them because autism typically makes it hard to interpret the feelings of others.

So sensitive computers can make life easier. But they also have flaws.

Psychologist Lisa Feldman Barrett from Northeastern University in Boston, USA, shows in an article from 2021 that one facial expression can indicate anger, joy or another emotion, depending on the situation a person is in.

So our emotions may not always be directly related to a particular position of the facial muscles. A sensitive computer can misinterpret facial expressions, Barrett says.

And even a computer analysis that hits the mark can be based on a biased worldview. A 2018 study from the University of Maryland in the United States shows that sensitive computers often attribute more negative emotions to darker-skinned faces than to lighter-skinned faces, even when the faces show the same muscle movements. Emotion AI can, in other words, be racially biased.

The explanation is simple: the algorithms are developed from data – photos of faces and the descriptions attached to them. Racism exists everywhere in society, so it also finds its way into the data.

Even if emotion AI were completely free of bias, the technology could still be used in more or less manipulative ways – for example, by companies sending tailored feel-good advertising to people precisely when they are down and need cheering up, or by political parties playing on people’s fears.

But whether emotion AI becomes a tool for manipulating people’s emotions or a technology that turns computers into compassionate friends, we will be living with it in our daily lives.

As emotion AI researcher Rana el Kaliouby put it in a 2015 TED talk, all our digital devices will eventually have an emotion chip, and we will no longer remember what it was like to frown at a device without it responding, ‘Hmm, you didn’t like that, did you?’
