Illustration of two heads, one human and the other a robot or machine.


This is Part 4 of our five-part series on empathy and its relation to artificial intelligence, machine learning, and where technology is headed in the future. Follow the Kairos team as we explore empathy in the tech field and ways to implement it.


In our earlier posts we've discussed, and made the case for, empathy's growing importance in artificial intelligence. The next questions to ask are: "How do we make AI empathetic?", "How do we build emotion into our AIs?", and "Can we ever make AI feel?"

At Kairos, we believe the answer to the "how" question is in face analysis. Facial recognition allows software to identify and verify human faces, while emotion analysis allows software to measure and read the emotions on those faces. More importantly, facial recognition and emotion analysis look at each user as an individual and capture their specific human data.

I am Me, You are You

When you meet a new group of people you typically introduce yourself to each person, trading data if you will, learning their names as they learn yours. It would seem strange if you met a group of people and they announced themselves under one name; even a band or a sports team would tell you individually who's who. We are all different, and we want to be recognized as such.

Creating successful, empathetic, human-interactive AI means the AI needs to communicate with us at that same individualistic level. Human communication has to be processed as more than just data and numbers. It has to convince us that our AI experience really understands each one of us and our needs.

Crazy or Genius?

You may be thinking that using facial recognition and emotion analysis sounds cool, but perhaps too sci-fi to be practical. In Part 3 of our series we looked at a few programs that have already started to use these tools to advance their AI.

For those of you who don't want to jump to another article, here's a recent example: car companies like Honda and Toyota have already developed concept cars that read your facial expressions to respond to you better as you drive.

There are also people like Murray Shanahan, Professor of Cognitive Robotics at Imperial College London, who not only believes but preaches that we should be creating artificial general intelligence (AGI) that mimics our neurological makeup. He fears that AI development that forgoes human qualities will lead to a malevolent force with little regard for the morals and ethics that underpin our civilisation.

"Capitalist forces will drive incentive to produce ruthless maximisation processes. With this there is the temptation to develop risky things,"

- Murray Shanahan, Professor of Cognitive Robotics at Imperial College London

So, as we can see, the notion that AI fundamentally needs empathy (emotional intelligence) isn't just a neat idea. It's going to become a requirement.

How Does it Work?

Looking at your friends and family, you can easily match their faces to their names. You can also tell what someone may be feeling by looking at their facial expressions. AI can gain the same capabilities through facial recognition and emotion analysis.
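Under the hood, matching a face to a name typically means turning each face into a numeric "embedding" and comparing new faces against the stored ones. Here is a minimal sketch of that idea; the tiny three-number embeddings and the names are invented for illustration, not output from any real model:

```python
import math

def cosine_similarity(a, b):
    """Measure how alike two embeddings are (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def identify(face_embedding, known_faces, threshold=0.9):
    """Return the name whose stored embedding best matches, or None."""
    best_name, best_score = None, threshold
    for name, embedding in known_faces.items():
        score = cosine_similarity(face_embedding, embedding)
        if score > best_score:
            best_name, best_score = name, score
    return best_name

# Toy "enrolled" faces; real embeddings have hundreds of dimensions.
known_faces = {
    "alice": [0.9, 0.1, 0.3],
    "bob": [0.2, 0.8, 0.5],
}
print(identify([0.88, 0.12, 0.31], known_faces))  # → alice
```

The threshold is what separates "identify" from "reject": a face that matches no one closely enough comes back as unknown rather than being forced onto the nearest name.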

Kairos' Facial Recognition API allows AI to recognize, verify, and identify specific faces, while Kairos' Emotion Analysis API allows AI to determine which of the six core emotions someone is feeling. Both work by detecting specific facial features and patterns and measuring them against data about the human face that the models were trained on. For example, an AI program working with patients at a hospital could verify and identify each patient and report the emotion they are feeling. You could even program the machine to respond to patients in different ways depending on the emotion the AI recognizes.
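The hospital example could be outlined like this. To be clear, this is a hypothetical sketch, not the Kairos API itself: we assume the emotion-analysis step has already returned confidence scores for the six core emotions, and we simply pick a canned response for the strongest one:

```python
# Hypothetical responses keyed by emotion; the scores dict below stands in
# for whatever an emotion-analysis service would return for one face.
RESPONSES = {
    "joy": "Glad you're feeling well today!",
    "sadness": "I'm sorry you're having a hard time. Can I help?",
    "fear": "There's nothing to worry about. A nurse is on the way.",
    "anger": "I understand this is frustrating. Let me get someone.",
    "surprise": "Is everything alright?",
    "disgust": "Let me know if something is wrong.",
}

def respond_to_patient(emotion_scores):
    """Pick the dominant emotion and return it with a matching response."""
    dominant = max(emotion_scores, key=emotion_scores.get)
    return dominant, RESPONSES[dominant]

scores = {"joy": 0.05, "sadness": 0.72, "fear": 0.10,
          "anger": 0.04, "surprise": 0.06, "disgust": 0.03}
print(respond_to_patient(scores))
```

In a real deployment the response logic would of course be richer than a lookup table, but the shape is the same: measure the emotion, then branch on it.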

Kairos on Humanizing AI

So what do you do if you find that your customers are skeptical about the use of machines or are afraid of AI?

At Kairos, we ask, "Why?"

We have found that most people are not truly scared of AI; instead, they do not fully understand what AI is. In our opinion, this could be due to the robotic, machine-like stereotypes portrayed by the media and entertainment industries. Movies like Minority Report, 2001: A Space Odyssey, Ex Machina, and The Terminator all depict technology being used in negative ways that create fear.

Image of HAL 9000 from the movie 2001: A Space Odyssey

So, for customers to accept AI as part of a brand's experience, they need to understand what AI is.

Artificial intelligence isn't the only buzzword being thrown around; others you may have heard are machine learning and deep learning. Here is how we define these terms simply:

Artificial Intelligence: Computers/machines simulating human intelligence and/or behaviors.
Machine Learning: A type of AI that allows computers to learn on their own when exposed to new data.
Deep Learning: A branch of machine learning based on layered artificial neural networks.

By breaking the definitions down into simple terms, you can show your customers that this technology is in place to help us and to keep us in control, not the other way around.

This does not mean that AI feels empathy, or even understands it, as humans do. For humans, empathy is a trait taught and learned through personal growth and experience. Researchers are still trying to figure out how AI could experience emotions like a human, but for now AI can learn what empathy means and learn the patterns in micro-expressions that should trigger an empathetic response. This means that while AI can use facial recognition and emotion analysis, it is still up to the team of developers to decide how to apply them and how to teach the deep learning algorithms to keep learning from them.

To create an even better empathetic AI program or machine, developers can blend facial recognition and emotion analysis with other measurements, such as body language and voice patterns. Just as our face gives off clues to our emotions, so do our body and voice. The way we sit, or the volume at which we speak, could help AI verify that the micro-expression it's reading really is fear, happiness, or sadness. Combining these signals continues to humanize AI, and gets us closer to true harmony between man and machine.
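As a toy illustration of this kind of fusion, one simple approach is a weighted average of the per-channel scores. The channels, emotion scores, and weights here are all invented for the example; real multimodal systems use far more sophisticated models:

```python
def fuse_emotion_scores(face, voice, body, weights=(0.5, 0.3, 0.2)):
    """Combine per-channel emotion scores with a weighted average.

    Each argument maps an emotion name to a confidence in [0, 1];
    a channel that didn't detect an emotion simply omits it.
    """
    w_face, w_voice, w_body = weights
    emotions = set(face) | set(voice) | set(body)
    return {
        e: w_face * face.get(e, 0.0)
           + w_voice * voice.get(e, 0.0)
           + w_body * body.get(e, 0.0)
        for e in emotions
    }

face = {"fear": 0.6, "sadness": 0.3}   # micro-expression reading
voice = {"fear": 0.8}                  # e.g. raised pitch and volume
body = {"fear": 0.5, "sadness": 0.4}   # e.g. tense posture

fused = fuse_emotion_scores(face, voice, body)
print(max(fused, key=fused.get))  # → fear
```

When two or more channels agree, the fused score for that emotion rises, which is exactly the "verification" role the extra signals play in the paragraph above.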

So What AI Can You Build?

We're seeing teams and companies use facial recognition and emotion analysis to build empathetic AI programs that better serve humans and help people open up about their feelings in different situations. Companies in the healthcare, automotive, marketing, and education industries have also started to invest in empathetic AI to better serve their customers and business needs. There really are no limits.

We've answered the what, the who, and the how; now the only question that remains is, "When will you create your empathetic AI experience?"

Talk to an expert today - contact us.


Part 1: What is Empathy?

Part 2: Empathetic Machines Creating Jobs?

Part 3: Impressive Artificial Intelligence Using Empathy Now

Part 4: Is Facial Recognition and Emotion Analysis the Answer to Empathetic AI?

Part 5: The Future of Empathetic AI
