Deep Dives

How emotionally intelligent AI can improve the way humans interact

Affectiva automotive AI analyzes facial and vocal expressions to identify expressions, emotion and reactions
Courtesy of Affectiva

Computer scientist Rana el Kaliouby is the CEO and co-founder of Affectiva, a fast-growing emotion recognition software startup that spun out of her work with Rosalind Picard at the MIT Media Lab.

We recently spoke with Rana about what led her to explore human/computer interaction, what she’s learning about emotion and artificial intelligence, her advice for other female founders in tech, and more.

What gave you the idea to work in this area? What problem—or problems—are you trying to solve?
When I was doing my PhD at Cambridge University, I was constantly using my laptop to communicate with my family back home in Egypt. I realized I was spending more time interacting with technology than with any other human being. And as I communicated with my family, they had no clue how I was feeling except for the smiley or sad face emojis that I could send them.

That’s when I had this “a-ha” moment. All the nuances of my emotions were being lost in cyberspace! But what if technology and our devices could understand us in the same way that people do?

This has driven my mission to build Emotion AI, and now Human Perception AI. As our relationship with technology becomes more relational and personal, artificial intelligence needs to be able to understand and interact with us in the same way other humans do. AI needs not only IQ, but EQ—emotional intelligence—as well. At Affectiva, we believe that by giving machines the ability to understand all things human, we can improve the way we work, live and interact with one another.

Affectiva develops software that understands people’s emotional and cognitive states by analyzing their facial expressions. Can you explain the science behind this? What is it possible to know about someone based on their facial expressions? What can’t you learn from facial expressions alone?

Affectiva’s Human Perception AI is built on deep learning, computer vision, speech science, and massive amounts of real-world data. Specifically, we use an optical sensor such as a smartphone camera or webcam to identify a human face in real time. Then, we use computer-vision-based deep learning networks to map the texture and movements of the face to a number of facial expressions, such as a smile, a smirk, or a brow furrow. These facial expressions are like building blocks that combine in many different ways to communicate a wide range of emotional and cognitive states—from joy and surprise to confusion and attention.
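The “building blocks” idea can be sketched in a few lines: per-frame expression activations are combined into scores for candidate emotional states. Everything below—the expression names, weights, and activation values—is hypothetical, illustrating the combination step rather than Affectiva’s actual model.

```python
# Hypothetical per-frame expression activations in [0, 1], as a face
# analysis model might emit them for a single video frame.
frame = {"smile": 0.9, "brow_furrow": 0.1, "eye_widen": 0.2, "lip_press": 0.0}

# Each emotional state is scored from a weighted combination of expression
# building blocks. The weights here are invented for illustration.
EMOTION_WEIGHTS = {
    "joy":       {"smile": 1.0, "brow_furrow": -0.5},
    "confusion": {"brow_furrow": 0.8, "lip_press": 0.4, "smile": -0.3},
    "surprise":  {"eye_widen": 1.0, "brow_furrow": 0.3},
}

def score_emotions(activations):
    """Combine expression activations into per-emotion scores."""
    scores = {}
    for emotion, weights in EMOTION_WEIGHTS.items():
        scores[emotion] = sum(
            w * activations.get(expr, 0.0) for expr, w in weights.items()
        )
    return scores

scores = score_emotions(frame)
top = max(scores, key=scores.get)
print(top, round(scores[top], 2))  # joy 0.85
```

A real system would learn these mappings from labeled data rather than hand-set weights, but the structure—many expression channels feeding a smaller set of emotional-state estimates—is the same.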
While facial expressions can give us insight into a person’s emotions, our ultimate goal is to develop multi-modal Human Perception AI that can understand people in the same way that humans do: based not only on facial expressions, but also on the other channels people use to express emotion, such as voice and gestures. Only 7 percent of how people communicate their emotions is via words—the rest is non-verbal. At Affectiva, we started with the analysis of facial expressions of emotion, and have now added capabilities to analyze acoustic and prosodic features of the voice, such as tone, loudness, and tempo.
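Two of the prosodic features mentioned—loudness and tempo—can be sketched directly from a raw waveform. The signal below is synthetic (a 440 Hz tone pulsed on and off three times in one second, standing in for three “syllables”); real systems analyze recorded speech with far richer features, so treat this purely as an illustration of the kind of measurement involved.

```python
import math

SAMPLE_RATE = 8000  # samples per second (assumed for this toy example)

def envelope(t):
    """On/off amplitude envelope: 3 bursts per second, each on for half a cycle."""
    phase = (t * 3 / SAMPLE_RATE) % 1.0
    return 1.0 if phase < 0.5 else 0.0

# One second of synthetic "speech": a 440 Hz tone gated by the envelope.
signal = [
    math.sin(2 * math.pi * 440 * t / SAMPLE_RATE) * envelope(t)
    for t in range(SAMPLE_RATE)
]

def rms_loudness(frame):
    """Root-mean-square energy: a simple per-frame loudness measure."""
    return math.sqrt(sum(x * x for x in frame) / len(frame))

# Slice the signal into 25 ms frames and measure loudness over time.
frame_len = SAMPLE_RATE // 40  # 200 samples = 25 ms
frames = [signal[i:i + frame_len] for i in range(0, len(signal), frame_len)]
energies = [rms_loudness(f) for f in frames]

# Count runs of "loud" frames as a crude tempo proxy: each energy burst
# approximates one syllable.
THRESHOLD = 0.2
voiced = [e > THRESHOLD for e in energies]
bursts = sum(1 for i, v in enumerate(voiced) if v and (i == 0 or not voiced[i - 1]))
print("energy bursts per second:", bursts)  # 3, matching the pulsed envelope
```

The loudness curve and burst rate are the raw material; mapping such features to perceived tone or emotional state is the learned part of the problem.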
What kinds of data have you gathered from your software, and what does it tell us about human emotions? What doesn’t it tell us?
To date, we have analyzed more than 8 million faces in 87 countries around the world, building the world’s largest emotion data repository. Based on this data, we are able to train our algorithms to identify key emotions such as anger, fear, and joy. This data also gives us more insight into the different ways that people express emotion. For example, one study found that women are more likely to express positive emotion through smiling, while men show more negative emotions associated with anger. Specifically, women smiled 32 percent more than men, and their smiles were longer in duration. Meanwhile, men express negative emotion primarily through anger, showing anger 14 percent more than women.

That said, there is always the opportunity to learn more. We are constantly collecting more data to create a robust dataset that is representative of different demographics and use cases that the AI will interact with. This is incredibly important not only for the technology to work, but also in order to avoid algorithmic bias and deploy AI that is ethical.

The Affectiva homepage declares, “Affectiva human perception AI understands all things human.” Not even humans necessarily understand all things human. Can you explain what you mean by this?

We have spent the past few years defining the category of Emotion AI, or technology that can understand human emotions. But as people interact with AI in more and more settings, there’s much more that should be understood than just emotions, so we’re pioneering Human Perception AI—software that can detect not only nuanced human emotions, but also complex cognitive states, behaviors, activities, interactions, and objects people use. This is indeed a bold vision and we are in the early days of developing Human Perception AI. But it is an exciting and ambitious mission—one that will not only transform how we fundamentally communicate with our devices and technology, but also with one another, since so much of communication today is mediated by technology.

Your work explores what makes a machine emotionally intelligent. It seems that a lot of actual human beings lack emotional intelligence! Do you see potential for machine learning to improve our own relationship to emotion?

I think there’s definitely the potential for AI to improve the way we interact with one another and re-inject emotion into our digital interactions. When we interact with people online, we are missing all of the nonverbal cues that come with a face-to-face conversation. One area that comes to mind is online bullying: I think people are emboldened to say things they wouldn’t say in real life, because they don’t see the effect a mean message has on the person receiving it. My hope is that, by incorporating the emotional element of communication into digital interaction, we can start to take steps toward more empathetic, humane and authentic online communication.
Affectiva currently works in two arenas: automotive and market research. How do you determine which areas to prioritize as you deploy your technology? Is there another area where you’re looking to expand?
There are tons of potential applications for Emotion AI across industries such as market research, advertising, healthcare, robotics, automotive, education, and more. Market research was one of the areas where we quickly saw commercial interest after spinning out of MIT Media Lab in 2009. Through our partnerships with leading market research firms, 25 percent of Fortune Global 500 companies use Affectiva’s technology to gain deeper insights into audience reactions to content and optimize campaigns and media spend accordingly.

In the last couple of years, we’ve also seen a lot of interest from the automotive industry. Human Perception AI has the ability to completely transform the transportation experience when it comes to both road safety and the occupant experience.

For example, cars can be built with advanced driver state monitoring solutions that can detect signs of dangerous driving behavior such as drowsy or distracted driving. And in the event that dangerous driving behavior is detected, the vehicle can alert the driver or intervene. This will become especially important as semi-autonomous vehicles become more prevalent, as the vehicle will need to be able to determine if the human driver is ready to take back control of the wheel.

As far as the driving experience, as automated mobility comes to fruition, it will no longer be about who is behind the wheel. Instead, there will be a newfound focus on occupant experience. Human Perception AI will enable cars to optimize the ride based on who is in the vehicle, their mood, and how they are interacting with others in the vehicle and the vehicle’s system itself to create a more comfortable and enjoyable experience.
You and your co-founder are both female. Have you encountered any barriers that you would attribute to being a woman-led technology company? If so, how have you overcome them?
Inequality exists in nearly all corners of our society, but the tech industry in particular is still a "boys club" in many ways. Women make up just 26 percent of tech jobs, and hold only 11 percent of leadership positions. Equally concerning is the fact that female-founded companies receive less than 3 percent of all VC funding, and that only 8 percent of partners at top 100 venture firms are women.

This is something I’ve experienced first-hand as a female founder and CEO, and I feel a responsibility to use my platform to enact change. A few of the strategies that I’ve used to overcome these challenges, and that I encourage other women in my network to pursue, include:

  • Promoting financial literacy for women—teaching women these skills early on, in academic settings and also as we mentor other young women.
  • Creating our own networks and opportunities—finding other women (and supportive men) who can advocate for you, and who have experience navigating the challenges you may face in your industry. These connections can grow into a network of people you can tap for new job opportunities, advice or mentorship.
  • Closing the confidence gap—being assertive when talking about your beliefs, your company’s growth, your technology’s potential, or the like.

Since Evoke is a community of optimists, we like to ask everyone: What do you do to maintain your optimism?
Running a startup is definitely an emotional roller coaster, but there are a few things that keep me optimistic:
  1. I always go back to the “why.” For me, it’s Affectiva’s mission to humanize technology, and why we’re doing it—which is ultimately to improve people’s lives.
  2. Mental strength and optimism are very much tied to physical strength. I insist on staying active through jogging, weight training, and yoga, but my absolute favorite is taking Zumba classes.
  3. Paying it forward—nothing brings me more joy than making time to mentor a young entrepreneur, or help a fellow female founder. One of the cool ways we do this is through Affectiva’s internship program, which features a really diverse group of students ranging from high school to undergraduate and graduate school.

Posted: July 9, 2019
Edition: Data/Driven