A tour of “emotionally intelligent” AI

Matt Levin Jul 16, 2024

Hume AI believes its AI voice product, which can recognize 48 human emotions, could be put to use as a personalized virtual assistant. Ilya Lukichev/Getty Images

Back in April, the startup Hume AI released a demo of what it calls the first emotionally intelligent artificial intelligence voice: “EVI,” short for “Empathic Voice Interface.”

Talking with EVI is a lot like chatting with one of the newer generations of AI-powered voice assistants — think Siri or Alexa, but with ChatGPT capability. You can ask EVI to read you a poem or explain what caused the French Revolution, and a soft, mild-mannered voice will recite a haiku or launch into a mini lecture on eighteenth-century France. EVI also has the bugs and lags you frequently encounter with AI voice bots.

What makes EVI different is that after you ask for a poem, a list of nuanced emotional expressions the AI claims it is detecting in your voice appears on the screen, from awkwardness and confusion to contempt and surprise. Hume AI says it parses up to 48 distinct emotional expressions.

“Those 48 go way beyond what emotional scientists have historically studied, because we can collect data at a much larger scale to train these models, and we can pick up on a much greater variety of nuance,” said Hume AI founder Alan Cowen.

Hume’s empathic AI is trained on podcasts, other media and recordings from psychological experiments the company conducted. Hume also sells AI that analyzes facial expressions; both products promise to help companies and chatbots better understand their customers. EVI costs about 10 cents per minute, though pricing varies by client.
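The story doesn’t detail Hume’s actual developer interface, but as a rough illustration, a client integration with an expression-analysis service of this kind might look something like the sketch below. The endpoint, field names and response shape here are all hypothetical assumptions, not Hume’s real API.

```python
import requests  # assumption: a plain REST-style call; Hume's real SDK differs

# Hypothetical endpoint and response shape, for illustration only.
API_URL = "https://api.example-empathic-ai.test/v1/analyze"

def analyze_clip(audio_path: str, api_key: str) -> list[dict]:
    """Upload an audio clip and return per-expression confidence scores."""
    with open(audio_path, "rb") as f:
        resp = requests.post(
            API_URL,
            headers={"Authorization": f"Bearer {api_key}"},
            files={"audio": f},
        )
    resp.raise_for_status()
    # Assumed shape: one score per expression label, e.g.
    # [{"name": "awkwardness", "score": 0.61}, ...] across 48 labels.
    return resp.json()["expressions"]

if __name__ == "__main__":
    scores = analyze_clip("caller.wav", "YOUR_API_KEY")
    for expr in sorted(scores, key=lambda e: e["score"], reverse=True)[:5]:
        print(f"{expr['name']}: {expr['score']:.2f}")
```

In a call-center setting like Lawyer.com’s, scores like these could be surfaced to an agent or fed back into the bot’s responses when frustration runs high.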

“We have customers in customer service, like Lawyer.com. We have big tech companies using different technologies we’ve developed,” Cowen said.

Lawyer.com uses Hume to improve its 1-800 line. Call centers are a natural fit for a technology that recognizes human frustration.

But Cowen has bigger aspirations: personal AI assistants that can really comprehend what you want, optimized for your well-being.

“It learns from you over time, so it’s more personalized to you,” Cowen said.

AI trained on your voice and facial expressions could one day tell you, “Hey, did you notice you get kind of tired and irritated around 3 p.m. every day?” That sounds helpful, until a beguiling robot voice gently reminds you that Starbucks Frappuccinos are half off until 4.

“We are more vulnerable to making unnecessary purchases in certain emotional states and at certain times of day,” said Ben Bland, who helped develop industry ethics standards for empathic AI at the Institute of Electrical and Electronics Engineers.

He also worries about AI doing to our emotions what the smartphone did to our attention spans.

“If you have a computer game that adjusts its settings according to some estimate of how excited you are and how much you’re enjoying the game, you could develop an addiction to the game,” Bland said. “You could develop emotional desensitization.”

Of course, all of Bland’s nightmare scenarios presuppose that a robot can actually read your emotional expressions in some scientifically reliable way, which is still a matter of debate.

“I don’t think these technologies work as well as is sometimes claimed by those marketing them,” said Andrew McStay, director of the Emotional AI Lab at Bangor University in the U.K.

McStay pointed to studies of existing emotion recognition tech that assign more negative emotions to Black men’s faces than white men’s, perpetuating harmful stereotypes.

But he said there’s also a broader question about what emotions really are and how much variation in emotional expression exists between people and cultures.

“So, in terms of what is often claimed about these technologies, that there is a biological program that can be discerned, and we don’t look towards the social and we don’t look towards the cultural, then that is problematic,” McStay said.
