IPFS News Link • Robots and Artificial Intelligence

Robots That Know If We're Sad Or Happy Are Coming

By Masha Borak

New York-based startup Hume AI has debuted a new "emotionally intelligent" AI voice interface that can be built into applications ranging from customer service and healthcare to virtual and augmented reality.

The beta version of the product, named Empathic Voice Interface (EVI), was released after the company secured a US$50 million Series B round of financing led by EQT Ventures. Union Square Ventures, Nat Friedman & Daniel Gross, Metaplanet, Northwell Holdings, Comcast Ventures and LG Technology Ventures also participated in the funding.

The conversational AI is the first to be trained to understand when users are finished speaking, predict their preferences and generate vocal responses optimized for user satisfaction over time, Hume AI says in a release.
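One of the capabilities mentioned above, detecting when a user has finished speaking, can be illustrated with a deliberately naive sketch. This is not Hume AI's method; it is a hypothetical silence-based heuristic, with invented function names and thresholds, showing the simplest version of the problem a production system must solve far more robustly.

```python
# Hypothetical end-of-turn heuristic: the speaker has "probably finished"
# when the last few frames of audio fall below a silence threshold.
# All names and threshold values here are invented for illustration.

def rms(frame):
    """Root-mean-square energy of one frame of audio samples."""
    return (sum(s * s for s in frame) / len(frame)) ** 0.5

def likely_finished(samples, frame_size=160, silence_rms=0.01,
                    silent_frames_needed=5):
    """Return True if the trailing `silent_frames_needed` frames are all
    quieter than `silence_rms`, i.e. the speaker has likely stopped."""
    frames = [samples[i:i + frame_size]
              for i in range(0, len(samples) - frame_size + 1, frame_size)]
    if len(frames) < silent_frames_needed:
        return False
    return all(rms(f) < silence_rms for f in frames[-silent_frames_needed:])
```

A real system would combine acoustic cues like these with linguistic context (is the sentence grammatically complete?), which is presumably part of what training on millions of conversations provides.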

"The main limitation of current AI systems is that they're guided by superficial human ratings and instructions, which are error-prone and fail to tap into AI's vast potential to come up with new ways to make people happy," the company's founder and former Google scientist Alan Cowen says. "By building AI that learns directly from proxies of human happiness, we're effectively teaching it to reconstruct human preferences from first principles and then update that knowledge with every new person it talks to and every new application it's embedded in."

The voice model was trained on data from millions of human interactions and built on a multimodal generative AI that integrates large language models (LLMs) with expression measures. Hume calls this an empathic large language model (eLLM), and it lets the product adjust its wording and voice tone based on the context and the user's emotional expressions.
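The eLLM idea, conditioning a generated response on measured emotional expression, can be sketched in toy form. Hume's actual model generates language and prosody jointly from a multimodal network; the code below is a hypothetical illustration of the interface shape only, with invented function names and emotion labels.

```python
# Toy sketch of the eLLM concept: a reply is conditioned on vocal-expression
# scores (label -> strength in 0..1). Purely illustrative, not Hume's API.

def pick_tone(expression_scores):
    """Choose a response tone from the dominant measured expression."""
    dominant = max(expression_scores, key=expression_scores.get)
    return {"sadness": "gentle", "joy": "upbeat",
            "anger": "calm"}.get(dominant, "neutral")

def respond(text, expression_scores):
    """Tag a reply with a tone; a real eLLM would instead generate the
    words and the vocal delivery together, conditioned on these scores."""
    tone = pick_tone(expression_scores)
    return f"[{tone}] Acknowledged: {text}"
```

The point of the sketch is the data flow, expression measurements entering the generation step alongside the text, rather than any particular mapping of emotions to tones.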