A new “empathic voice interface” launched today by Hume AI, a New York–based startup, makes it possible to add a range of emotionally expressive voices, plus an emotionally attuned ear, to large language models from Anthropic, Google, Meta, Mistral, and OpenAI, portending an era when AI helpers may more routinely get all emotional on us.
“We specialize in building empathic personalities that speak in ways people would speak, rather than stereotypes of AI assistants,” says Hume AI cofounder Alan Cowen, a scientist who has coauthored a number of research papers on AI and emotion, and who previously worked on emotional technologies at Google and Facebook.
WIRED tested Hume’s latest voice technology, called EVI 2, and found its output to be similar to that developed by OpenAI for ChatGPT. (When OpenAI gave ChatGPT a flirtatious voice in May, company CEO Sam Altman touted the interface as feeling “like AI from the movies.” Later, a real movie star, Scarlett Johansson, claimed OpenAI had ripped off her voice.)
Like ChatGPT, Hume is far more emotionally expressive than most conventional voice interfaces. If you tell it that your pet has died, for example, it will adopt a suitably somber and sympathetic tone. (Also, as with ChatGPT, you can interrupt Hume mid-flow, and it will pause and adapt with a new response.)
OpenAI has not said how much its voice interface tries to gauge the emotions of users, but Hume’s is expressly designed to do that. During interactions, Hume’s developer interface will show values indicating a measure of things like “determination,” “anxiety,” and “happiness” in the user’s voice. If you talk to Hume in a sad tone it will also pick up on that, something that ChatGPT does not seem to do.
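To give a sense of how an application might consume readings like the ones shown in that developer interface, here is a minimal, hypothetical sketch in Python. The message shape and field names (such as `emotion_scores`) are assumptions made for illustration, not Hume’s documented schema.

```python
import json

# Hypothetical per-utterance message: the structure and field names below are
# assumptions for illustration only, not Hume's actual API schema.
SAMPLE_MESSAGE = json.dumps({
    "type": "user_message",
    "transcript": "I just lost my dog last night.",
    "emotion_scores": {      # values like those surfaced in the developer interface
        "sadness": 0.81,
        "anxiety": 0.42,
        "determination": 0.12,
        "happiness": 0.03,
    },
})

def top_emotions(raw: str, n: int = 3) -> list[tuple[str, float]]:
    """Return the n highest-scoring emotions from one utterance message."""
    message = json.loads(raw)
    scores = message.get("emotion_scores", {})
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:n]

if __name__ == "__main__":
    for name, score in top_emotions(SAMPLE_MESSAGE):
        print(f"{name}: {score:.2f}")
```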
Hume also makes it easy to deploy a voice with specific emotions by adding a prompt in its UI. Here it is when I asked it to be “sexy and flirtatious”:
And when told to be “sad and morose”:
And here’s the particularly nasty message when asked to be “angry and rude”:
The technology did not always seem as polished and smooth as OpenAI’s, and it occasionally behaved in strange ways. For example, at one point the voice suddenly sped up and spewed gibberish. But if the voice can be refined and made more reliable, it has the potential to help make humanlike voice interfaces more common and varied.
The idea of recognizing, measuring, and simulating human emotion in technological systems goes back decades and is studied in a field known as “affective computing,” a term introduced by Rosalind Picard, a professor at the MIT Media Lab, in the 1990s.
Albert Salah, a professor at Utrecht University in the Netherlands who studies affective computing, is impressed with Hume AI’s technology and recently demonstrated it to his students. “What EVI seems to be doing is assigning emotional valence and arousal values [to the user], and then modulating the tone of the agent accordingly,” he says. “It is a very interesting twist on LLMs.”
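Salah’s description can be made concrete with a small sketch of that kind of mapping. The thresholds, wording, and function below are purely illustrative, not Hume’s actual logic.

```python
def tone_instruction(valence: float, arousal: float) -> str:
    """Map estimated valence and arousal (both in [-1, 1]) to a tone hint for the agent.
    Thresholds and phrasing are invented for illustration, not Hume's implementation."""
    if valence < -0.3:
        mood = "somber and sympathetic" if arousal < 0.3 else "calm and reassuring"
    elif valence > 0.3:
        mood = "upbeat and warm" if arousal > 0.3 else "relaxed and friendly"
    else:
        mood = "neutral and attentive"
    return f"Respond in a {mood} tone."

# A user who sounds sad and subdued would get a somber, sympathetic reply.
print(tone_instruction(valence=-0.7, arousal=0.1))
```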