Autodesk is rolling out a new version of its Autodesk Virtual Agent (AVA) to replace its text-only chatbot in mid-May. AVA features an animated face and a New Zealand accent, with emotional responses that the company hopes will help customers engage. For example, if a customer says they’re having trouble, AVA will frown, say she’s sorry and ask how she can help, explains Autodesk manager of digital support channels Rachael Rekart. In doing so, Autodesk is following a trend of companies installing virtual assistants that are more helpful and personable. Amazon, for example, is updating Alexa to be smarter and more conversational.
The Wall Street Journal reports that another example comes from United Services Automobile Association (USAA), whose Eva, launched in 2012, “understands enough about USAA’s banking and insurance products and its customers to present likely options that will steer conversations in a productive direction.”
AVA will “be able to detect and react to users’ emotional state,” say its developers at Soul Machines in New Zealand (watch Soul Machines’ “Creating AVA” video on YouTube). “We’re going to spend more of our time interacting with automated systems like robots and self-driving cars,” said co-founder Greg Cross. “Our view is that these machines are more helpful if they can engage with us and respond to us.”
IBM’s Watson Assistant technology underpins AVA’s ability to understand language. AVA “looks almost realistic enough to pass as human, though her voice doesn’t quite synchronize with her animated mouth,” a shortcoming the developers addressed by later adding a tongue to the animation for greater realism.
Chatbots, or digital helpers, were largely inept until the 2014 debut of Amazon’s Alexa in its Echo speaker; “Alexa set a new standard in voice recognition with her ability to distinguish commands spoken across rooms and amid noise.” Alexa still has limitations, since its “skills” cover only a limited number of specific consumer transactions, but it handles those transactions well.
The same is true for USAA’s Eva, “which handles around 70 percent of questions it receives without passing them along to a human agent,” according to the company’s chief digital officer Chris Cox.
The Verge reports that, according to Alexa Brain group head Ruhi Sarikaya, “U.S. users will soon be able to ask Alexa to remember important information that can be retrieved at a certain date,” such as birth dates. Google Assistant “has had a remember feature for some time, which can also display a list of the last five things you’ve asked it to remember.”
Alexa will also “be able to understand conversational questions asked back to back without the need to say ‘Alexa’ every time,” a feature available first in the U.S., U.K. and Germany. Finally, U.S. users will “be able to discover and engage with skills simply through natural conversation,” which could open up branding opportunities for companies “to create skills to best match popular consumer queries.”