December 10, 2018
Amazon Music debuted a voice feature that lets Amazon Music Unlimited and Prime Music customers on Echo devices and in the Amazon Music iOS and Android apps converse with Alexa to find playlists and music for specific moods. Among other features, the listener can identify songs by lyrics and reject or “like” individual songs. Amazon’s overall aim is to let each listener create a more customized listening experience. Amazon is also trialing Alexa Answers, a feature that lets customers supply answers that may then be shared with Alexa users worldwide.
Variety reports that, with the Amazon Music voice feature, listeners will receive “a more personalized response based on a unique mix of advanced algorithms” when they ask Alexa to play music. Alexa will “ask several basic questions to identify what the listener wants to hear, including specifics on occasion, genre, tempo, or mood.” It will also “resurfac[e] forgotten songs customers haven’t heard in a while, or [play] new music from artists a customer has asked Alexa to follow.”
If the listener asks Alexa what to listen to, it will use “cues from previous listening habits” and ask questions about the listener’s “genres, eras and other preferences.” The listener can also tell Alexa “when they like or dislike any currently-playing song, album, playlist or station” by saying “Alexa, I like this song” or “Alexa, I don’t like this.” When the listener asks Alexa to “play music I like,” it will “play a collection of favorites, based on most listened-to or previously liked songs.”
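The mechanics described above, combining play history with explicit like/dislike feedback into a favorites collection, can be sketched in a few lines. This is a hypothetical illustration only; the class and method names are assumptions, and Amazon's actual implementation is not public.

```python
from collections import defaultdict


class MusicPreferences:
    """Hypothetical sketch of the feedback flow the article describes:
    liked songs and most-listened-to songs feed a favorites list,
    while disliked songs are excluded."""

    def __init__(self):
        self.play_counts = defaultdict(int)  # song -> number of plays
        self.liked = set()
        self.disliked = set()

    def record_play(self, song):
        self.play_counts[song] += 1

    def like(self, song):
        # "Alexa, I like this song"
        self.liked.add(song)
        self.disliked.discard(song)

    def dislike(self, song):
        # "Alexa, I don't like this"
        self.disliked.add(song)
        self.liked.discard(song)

    def favorites(self, n=10):
        """Up to n songs for "play music I like": explicitly liked songs
        first, then by play count, excluding disliked songs."""
        candidates = (set(self.play_counts) | self.liked) - self.disliked
        return sorted(
            candidates,
            key=lambda s: (s in self.liked, self.play_counts[s]),
            reverse=True,
        )[:n]
```

For example, a song the listener explicitly liked would rank ahead of one that was merely played often, and a disliked song would never appear in the favorites collection.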
VentureBeat reports that, “Amazon is making it possible for people who speak with intelligent assistant Alexa to answer questions and share information that may then be distributed to millions of users around the world.”
Alexa Answers is available by invitation only; internal trials began a month ago. So far, “more than 100,000 answers have been added to Alexa Answers.” The feature is intended to “address questions that have not yet been answered.” Users who want to suggest answers must do so through a website, not in a conversation with Alexa, and their responses will be attributed to “an Alexa customer.”
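The workflow described above, logging questions Alexa cannot answer, accepting answers through a separate website, and attributing them generically, can be sketched as a minimal data model. All names here are illustrative assumptions, not Amazon's actual system.

```python
from dataclasses import dataclass, field


@dataclass
class CommunityAnswers:
    """Hypothetical sketch of the Alexa Answers flow the article describes."""

    unanswered: set = field(default_factory=set)
    # question -> (answer text, attribution string)
    answers: dict = field(default_factory=dict)

    def ask(self, question):
        """Return an answer if one exists; otherwise log the question
        as unanswered so an invited customer can address it later."""
        if question in self.answers:
            answer, source = self.answers[question]
            return f"{answer} (according to {source})"
        self.unanswered.add(question)
        return None

    def submit_answer(self, question, answer):
        # Submissions come via the website, not a voice conversation,
        # and are credited only to "an Alexa customer" (per the article).
        self.answers[question] = (answer, "an Alexa customer")
        self.unanswered.discard(question)
```

The generic attribution string mirrors the article's detail that contributors are not named individually.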
In August, Amazon introduced “Answer Updates so that anyone who asks a question Alexa is unable to answer will get a notification when the assistant learns the right answer.” Earlier this year, the company introduced “follow-up questions, an attempt to make exchanges with Alexa more conversational.” Both initiatives are intended to “shore up one of Alexa’s shortcomings”: a study found that “Google Assistant was best at answering questions, followed by Apple’s Siri, Alexa, and Microsoft’s Cortana.”
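The Answer Updates behavior described above is essentially a subscribe-and-notify pattern: remember who asked an unanswerable question, then deliver a notification once the answer is learned. The following is a minimal sketch under that assumption; the class and method names are hypothetical.

```python
from collections import defaultdict


class AnswerUpdates:
    """Hypothetical sketch of the Answer Updates feature: users who ask
    a question the assistant cannot answer are notified when it learns
    the right answer."""

    def __init__(self):
        self.waiting = defaultdict(list)        # question -> user ids who asked
        self.notifications = defaultdict(list)  # user id -> pending messages

    def ask(self, user, question, known_answers):
        """Answer from known_answers if possible; otherwise remember
        the asker for a later notification."""
        if question in known_answers:
            return known_answers[question]
        self.waiting[question].append(user)
        return None

    def learn(self, question, answer):
        # Notify everyone who asked before the answer was available.
        for user in self.waiting.pop(question, []):
            self.notifications[user].append(f"Update: {question} -> {answer}")
```

This is the same pattern used by many notification systems: the unanswered question acts as a topic, and each asker is an implicit subscriber to it.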
Other efforts to improve Alexa include recording every exchange with customers, “to improve the AI assistant’s language understanding.” Alexa engineers debuted Cleo last year, which “records exchanges with people who speak some of the world’s most spoken languages,” including Hindi, French and Mandarin Chinese. In 2018, “Alexa learned to speak several new languages.”