Amazon’s Alexa Becomes More Emotional to Offer Intuitive Voice Experience

December 2, 2019

Amazon’s voice assistant, Alexa, is about to get more emotional. The company recently announced that its digital assistant will soon put a lot more feeling into its responses. According to the company, Alexa will be able to respond with high, medium, or low intensities of two emotions, excited and disappointed. Amazon said, “customer feedback indicates that overall satisfaction with the voice experience increased by 30 percent when Alexa responded with emotions.”

The feature will enable developers to make Alexa respond to users’ questions in tones including happy, excited, disappointed, or empathetic. Amazon, which bills the development as a new set of Alexa capabilities, said the feature will help create a more natural and intuitive aural experience for users.
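Developers reach these capabilities through Alexa’s SSML markup. As a hedged sketch based on Amazon’s public Alexa SSML documentation (tag and attribute names assumed from that reference, not stated in this article), an emotional response might look like this:

```xml
<!-- Sketch of an emotional Alexa response in SSML.
     The amazon:emotion tag takes a name ("excited" or "disappointed")
     and an intensity ("low", "medium", or "high"). -->
<speak>
    <amazon:emotion name="excited" intensity="medium">
        That's correct! You just won the trivia round!
    </amazon:emotion>
</speak>
```

A skill would return this SSML in its response payload, and the text wrapped by the tag is rendered in the chosen emotional tone at the chosen intensity.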

Amazon has said that emotional responses are particularly relevant to skills in the gaming and sports categories. Furthermore, according to the company, developers can also have Alexa respond in a speaking style that is “more suited for a specific type of content.”

Besides emotion-driven responses, the company has also added another speaking style to Alexa: a DJ-style voice designed for discussing music.

In a blog post, Amazon’s Catherine Gao wrote, “emotional responses are particularly relevant to skills in the gaming and sports categories. Additionally, you can have Alexa respond in a speaking style that is more suited for a specific type of content, starting with news and music. Speaking styles are curated text-to-speech voices designed to create a more delightful customer experience for specific content. For example, the news speaking style makes Alexa’s voice sound similar to what you hear from TV news anchors and radio hosts.”
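Speaking styles are also selected via SSML. A minimal sketch, again assuming the tag names from Amazon’s Alexa SSML reference (not given in this article), could look like this:

```xml
<!-- Sketch of a speaking-style response in SSML.
     The amazon:domain tag selects a curated voice style;
     per the announcement, the first styles are "news" and "music". -->
<speak>
    <amazon:domain name="news">
        Amazon today announced new emotion capabilities for Alexa.
    </amazon:domain>
</speak>
```

Here the wrapped text would be read in the newscaster-like voice Gao describes, rather than Alexa’s default style.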

In the post, she also explained how Alexa’s new emotions work: they employ Amazon’s Neural TTS (NTTS) text-to-speech technology, which enables more natural-sounding speech. For example, developers can have Alexa respond in a happy or excited tone when a customer answers a trivia question correctly or wins a game. Conversely, the digital assistant can respond in a disappointed or empathetic tone when a customer asks for a sports score and their favorite team has lost.

By making its virtual assistant more engaging, Amazon has made Alexa more pleasant for people to talk to.

However, applying emotions to voice technology is not limited to Alexa. The automotive industry has also undertaken projects in which technology identifies a person’s emotions and responds to them in the best possible way. For example, Nuance Automotive, now spun off as the independent company Cerence, showed off its own technology at CES 2019 back in January.

Additionally, Toyota, Kia, and General Motors have ongoing pilot projects for emotional AI in cars. With Amazon’s latest development, automakers could eventually have Alexa match how it speaks to the emotions of the people around it.